885 results for fuzzy based evaluation method


Relevance:

40.00%

Publisher:

Abstract:

Identification of clouds from satellite images is now a routine task. Observation of clouds from the ground, however, is still needed to acquire a complete description of cloud conditions. Among the standard meteorological variables, solar radiation is the most affected by cloud cover. In this note, a method for using global and diffuse solar radiation data to classify sky conditions into several classes is suggested. A classical maximum-likelihood method is applied for clustering data. The method is applied to a series of four years of solar radiation data and human cloud observations at a site in Catalonia, Spain. With these data, the accuracy of the solar radiation method as compared with human observations is 45% when nine classes of sky conditions are to be distinguished, and it grows significantly to almost 60% when samples are classified in only five different classes. Most errors are explained by limitations in the database; therefore, further work is under way with a more suitable database.
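A maximum-likelihood clustering step of this kind can be sketched as follows. The two features (a clearness index and a diffuse fraction), the class labels and the toy data are illustrative assumptions, not the paper's actual variables:

```python
import numpy as np

# Assign each sample to the class whose Gaussian density is highest
# (maximum likelihood), with per-class parameters estimated from data.

def fit_classes(X, y):
    """Estimate a mean vector and covariance matrix per class label."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), np.cov(Xc, rowvar=False))
    return params

def log_likelihood(x, mean, cov):
    d = x - mean
    return -0.5 * (d @ np.linalg.inv(cov) @ d + np.log(np.linalg.det(cov)))

def classify(x, params):
    return max(params, key=lambda c: log_likelihood(x, *params[c]))

# Toy data: two sky classes well separated in (kt, kd) feature space.
rng = np.random.default_rng(0)
clear = rng.normal([0.7, 0.2], 0.05, size=(50, 2))
overcast = rng.normal([0.2, 0.9], 0.05, size=(50, 2))
X = np.vstack([clear, overcast])
y = np.array([0] * 50 + [1] * 50)

params = fit_classes(X, y)
print(classify(np.array([0.68, 0.22]), params))  # class 0 (clear)
```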

Relevance:

40.00%

Publisher:

Abstract:

The hydroalcoholic extracts prepared from standard leaves of Maytenus ilicifolia and commercial samples of espinheira-santa were evaluated qualitatively (fingerprinting) and quantitatively. In this paper, a fingerprinting chromatogram coupled with Principal Component Analysis (PCA) is described for the metabolomic analysis of standard and commercial espinheira-santa samples. Epicatechin was used as an external standard for the development and validation of a quantitative method, using a photodiode array detector, for its analysis in herbal medicines. This method was applied to the quantification of epicatechin in herbal medicines sold as espinheira-santa in Brazil and in the standard sample of M. ilicifolia.
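The PCA step of such a fingerprinting comparison can be sketched as below; the fingerprint matrix (rows as samples, columns as peak areas at aligned retention times) and all values are invented for illustration:

```python
import numpy as np

# Project mean-centered chromatographic fingerprints onto their first
# principal components; well-separated groups land apart on PC1.

def pca_scores(X, n_components=2):
    """PCA via SVD of the mean-centered data matrix."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

fingerprints = np.array([
    [5.1, 0.9, 3.2, 0.1],   # standard M. ilicifolia samples (toy values)
    [5.0, 1.0, 3.1, 0.2],
    [1.2, 4.0, 0.5, 2.9],   # commercial espinheira-santa samples (toy values)
    [1.1, 4.2, 0.6, 3.0],
])
scores = pca_scores(fingerprints)
print(scores.round(2))  # standard vs. commercial separate along PC1
```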

Relevance:

40.00%

Publisher:

Abstract:

A study was made to evaluate the effect of a castor oil-based detergent on strawberry crops treated with different classes of pesticides, namely deltamethrin, folpet, tebuconazole, abamectin and mancozeb, in a controlled environment. Experimental crops of greenhouse strawberries were cultivated in five different ways, with control groups, using pesticides and castor oil-based detergent. The results showed that group 2, which was treated with the castor oil-based detergent, presented the lowest amount of pesticide residues and the highest quality of fruit produced.

Relevance:

40.00%

Publisher:

Abstract:

In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of means and a paired t-test.
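The constrained random generation and automatic grading described above can be sketched as follows. This is not the Goodle GMS code (which is Matlab-based); all names, constraint ranges and tolerances are illustrative:

```python
import random
import statistics

def make_exercise(seed):
    """Random calibration line with constrained slope, intercept and noise."""
    rng = random.Random(seed)
    slope = rng.uniform(0.8, 1.2)      # constraints keep every data set solvable
    intercept = rng.uniform(0.0, 0.5)
    xs = [float(i) for i in range(1, 9)]
    ys = [slope * x + intercept + rng.gauss(0, 0.02) for x in xs]
    return xs, ys

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def grade(student_slope, xs, ys, tol=1e-3):
    """Automatic evaluation: compare a submitted slope to the reference fit."""
    ref_slope, _ = fit_line(xs, ys)
    return abs(student_slope - ref_slope) <= tol

xs, ys = make_exercise(seed=42)     # a different seed per student
slope, intercept = fit_line(xs, ys)
print(grade(slope, xs, ys))         # True for a correct submission
```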

Relevance:

40.00%

Publisher:

Abstract:

The quantitative structure property relationship (QSPR) for the boiling point (Tb) of polychlorinated dibenzo-p-dioxins and polychlorinated dibenzofurans (PCDD/Fs) was investigated. The molecular distance-edge vector (MDEV) index was used as the structural descriptor. The quantitative relationship between the MDEV index and Tb was modeled using multivariate linear regression (MLR) and an artificial neural network (ANN). Leave-one-out cross validation and external validation were carried out to assess the prediction performance of the models developed. For the MLR method, the prediction root mean square relative error (RMSRE) of leave-one-out cross validation and external validation was 1.77 and 1.23, respectively. For the ANN method, the prediction RMSRE of leave-one-out cross validation and external validation was 1.65 and 1.16, respectively. A quantitative relationship between the MDEV index and Tb of PCDD/Fs was demonstrated. Both MLR and ANN are practicable for modeling this relationship, and the models developed can be used to predict the Tb of PCDD/Fs. Thus, the Tb of each PCDD/F was predicted by the developed models.
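The MLR evaluation protocol (leave-one-out cross-validated RMSRE) can be sketched as below; the descriptor and boiling-point values are made-up stand-ins, not real MDEV data:

```python
import numpy as np

def loo_rmsre(X, y):
    """Leave-one-out cross-validated root mean square relative error (%)."""
    n = len(y)
    rel_errors = []
    for i in range(n):
        mask = np.arange(n) != i
        # Least-squares fit on all samples except i (column of ones = intercept).
        A = np.column_stack([np.ones(mask.sum()), X[mask]])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        pred = coef[0] + coef[1] * X[i]
        rel_errors.append((pred - y[i]) / y[i])
    return 100 * np.sqrt(np.mean(np.square(rel_errors)))

# Toy near-linear descriptor/boiling-point pairs (illustrative values only).
mdev = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
tb = np.array([310.0, 340.0, 372.0, 400.0, 431.0, 459.0])
print(round(loo_rmsre(mdev, tb), 2))  # small RMSRE for near-linear data
```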

Relevance:

40.00%

Publisher:

Abstract:

In this study, dispersive liquid-liquid microextraction based on the solidification of floating organic droplets was used for the preconcentration and determination of thorium in water samples. In this method, acetone and 1-undecanol were used as disperser and extraction solvents, respectively, and the ligand 1-(2-thenoyl)-3,3,3-trifluoroacetone (TTA) and Aliquat 336 were used as a chelating agent and an ion-pairing reagent, respectively, for the extraction of thorium. Inductively coupled plasma-optical emission spectrometry was applied for the quantitation of the analyte after preconcentration. The effect of various factors, such as the extraction and disperser solvents, sample pH, concentration of TTA and concentration of Aliquat 336, was investigated. Under the optimum conditions, the calibration graph was linear within the thorium content range of 1.0-250 µg L-1 with a detection limit of 0.2 µg L-1. The method was also successfully applied for the determination of thorium in different water samples.
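The linear calibration and detection limit reported above can be illustrated with the common 3-sigma convention (LOD = 3·sd(blank)/slope); the instrument responses and blank readings below are invented for the example:

```python
import statistics

def fit_calibration(conc, signal):
    """Least-squares slope/intercept of signal vs. concentration."""
    mc, ms = statistics.fmean(conc), statistics.fmean(signal)
    slope = (sum((c - mc) * (s - ms) for c, s in zip(conc, signal))
             / sum((c - mc) ** 2 for c in conc))
    return slope, ms - slope * mc

def detection_limit(blank_signals, slope, k=3):
    """3-sigma estimate: LOD = k * sd(blank) / slope."""
    return k * statistics.stdev(blank_signals) / slope

conc = [1.0, 10.0, 50.0, 100.0, 250.0]    # thorium standards, µg L-1
signal = [0.021, 0.20, 1.01, 2.00, 5.01]  # hypothetical ICP-OES intensities
slope, intercept = fit_calibration(conc, signal)
blanks = [0.0005, 0.0012, 0.0009, 0.0011, 0.0008]
print(round(detection_limit(blanks, slope), 3))  # LOD in µg L-1
```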

Relevance:

40.00%

Publisher:

Abstract:

Virtual screening is a central technique in drug discovery today. Millions of molecules can be tested in silico with the aim to select only the most promising and test them experimentally. The topic of this thesis is ligand-based virtual screening tools, which take existing active molecules as the starting point for finding new drug candidates. One goal of this thesis was to build a model that gives the probability that two molecules are biologically similar as a function of one or more chemical similarity scores. Another important goal was to evaluate how well different ligand-based virtual screening tools are able to distinguish active molecules from inactive ones. One more criterion set for the virtual screening tools was their applicability to scaffold-hopping, i.e. finding new active chemotypes. In the first part of the work, a link was defined between the abstract chemical similarity score given by a screening tool and the probability that the two molecules are biologically similar. These results help to decide objectively which virtual screening hits to test experimentally. The work also resulted in a new type of data fusion method when using two or more tools. In the second part, five ligand-based virtual screening tools were evaluated and their performance was found to be generally poor. Three reasons for this were proposed: false negatives in the benchmark sets, active molecules that do not share the binding mode, and activity cliffs. In the third part of the study, a novel visualization and quantification method is presented for evaluation of the scaffold-hopping ability of virtual screening tools.
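One simple way to realize such a score-to-probability link is a calibrated logistic curve, with a naive independence-based fusion step on top. This is a sketch under stated assumptions, not the thesis's actual model; the logistic coefficients and the noisy-OR fusion rule are illustrative choices:

```python
import math

def prob_bio_similar(score, a=-6.0, b=10.0):
    """Logistic link from a raw similarity score (e.g. Tanimoto) to
    P(biologically similar | score); a and b are made-up coefficients."""
    return 1.0 / (1.0 + math.exp(-(a + b * score)))

def fuse(scores):
    """Naive data-fusion step: combine per-tool probabilities assuming
    independence via the noisy-OR rule (an assumption for illustration)."""
    p_none = 1.0
    for s in scores:
        p_none *= 1.0 - prob_bio_similar(s)
    return 1.0 - p_none

print(round(prob_bio_similar(0.9), 3))  # high score -> high probability
print(round(fuse([0.5, 0.85]), 3))      # two agreeing tools raise the estimate
```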

Relevance:

40.00%

Publisher:

Abstract:

The age-old adage goes that nothing in this world lasts but change, and this generation has indeed seen changes that are unprecedented. Business managers do not have the luxury of going with the flow: they have to plan ahead and devise strategies that will meet the changing conditions, however stormy the weather seems to be. This demand raises the question of whether there is something a manager or planner can do to circumvent the eye of the storm in the future. Intuitively, one can either run the risk of something happening without preparing, or one can try to prepare oneself. Preparing by planning for each eventuality and contingency would be impractical and prohibitively expensive, so one needs to develop foreknowledge, or foresight, past the horizon of the present and the immediate future. The research mission in this study is to support strategic technology management by designing an effective and efficient scenario method to induce foresight in practicing managers. The design science framework guides this study in developing and evaluating the IDEAS method. The IDEAS method is an electronically mediated scenario method that is specifically designed to be effective and accessible. The design is based on the state of the art in scenario planning, and the product is a technology-based artifact to solve the foresight problem. This study demonstrates the utility, quality and efficacy of the artifact through a multi-method empirical evaluation, first by experimental testing and secondly through two case studies. The construction of the artifact is rigorously documented, as justification knowledge and as the principles of form and function on the general level, and later through the description and evaluation of instantiations. This design contributes both to practice and to the foundations of design.
The IDEAS method contributes to the state of the art in scenario planning by offering a lightweight and intuitive scenario method for resource-constrained applications. Additionally, the study contributes to the foundations and methods of design science by forging a clear design science framework which is followed rigorously. To summarize, the IDEAS method is offered for strategic technology management, in the confident belief that it will enable gaining foresight and aid its users in choosing trajectories past the gales of creative destruction and off to a brighter future.

Relevance:

40.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure, and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not been evaluated in large scale studies yet. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover.
Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that were not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early and practically oriented programming courses.
Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first and second year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
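The invariant-first style can be illustrated in miniature with a sorting loop whose invariants are written before the code that maintains them. In Socos the invariants are proved; here, as a runnable sketch, they are merely checked at run time with assertions:

```python
def insertion_sort(xs):
    """Insertion sort developed invariant-first: the assertions state the
    invariants that each extension of the program must maintain."""
    a = list(xs)
    for i in range(1, len(a)):
        # Invariants established before adding the inner loop:
        assert a[:i] == sorted(a[:i]), "prefix a[0:i] is sorted"
        assert sorted(a) == sorted(xs), "a is a permutation of the input"
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    assert a == sorted(xs), "postcondition: output is the sorted input"
    return a

print(insertion_sort([3, 1, 2]))  # [1, 2, 3]
```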

Relevance:

40.00%

Publisher:

Abstract:

INTRODUCTION: Web-based e-learning is a teaching tool increasingly used in many medical schools and specialist fields, including ophthalmology. AIMS: This pilot study aimed to develop clinical cases for an internet-based course and to evaluate the effectiveness of this method within a graduate medical education group. METHODS: This was an interventional randomized study. First, a website was built using a distance-learning platform. Sixteen first-year ophthalmology residents were then divided into two randomized groups: an experimental group, which was submitted to the intervention (use of the e-learning site), and a control group, which was not. The students answered a printed clinical case and their scores were compared. RESULTS: There was no statistically significant difference between the groups. CONCLUSION: We were able to successfully develop the e-learning site and the respective clinical cases. Although there was no statistically significant difference between the access and non-access groups, the study was a pioneer in our department, since an online clinical case program had never previously been developed.

Relevance:

40.00%

Publisher:

Abstract:

Investment decision-making on far-reaching innovation ideas is one of the key challenges practitioners and academics face in the field of innovation management. However, the management practices and theories strongly rely on evaluation systems that do not fit in well with this setting. These systems and practices normally cannot capture the value of future opportunities under high uncertainty because they ignore the firm’s potential for growth and flexibility. Real options theory and options-based methods have been offered as a solution to facilitate decision-making on highly uncertain investment objects. Much of the uncertainty inherent in these investment objects is attributable to unknown future events. In this setting, real options theory and methods have faced some challenges. First, the theory and its applications have largely been limited to market-priced real assets. Second, the options perspective has not proved as useful as anticipated because the tools it offers are perceived to be too complicated for managerial use. Third, there are challenges related to the type of uncertainty existing real options methods can handle: they are primarily limited to parametric uncertainty. Nevertheless, the theory is considered promising in the context of far-reaching and strategically important innovation ideas. The objective of this dissertation is to clarify the potential of options-based methodology in the identification of innovation opportunities. The constructive research approach gives new insights into the development potential of real options theory under non-parametric and close-to-radical uncertainty. The distinction between real options and strategic options is presented as an explanans for the discovered limitations of the theory. The findings offer managers a new means of assessing future innovation ideas based on the frameworks constructed during the course of the study.

Relevance:

40.00%

Publisher:

Abstract:

The feasibility of using augmented block designs and spatial analysis methods for early stage selection in eucalyptus breeding programs was tested. A total of 113 half-sib progenies of Eucalyptus urophylla and eight clones were evaluated in an 11 x 11 triple lattice experiment at two locations: Posto da Mata (Bahia, Brazil) and São Mateus (Minas Gerais, Brazil). Four checks were randomly allocated within each block. Plots consisted of 15 m long rows containing 6 plants spaced 3 m apart. The girth at breast height (cm/plant) was evaluated at 19 and 26 months of age. Variance analyses were performed according to the following methods: lattice design, randomized complete block design, augmented block design, Papadakis method, moving means method, and check plots. Comparisons among the different methods were based on the magnitude of experimental errors and the precision of the estimates of genetic and phenotypic parameters. General results indicated that the augmented block design is useful for evaluating progenies and clones in early selection in eucalyptus breeding programs using moderate and low selection intensities. However, this design is not suitable for estimating genetic and phenotypic parameters due to its low precision. The check plots, nearest neighbour, Papadakis (1937), and moving means methods were efficient in removing the heterogeneity within blocks. Their efficiencies were compared to that of the lattice analysis for estimation of genetic and phenotypic parameters.
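One simple form of the moving means adjustment mentioned above corrects each plot by the deviation of its neighbours' mean from the grand mean, removing a local fertility trend along the row. The girth values and window size below are invented for illustration, not the trial's data:

```python
import statistics

def moving_mean_adjust(values, window=1):
    """Adjust each plot by (mean of neighbouring plots - grand mean),
    excluding the plot itself from its own neighbourhood."""
    grand = statistics.fmean(values)
    adjusted = []
    for i, v in enumerate(values):
        lo, hi = max(0, i - window), min(len(values), i + window + 1)
        neighbours = values[lo:i] + values[i + 1:hi]
        adjusted.append(v - (statistics.fmean(neighbours) - grand))
    return adjusted

# Girth values (cm) along one row with a linear fertility trend superimposed.
girth = [10.0, 10.5, 11.0, 11.5, 12.0, 12.5]
print([round(x, 2) for x in moving_mean_adjust(girth)])
```

On this toy row the interior plots all adjust to the grand mean, showing the trend removed.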

Relevance:

40.00%

Publisher:

Abstract:

The accuracy and efficiency of the Point-Centered Quarter Method (PCQM) were verified using different numbers of individuals per sampled area, at 28 quarter points in an Araucaria forest, southern Paraná, Brazil. Three variations of the PCQM, differing in the number of individual trees sampled, were compared: the standard PCQM (SD-PCQM), with four individuals sampled per point (one in each quarter); a second variation (VAR1-PCQM), with eight individuals sampled per point (two in each quarter); and a third variation (VAR2-PCQM), with 16 individuals sampled per point (four in each quarter). Thirty-one species of trees were recorded by the SD-PCQM method, 48 by VAR1-PCQM and 60 by VAR2-PCQM. The exhaustiveness of the vegetation census and the diversity index increased with the number of individuals considered per quadrant, indicating that VAR2-PCQM was the most accurate and efficient method when compared with VAR1-PCQM and SD-PCQM.
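The basic density estimate behind the standard PCQM (density ≈ 1 / mean point-to-tree distance squared, with one tree per quarter) can be sketched as follows; the distances are invented for the example:

```python
def pcqm_density(distances_m):
    """Stems per square metre from point-to-nearest-tree distances (SD-PCQM):
    density = 1 / (mean distance)^2."""
    mean_d = sum(distances_m) / len(distances_m)
    return 1.0 / mean_d ** 2

# Four quarters at each of two sample points (standard PCQM, one tree/quarter).
distances = [2.1, 3.4, 2.8, 3.0, 2.5, 3.2, 2.9, 2.7]
per_m2 = pcqm_density(distances)
print(round(per_m2 * 10_000, 1))  # stems per hectare
```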

Relevance:

40.00%

Publisher:

Abstract:

The drug discovery process is facing new challenges in the evaluation of lead compounds as the number of new compounds synthesized increases. The potential of test compounds is most frequently assayed through the binding of the test compound to the target molecule or receptor, or by measuring functional secondary effects caused by the test compound in target model cells, tissues or organisms. Modern homogeneous high-throughput screening (HTS) assays for purified estrogen receptors (ER) utilize various luminescence-based detection methods. Fluorescence polarization (FP) is a standard method for the ER ligand binding assay. It was used to demonstrate the performance of two-photon excitation of fluorescence (TPFE) vs. the conventional one-photon excitation method. As a result, the TPFE method showed improved dynamics, was found to be comparable with the conventional method, and held potential for efficient miniaturization. Other luminescence-based ER assays utilize energy transfer from a long-lifetime luminescent label, e.g. lanthanide chelates (Eu, Tb), to a prompt luminescent label, the signal being read in a time-resolved mode. As an alternative to this method, a new single-label (Eu) time-resolved detection method was developed, based on the quenching of the label by a soluble quencher molecule when displaced from the receptor to the solution phase by an unlabeled competing ligand. The new method was compared with the standard FP method. It was shown to yield comparable results and to hold a significantly higher signal-to-background ratio than FP. Cell-based functional assays for determining the extent of cell surface adhesion molecule (CAM) expression, combined with microscopy analysis of the target molecules, would provide improved information content compared to an expression level assay alone.
In this work, immune response was simulated by exposing endothelial cells to cytokine stimulation, and the resulting increase in the level of adhesion molecule expression was analyzed on fixed cells by means of immunocytochemistry utilizing specific long-lifetime luminophore-labeled antibodies against chosen adhesion molecules. Results showed that the method was capable of use in a multi-parametric assay for the protein expression levels of several CAMs simultaneously, combined with analysis of the cellular localization of the chosen adhesion molecules through time-resolved luminescence microscopy inspection.
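The FP readout underlying the ligand-binding assay is computed from the parallel and perpendicular emission intensities, P = (I∥ − G·I⊥)/(I∥ + G·I⊥), reported in millipolarization units. The grating factor G and the intensity values below are illustrative, not instrument data:

```python
def polarization_mP(i_parallel, i_perpendicular, g=1.0):
    """Fluorescence polarization in mP: 1000 * (I_par - G*I_perp) / (I_par + G*I_perp)."""
    return (1000.0 * (i_parallel - g * i_perpendicular)
            / (i_parallel + g * i_perpendicular))

# Bound tracer tumbles slowly -> high polarization; free tracer -> low.
print(round(polarization_mP(1200.0, 800.0), 1))  # 200.0 mP (mostly bound)
print(round(polarization_mP(1050.0, 950.0), 1))  # 50.0 mP (mostly free)
```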

Relevance:

40.00%

Publisher:

Abstract:

Aims: This study was carried out to investigate the usefulness of acoustic rhinometry in the evaluation of intranasal dimensions in children. The aim was to define reference values for school children. In addition, the role of the VAS scale in the subjective evaluation of nasal obstruction in children was studied. Materials and methods: Measurements were done with Acoustic Rhinometry A1. The values of special interest were the minimal cross-sectional area (MCA) and the anterior volume of the nose (VOL). The data for reference values included 124 voluntary school children with no permanent nasal symptoms, aged between 7 and 14 years. Data were collected at baseline and after decongestion of the nose; the VAS scale was filled in before measurements. The subjects in the follow-up study (n=74, age between 1 and 12 years) were receiving intranasal spray of insulin or placebo. The nasal symptoms were recorded and acoustic rhinometry was measured at each control visit. Results: In school children, the mean total MCA (TMCA) was 0.752 cm2 (SD 0.165), and the mean total VOL (TVOL) was 4.00 cm3 (SD 0.63) at baseline. After decongestion, a significant increase in the mean TMCA and in the mean TVOL was found. A correlation was found between TMCA and age, and between TVOL and height of a child. There was no difference between boys and girls. A correlation was found between unilateral acoustic values and VAS at baseline, but not after decongestion. No difference was found in acoustic values or symptoms between the insulin and placebo groups in the two-year follow-up study. Conclusions: Acoustic rhinometry is a suitable objective method to examine intranasal dimensions in children. It is easy to perform and well tolerated. Reference values for children between 7 and 14 years were established.