939 results for data-driven simulation


Relevance:

40.00%

Publisher:

Abstract:

The conformance of semantic technologies has to be systematically evaluated to measure and verify their real adherence to the Semantic Web standards. Current evaluations of semantic technology conformance are not exhaustive enough and do not directly cover user requirements and use scenarios, which raises the need for a simple, extensible and parameterizable method to generate test data for such evaluations. To address this need, this paper presents a keyword-driven approach for generating ontology language conformance test data that can be used to evaluate semantic technologies, details the definition of a test suite for evaluating OWL DL conformance using this approach, and describes the use and extension of this test suite during the evaluation of some tools.
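
As a rough illustration of what "keyword-driven" buys here, the sketch below (hypothetical keywords, templates and class names, not the paper's actual test suite) maps each keyword to an OWL axiom template, so a conformance test case is just data and extending the suite means adding a keyword-template pair rather than new generator code.

    # Hypothetical keyword-driven generator of OWL conformance test data.
    # Each keyword expands to an OWL functional-syntax axiom template.
    AXIOM_TEMPLATES = {
        "subclass":   "SubClassOf(:{sub} :{sup})",
        "equivalent": "EquivalentClasses(:{a} :{b})",
        "disjoint":   "DisjointClasses(:{a} :{b})",
    }

    def generate_test(keywords):
        """Expand (keyword, params) pairs into OWL axioms."""
        return [AXIOM_TEMPLATES[kw].format(**params) for kw, params in keywords]

    case = [("subclass", {"sub": "Dog", "sup": "Animal"}),
            ("disjoint", {"a": "Dog", "b": "Cat"})]
    for axiom in generate_test(case):
        print(axiom)   # SubClassOf(:Dog :Animal), DisjointClasses(:Dog :Cat)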

Relevance:

40.00%

Publisher:

Abstract:

The ECHAM-1 T21/LSG coupled ocean-atmosphere general circulation model (GCM) is used to simulate climatic conditions at the last interglacial maximum (Eemian, 125 kyr BP). The results reflect the expected surface temperature changes (with respect to the control run) due to the amplification (reduction) of the seasonal cycle of insolation in the Northern (Southern) Hemisphere. A number of simulated features agree with previous results from atmospheric GCM simulations (e.g. intensified summer southwest monsoons), except in the Northern Hemisphere poleward of 30 degrees N, where dynamical feedbacks in the North Atlantic and North Pacific increase zonal temperatures about 1 degree C above what would be predicted from simple energy balance considerations. As this is the same area where most of the terrestrial geological data originate, this result suggests that previous estimates of Eemian global average temperature might have been biased by sample distribution. This conclusion is supported by the fact that the estimated global temperature increase of only 0.3 degrees C over the control run has previously been shown to be consistent with CLIMAP sea surface temperature estimates. Although the Northern Hemisphere summer monsoon is intensified, globally averaged precipitation over land is within about 1% of the present, contravening some geological inferences but not the deep-sea delta(13)C estimates of terrestrial carbon storage changes. Winter circulation changes in the northern Arabian Sea, driven by strong cooling on land, are as large as the summer circulation changes that are the usual focus of interest, suggesting that interpreting variations in the Arabian Sea sedimentary record solely in terms of the summer monsoon response could sometimes lead to errors. A small monsoonal response over northern South America suggests that interglacial paleotrends in this region were not just due to El Nino variations.

Relevance:

40.00%

Publisher:

Abstract:

Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Here, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on the analysis problem via conceptual data-mining models instead of low-level programming tasks related to the underlying platform's technical details. These tasks are now entrusted to the model-transformation scaffolding.

Relevance:

40.00%

Publisher:

Abstract:

Data mining is one of the most important analysis techniques for automatically extracting knowledge from large amounts of data. Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Bearing in mind this situation, we propose a model-driven approach based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks related to the underlying platform's technical details. These time-consuming tasks are now entrusted to the model-transformation scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario where a time series analysis is required, as sketched below.
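
A minimal sketch of the two-step idea, under invented names (the TimeSeriesModel class and the pandas/SQL target are illustrative stand-ins, not the authors' metamodel or transformation language): a platform-independent mining model is declared as data, and a transformation emits platform-specific analysis code from it.

    from dataclasses import dataclass

    @dataclass
    class TimeSeriesModel:            # conceptual (platform-independent) model
        source_table: str
        value_column: str
        window: int

    def to_pandas_script(m: TimeSeriesModel) -> str:
        # Model transformation: conceptual model -> platform-specific code.
        # The generated script assumes a database connection object `conn`.
        return (
            f"import pandas as pd\n"
            f"df = pd.read_sql('SELECT * FROM {m.source_table}', conn)\n"
            f"df['smoothed'] = df['{m.value_column}'].rolling({m.window}).mean()\n"
        )

    print(to_pandas_script(TimeSeriesModel("sales_dw", "amount", window=12)))

Retargeting to another platform would amount to swapping the generator function; the conceptual model stays unchanged.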

Relevance:

40.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Washington, D.C.

Relevance:

40.00%

Publisher:

Abstract:

A systematic goal-driven, top-down modelling methodology is proposed that is capable of developing a multiscale model of a process system for given diagnostic purposes. The diagnostic goal-set and the symptoms are extracted from HAZOP analysis results, where the possible actions to be performed in a fault situation are also described. The multiscale dynamic model is realized in the form of a hierarchical coloured Petri net by using a novel substitution place-transition pair. Multiscale simulation that focuses automatically on the fault areas is used to predict the effect of the proposed preventive actions. The notions and procedures are illustrated with simple case studies, including a heat exchanger network and a more complex wet granulation process.
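
The firing rule that drives such a net fits in a few lines; the sketch below is a plain place/transition net with hypothetical place names, not the paper's hierarchical coloured net or its substitution place-transition pairs.

    # Token-game sketch: a transition fires when all input places hold tokens.
    marking = {"valve_open": 1, "tank_level_high": 1, "alarm": 0}
    transitions = {
        "raise_alarm": {"in": ["valve_open", "tank_level_high"], "out": ["alarm"]},
    }

    def fire(name):
        t = transitions[name]
        if all(marking[p] > 0 for p in t["in"]):   # enabled?
            for p in t["in"]:
                marking[p] -= 1                    # consume input tokens
            for p in t["out"]:
                marking[p] += 1                    # produce output tokens
            return True
        return False

    fire("raise_alarm")
    print(marking)   # {'valve_open': 0, 'tank_level_high': 0, 'alarm': 1}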

Relevance:

40.00%

Publisher:

Abstract:

We model nongraphitized carbon black surfaces and investigate adsorption of argon on these surfaces by using grand canonical Monte Carlo simulation. In this model, the nongraphitized surface is modeled as a stack of graphene layers with some carbon atoms of the top graphene layer randomly removed. The percentage of surface carbon atoms removed and the effective size of the defect (created by the removal) are the key parameters characterizing the nongraphitized surface. The patterns of the adsorption isotherm and isosteric heat are studied as a function of these surface parameters as well as pressure and temperature. It is shown that the adsorption isotherm exhibits steplike behavior on a perfect graphite surface and becomes smoother on nongraphitized surfaces. Regarding the isosteric heat versus loading, for graphitized thermal carbon black we observe an increase in heat over the submonolayer coverage and then a sharp decline as the second layer starts to form, beyond which the heat increases slightly. On the other hand, the isosteric heat versus loading for a highly nongraphitized surface shows a general decline with loading, which is due to the energetic heterogeneity of the surface. It is only when the fluid-fluid interaction is greater than the surface energetic factor that we see a minimum and maximum in the isosteric heat versus loading. These simulation results for the isosteric heat agree well with experimental results on the graphitization of Spheron 6 (Polley, M. H.; Schaeffer, W. D.; Smith, W. R. J. Phys. Chem. 1953, 57, 469; Beebe, R. A.; Young, D. M. J. Phys. Chem. 1954, 58, 93). Adsorption isotherms and isosteric heats in pores whose walls have defects are also studied by simulation, and the patterns of the isotherm and isosteric heat could be used to identify the fingerprint of the surface.
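
For readers unfamiliar with the method, the heart of a grand canonical Monte Carlo run is a pair of trial insertion/deletion moves with the standard acceptance probabilities; the sketch below uses a placeholder pair energy, not the argon-graphite potentials of the study.

    import math, random

    beta, z, V = 1.0, 0.05, 1000.0   # 1/kT, activity exp(beta*mu)/Lambda^3, volume
    box = V ** (1 / 3)               # cubic box length
    particles = []                   # particle positions

    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    def interaction(pos, others):
        # Placeholder attractive pair energy; a real study would use
        # Lennard-Jones fluid-fluid plus solid-fluid terms.
        return sum(-1.0 / (1.0 + dist2(pos, o)) for o in others)

    def gcmc_step():
        if random.random() < 0.5:                        # trial insertion
            pos = tuple(random.uniform(0, box) for _ in range(3))
            dU = interaction(pos, particles)
            acc = z * V / (len(particles) + 1) * math.exp(-beta * dU)
            if random.random() < min(1.0, acc):
                particles.append(pos)
        elif particles:                                  # trial deletion
            i = random.randrange(len(particles))
            rest = particles[:i] + particles[i + 1:]
            dU = -interaction(particles[i], rest)        # energy change on removal
            acc = len(particles) / (z * V) * math.exp(-beta * dU)
            if random.random() < min(1.0, acc):
                particles.pop(i)

    for _ in range(20000):
        gcmc_step()
    print("average loading ~", len(particles))

Averaging the particle number over such steps at a series of activities z is what traces out the adsorption isotherms discussed above.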

Relevance:

40.00%

Publisher:

Abstract:

Numerical simulations of turbulence-driven flow in a dense medium cyclone with magnetite medium have been conducted using Fluent. The predicted air core shape and diameter were found to be close to the experimental results measured by gamma-ray tomography. It is possible that the large eddy simulation (LES) turbulence model with the Mixture multi-phase model can be used to predict the air/slurry interface accurately, although the LES may need a finer grid. Multi-phase simulations (air/water/medium) show appropriate medium segregation effects but over-predict the level of segregation compared to that measured by gamma-ray tomography, in particular over-predicting medium concentrations near the wall. We further investigated the accurate prediction of axial segregation of magnetite using the LES turbulence model together with the multi-phase mixture model and viscosity corrections according to the feed particle loading factor. Addition of lift forces and the viscosity correction improved the predictions, especially near the wall. Predicted density profiles are very close to gamma-ray tomography data, showing a clear density drop near the wall. The effect of the size distribution of the magnetite has also been fully studied. It is interesting to note that the ultra-fine magnetite sizes (i.e. 2 and 7 μm) are distributed uniformly throughout the cyclone; as the size of magnetite increases, more segregation of magnetite occurs close to the wall. The cut size (d50) of the magnetite segregation is 32 μm, which is expected with a superfine magnetite feed size distribution. At higher feed densities the agreement between the correlations of Dungilson (1999) and Wood (1990) and the CFD is reasonably good, but the predicted overflow density is lower than the model predictions. It is believed that the excessive underflow volumetric flow rates are responsible for the under-prediction of the overflow density.

Relevance:

40.00%

Publisher:

Abstract:

Fuzzy data has grown to be an important factor in data mining. Whenever uncertainty exists, simulation can be used as a model. Simulation is very flexible, although it can involve significant levels of computation. This article discusses fuzzy decision-making using the grey related analysis method. Fuzzy models are expected to better reflect decision-making uncertainty, at some cost in accuracy relative to crisp models. Monte Carlo simulation is used to incorporate experimental levels of uncertainty into the data and to measure the impact of fuzzy decision tree models using categorical data. Results are compared with decision tree models based on crisp continuous data.
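
A minimal version of that experiment, assuming synthetic data and arbitrary noise levels in place of the article's grey related analysis setup: Monte Carlo perturbations mimic the fuzziness, a decision tree is refit on each replication, and accuracy is compared against the crisp baseline (noise = 0).

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))                   # synthetic crisp data
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    for noise in (0.0, 0.5, 1.0):                   # experimental uncertainty levels
        scores = []
        for _ in range(100):                        # Monte Carlo replications
            X_fuzzy = X_tr + rng.normal(scale=noise, size=X_tr.shape)
            tree = DecisionTreeClassifier(max_depth=3).fit(X_fuzzy, y_tr)
            scores.append(tree.score(X_te, y_te))
        print(f"noise={noise}: mean accuracy={np.mean(scores):.3f}")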

Relevance:

40.00%

Publisher:

Abstract:

The Co(III) complexes of the hexadentate tripodal ligands HOsen (3-(2'-aminoethylamino)-2,2-bis((2''-aminoethylamino)methyl)propan-1-ol) and HOten (3-(2'-aminoethylthia)-2,2-bis((2''-aminoethylthia)methyl)propan-1-ol) have been synthesized and fully characterized. The crystal structures of [Co(HOsen)]Cl3·H2O and [Co(HOten)](ClO4)Cl2 are reported, and in both cases the ligands coordinate as tripodal hexadentate N6 and N3S3 donors, respectively. Cyclic voltammetry of the N3S3-coordinated complex [Co(HOten)]3+ is complicated and electrode dependent. On a Pt working electrode an irreversible Co(III/II) couple (formal potential −157 mV versus Ag-AgCl) is seen, which is indicative of dissociation of the divalent complex formed at the electrode. The free HOten released by the dissociation of [Co(HOten)]2+ can be recaptured by Hg, as shown by cyclic voltammetry experiments on a static Hg drop electrode (or in the presence of Hg2+ ions), which leads to the formation of an electroactive Hg(II) complex of the N3S3 ligand (formal potential +60 mV versus Ag-AgCl). This behaviour is in contrast to the facile and totally reversible voltammetry of the hexaamine complex [Co(HOsen)]3+ (formal potential (Co(III/II)) −519 mV versus Ag-AgCl), which is uncomplicated by any coupled chemical reactions. A kinetic and thermodynamic analysis of the [Co(HOten)]2+/[Hg(HOten)]2+ system is presented on the basis of digital simulation of the experimental voltammetric data.

Relevance:

40.00%

Publisher:

Abstract:

In multilevel analyses, problems may arise when using Likert-type scales at the lowest level of analysis. Specifically, increases in variance should lead to greater censoring for groups whose true scores fall at either end of the distribution. The current study used simulation methods to examine the influence of single-item Likert-type scale usage on ICC(1), ICC(2), and group-level correlations. Results revealed substantial underestimation of ICC(1) when using Likert-type scales with common response formats (e.g., 5 points). ICC(2) and group-level correlations were also underestimated, but to a lesser extent. Finally, the magnitude of underestimation was driven in large part by an interaction between Likert-type scale usage and the amounts of within- and between-group variance.
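
The mechanism is easy to reproduce: simulate group-structured continuous scores, collapse them onto a 5-point response format, and compare ICC(1) estimates from a one-way ANOVA. The variance components below are assumptions chosen for illustration, not the study's design.

    import numpy as np

    rng = np.random.default_rng(1)
    n_groups, k = 100, 10                            # groups x members per group
    group_eff = rng.normal(scale=1.0, size=n_groups)            # between-group part
    scores = group_eff[:, None] + rng.normal(scale=2.0, size=(n_groups, k))

    def icc1(x):
        """ICC(1) from a one-way ANOVA on a (groups x members) array."""
        grand = x.mean()
        msb = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n_groups - 1)
        msw = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n_groups * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    likert = np.clip(np.round(scores), -2, 2)        # censor onto 5 scale points
    print(f"continuous ICC(1):  {icc1(scores):.3f}")
    print(f"Likert-type ICC(1): {icc1(likert):.3f}") # typically noticeably lower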

Relevance:

40.00%

Publisher:

Abstract:

It is generally believed that the structural reforms that were introduced in India following the macro-economic crisis of 1991 ushered in competition and forced companies to become more efficient. However, whether the post-1991 growth is an outcome of more efficient use of resources or greater use of factor inputs remains an open empirical question. In this paper, we use plant-level data from 1989–1990 and 2000–2001 to address this question. Our results indicate that while there was an increase in the productivity of factor inputs during the 1990s, most of the growth in value added is explained by growth in the use of factor inputs. We also find that median technical efficiency declined in all but one of the industries between 1989–1990 and 2000–2001, and that change in technical efficiency explains a very small proportion of the change in gross value added.
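
The distinction the paper tests can be written as a standard growth-accounting decomposition (a generic Cobb-Douglas form, not necessarily the authors' exact specification):

    \Delta \ln \mathrm{VA} = \alpha \, \Delta \ln K + \beta \, \Delta \ln L + \Delta \ln \mathrm{TFP}

where K and L are capital and labour inputs, alpha and beta their output elasticities, and the residual captures total factor productivity; the finding above is that the first two terms account for most of the observed growth in value added.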