977 results for Kahler metrics


Relevance:

10.00%

Publisher:

Abstract:

Surprisingly, wall distances, which are expensive to compute, are still used in a range of key turbulence and peripheral physics models. Potentially economical, accuracy-improving, differential-equation-based distance algorithms are considered. These involve elliptic Poisson and hyperbolic Eikonal equation approaches. Numerical issues relating to the solution of the latter on non-orthogonal curvilinear grids are addressed. Extension of the Eikonal to a Hamilton-Jacobi (HJ) equation is discussed, along with the use of this extension to improve turbulence model accuracy and, together with the Eikonal, to enhance Detached Eddy Simulation (DES) techniques. Application of the distance approaches is studied for various geometries, including a plane channel flow with a wire at the centre, a wing-flap system, a jet with co-flow and a supersonic double-delta configuration. Although less accurate than the Eikonal, Poisson-based flow solutions are extremely close to those using a search procedure. For a moving-grid case the Poisson method is found especially efficient. Results show that the Eikonal equation can be solved on highly stretched, non-orthogonal, curvilinear grids. A key accuracy aspect is that metrics must be upwinded in the direction of the propagating front. The HJ equation is found to have qualitative turbulence-model-improving properties. © 2003 by P. G. Tucker.
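
For reference, the two differential-equation formulations discussed here take the following standard forms (a sketch under the usual conventions, with d the wall distance and phi an auxiliary potential; the paper's exact boundary treatment may differ):

    \[ |\nabla d| = 1, \qquad d|_{\text{wall}} = 0 \]

    \[ \nabla^{2}\phi = -1, \qquad \phi|_{\text{wall}} = 0, \qquad d \approx -|\nabla\phi| + \sqrt{|\nabla\phi|^{2} + 2\phi} \]

The Poisson route replaces an exact distance computation with a single linear solve per step, which is what makes it attractive for moving grids.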

Relevance:

10.00%

Publisher:

Abstract:

An assessment method for the ecological condition of the Xiangxi River system was studied using 13 candidate epilithic-diatom metrics, which reflect pH, salinity, nitrogen-uptake metabolism, oxygen requirements, saprobity, trophic state, morphological character and pollution tolerance. By one-way ANOVA, the metrics of relative abundance of acidobiontic algae (ACID), freshwater algae (FRESH), taxa with high oxygen requirements (HIGH-O), eutraphentic taxa (EUTRA) and mobile taxa (MOBILE) were found suitable for distinguishing sites in different conditions. A river diatom index (RDI) composed of these five metrics was then used to evaluate the ecological condition of the river. The results showed that the healthiest sites were in the Guanmenshan Natural Reserve (mean RDI of 79.73). The sites located in the Jiuchong River tributary were also in excellent condition (mean RDI of 78.25). Mean RDI values for another tributary, the Gufu River, and for the main river were 70.85 and 68.45 respectively, and the unhealthiest tributary was the Gaolan River (mean RDI of 65.64). The mean RDI across all 51 sites was 71.40. The performance of the RDI was assessed by comparison with the DAIpo and TDI evaluation results; the comparison indicates that a multimetric index is better suited to the assessment task.
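
The abstract does not give the scoring formula, so the sketch below shows only one common way to assemble such a multimetric index (Python): each metric is rescaled to 0-100 against reference expectations and the five scores are averaged. The metric names come from the abstract; the reference ranges and directions are hypothetical.

    # Hypothetical sketch of a multimetric river diatom index (RDI).
    # Each metric is a relative abundance in [0, 1]; 'degrades_when_high'
    # metrics lower the score as they rise. Ranges are illustrative only.

    METRICS = {
        # name: (degrades_when_high, reference_min, reference_max)
        "ACID":   (True,  0.0, 0.4),
        "FRESH":  (False, 0.2, 0.9),
        "HIGH_O": (False, 0.2, 0.9),
        "EUTRA":  (True,  0.0, 0.5),
        "MOBILE": (True,  0.0, 0.5),
    }

    def score_metric(value, degrades_when_high, lo, hi):
        """Rescale one metric to 0-100, clipped to the reference range."""
        frac = min(max((value - lo) / (hi - lo), 0.0), 1.0)
        return 100.0 * ((1.0 - frac) if degrades_when_high else frac)

    def rdi(sample):
        """Average the five metric scores into a single 0-100 index."""
        scores = [score_metric(sample[name], *spec)
                  for name, spec in METRICS.items()]
        return sum(scores) / len(scores)

    print(rdi({"ACID": 0.05, "FRESH": 0.8, "HIGH_O": 0.7,
               "EUTRA": 0.1, "MOBILE": 0.2}))  # about 77 here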

Relevance:

10.00%

Publisher:

Abstract:

Two-lane, "microscopic" (vehicle-by-vehicle) simulations of motorway traffic are developed using existing models and validated against measured data from the M25 motorway. An energy consumption model is also built in, which takes the logged trajectories of simulated vehicles as drive cycles. The simulations are used to investigate the effects on motorway congestion and fuel consumption if "longer and/or heavier vehicles" (LHVs) were to be permitted in the UK. Baseline scenarios are simulated with traffic composed of cars, light goods vehicles and standard heavy goods vehicles (HGVs). A proportion of conventional articulated HGVs is then replaced by a smaller number of LHVs carrying the same total payload mass and volume. Four LHV configurations are investigated: an 18.75 m, 46 t longer semi-trailer (LST); 25.25 m, 50 t and 60 t B-doubles; and a 34 m, 82 t A-double. Metrics for congestion, freight fleet energy consumption and car energy consumption are defined for comparing the scenarios. Finally, variation of take-up level and of LHV engine power for the LST and A-double is investigated. It is concluded that: (a) LHVs should reduce congestion, particularly in dense traffic; however, the low mean proportion of freight traffic on UK roads and low take-up levels make this effect almost negligible; (b) LHVs can significantly improve the energy efficiency of freight fleets, giving up to a 23% reduction in fleet energy consumption at high take-up levels; (c) the small reduction in congestion caused by LHVs could improve the fuel consumption of other road users by up to 3% in dense traffic, although in free-flowing traffic the opposite effect occurs owing to higher vehicle speeds and aerodynamic losses; and (d) underpowered LHVs have the potential to generate severe congestion, although current manufacturers' recommendations appear suitable. © 2013 IMechE.
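
The substitution arithmetic behind the scenarios is straightforward and is sketched below (Python). Every number is invented for illustration; the paper's payloads, energy rates and the 23% figure come from its own vehicle models.

    # Replace articulated HGVs with fewer LHVs carrying the same total
    # payload, then compare fleet energy per km. Illustrative values only.

    HGV_PAYLOAD_T = 25.0   # payload per conventional HGV (assumed)
    LHV_PAYLOAD_T = 40.0   # payload per LHV, e.g. an A-double (assumed)
    HGV_MJ_PER_KM = 12.0   # energy per vehicle-km (assumed)
    LHV_MJ_PER_KM = 16.0   # an LHV burns more per km, less per tonne (assumed)

    def fleet_energy(n_hgv_replaced):
        """Fleet energy per km before/after an equal-payload substitution."""
        payload_t = n_hgv_replaced * HGV_PAYLOAD_T
        n_lhv = payload_t / LHV_PAYLOAD_T        # fewer vehicles needed
        return (n_hgv_replaced * HGV_MJ_PER_KM,  # before
                n_lhv * LHV_MJ_PER_KM)           # after

    before, after = fleet_energy(100)
    print(f"{100 * (1 - after / before):.0f}% fleet energy saving")  # 17% here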

Relevance:

10.00%

Publisher:

Abstract:

Performance measurement and management (PMM) is a management and research paradox. On one hand, it provides management with many critical, useful and needed functions. Yet there is evidence that it can adversely affect performance. This paper attempts to resolve this paradox by focusing on the issue of "fit". That is, in today's dynamic and turbulent environment, changes in either the business environment or the business strategy can lead to the need for new or revised measures and metrics. Yet, if these measures and metrics are not revised, or are incorrectly revised, we can encounter situations where what the firm wants to achieve (as communicated by its strategy) and what the firm measures and rewards are not synchronised with each other (i.e., there is a lack of "fit"). This situation can adversely affect the firm's ability to compete. The issue of fit is explored using a three-phase Delphi approach. Although initially intended to resolve this first paradox, the Delphi study identified another: in a dynamic environment, firms do revise their strategies, yet often the PMM system is not changed. To resolve this second paradox, the paper proposes a new framework, one that shows that under certain conditions the observed metrics "lag" is not only explainable but also desirable. The findings suggest a need to recast the accepted relationship between strategy and the PMM system, and the outputs include a Performance Alignment Matrix of practical utility for managers. © 2013.

Relevance:

10.00%

Publisher:

Abstract:

The subjectivity of inputs and the excessive number of inputs required are important factors limiting the practical effectiveness of software cost estimation models. To address these problems, a measurement-tool-based method for using software cost estimation models is proposed. By introducing instrumental variables from statistical theory, the method automatically converts the metric data collected by measurement tools into inputs for the cost estimation model. On the one hand, this avoids subjective and inconsistent inputs during model calibration and estimation, improving the accuracy and reliability of the estimates; on the other hand, it reduces the estimators' manual work, improving efficiency and increasing the usability of software cost estimation models. A concrete example demonstrates the feasibility and effectiveness of the proposed method.
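
As a toy illustration of the approach (Python): metric data emitted by a measurement tool is converted mechanically into a model input instead of being supplied by hand. The report format, field name and the use of the basic COCOMO effort formula are assumptions for illustration, not the paper's model.

    import json

    # Basic COCOMO, organic mode: effort (person-months) = 2.4 * KLOC ** 1.05
    A, B = 2.4, 1.05

    def effort_from_tool_report(report_json):
        """Derive the model input from a (hypothetical) tool report."""
        report = json.loads(report_json)
        kloc = report["loc"] / 1000.0   # measured by the tool, not guessed
        return A * kloc ** B

    print(effort_from_tool_report('{"loc": 32000}'))  # about 91 person-months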

Relevance:

10.00%

Publisher:

Abstract:

Refactoring is key to the continuous evolution of a software system, and it is also a complex and difficult activity. Traditional approaches to locating code in need of refactoring rely on the developer's observation and subjective judgement, which is time-consuming and labour-intensive, especially when much code requires refactoring. An automated method for locating refactoring candidates is therefore proposed. The method extracts code features using object-oriented software metrics, checks the feature data with correlation tests, compresses and interprets the features with principal component analysis, and groups similar code segments with cluster analysis, locating refactoring candidates quickly and accurately. A simple case study shows that the method is simple, effective, and superior to the traditional approach.
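
A minimal sketch of the described pipeline (Python with scikit-learn, assumed here; the abstract does not name the paper's toolchain). The four-metric matrix and cluster count are invented for illustration.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # rows = classes, columns = OO metrics (e.g. WMC, CBO, LCOM, DIT)
    X = np.array([[12, 4, 0.70, 2], [45, 19, 0.90, 5],
                  [10, 3, 0.60, 2], [50, 22, 0.95, 6]], dtype=float)

    Xs = StandardScaler().fit_transform(X)
    print(np.corrcoef(Xs, rowvar=False).round(2))  # correlation check

    Z = PCA(n_components=2).fit_transform(Xs)      # compress features
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(Z)
    print(labels)  # similar code segments share a cluster label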

Relevance:

10.00%

Publisher:

Abstract:

In two papers [Proc. SPIE 4471, 272-280 (2001) and Appl. Opt. 43, 2709-2721 (2004)], a logarithmic phase mask was proposed and proved to be effective in extending the depth of field; however, according to our research, this mask is not ideal, because the corresponding defocused modulation transfer function (MTF) has large oscillations in the low-frequency region, even when the mask is optimized. So, in a previously published paper [Opt. Lett. 33, 1171-1173 (2008)], we proposed an improved logarithmic phase mask by making a small modification. The new mask not only eliminates these drawbacks to a certain extent but is also even less sensitive to focus errors according to the Fisher information criterion. However, that performance comparison was carried out with the modified mask unoptimized, which was not reasonable. In this paper, we optimize the modified logarithmic phase mask first before analyzing its performance, and more convincing results are obtained based on the analysis of several frequently used metrics. (C) 2010 Optical Society of America
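
For context, the defocused MTF on which these comparisons rest is computed from the pupil function roughly as below (Python, 1-D scalar Fourier optics). A cubic mask stands in for the logarithmic one, whose exact form and optimized parameters are in the cited papers.

    import numpy as np

    N = 1024
    x = np.linspace(-1, 1, N)                  # normalised pupil coordinate
    alpha = 20.0                               # mask strength (assumed)
    psi = 10.0                                 # defocus parameter (assumed)

    phase = alpha * x**3 + psi * x**2          # stand-in mask + defocus
    pupil = np.exp(1j * phase)

    psf = np.abs(np.fft.fft(pupil, 4 * N))**2  # intensity PSF
    otf = np.fft.fft(psf)
    mtf = np.abs(otf) / np.abs(otf[0])         # normalise to DC

    print(mtf[:8].round(3))                    # low-frequency behaviour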

Relevance:

10.00%

Publisher:

Abstract:

Using remote sensing (RS) and GIS methods, Siping city is selected as a case study, with four remote sensing images spanning 25 years. Indices of urban morphology such as fractal dimension and compactness are employed to study the characteristics of urban expansion. Through digital processing and interpretation of the images, the process and characteristics of urban expansion are analysed using urban area change, fractal dimension and compactness. The results show three distinct phases in this period: the city expanded fastest in 1979~1991, while in 1992~2001 the emphasis on urban redevelopment slowed the expansion. This agrees with the Siping Statistical Yearbook, and indicates that combining urban-morphology metrics with statistical data can satisfactorily describe the process and characteristics of urban expansion. © 2008 IEEE.
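
A minimal sketch of the two morphology metrics named above, under common definitions (Python); the paper's exact variants are not stated in the abstract.

    import numpy as np

    def compactness(area, perimeter):
        """C = 2*sqrt(pi*A)/P: 1.0 for a circle, lower for ragged shapes."""
        return 2.0 * np.sqrt(np.pi * area) / perimeter

    def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
        """Box-counting fractal dimension of a binary urban-area raster."""
        counts = []
        for s in sizes:
            h, w = mask.shape[0] // s * s, mask.shape[1] // s * s
            blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
            counts.append((blocks.sum(axis=(1, 3)) > 0).sum())
        slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
        return slope

    print(compactness(area=100.0, perimeter=50.0))   # ~0.71
    mask = np.zeros((64, 64), dtype=bool)
    mask[16:48, 16:48] = True                        # a square "urban area"
    print(round(box_counting_dimension(mask), 2))    # 2.0 for a filled square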

Relevance:

10.00%

Publisher:

Abstract:

A novel competition dialysis assay was used to investigate the structural selectivity of a series of substituted 2-(2-naphthyl)quinoline compounds designed to target triplex DNA. The interaction of 14 compounds with 13 different nucleic acid sequences and structures was studied. A striking selectivity for the triplex structure poly(dA):[poly(dT)]2 was found for the majority of the compounds studied. Quantitative analysis of the competition dialysis binding data, using newly developed metrics, revealed that these compounds are among the most selective triplex-binding agents synthesized to date. A quantitative structure-affinity relationship (QSAR) was derived using triplex binding data for all 14 compounds. The QSAR revealed that the primary favorable determinant of triplex binding free energy is the solvent-accessible surface area. Triplex binding affinity is negatively correlated with compound electron affinity and with the number of hydrogen bond donors. The QSAR provides guidelines for the design of improved triplex-binding agents.
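
Schematically, the kind of linear QSAR described here can be fitted as below (Python with scikit-learn). Every number is an illustrative placeholder, not data from the study; only the choice of descriptors (SASA, electron affinity, hydrogen-bond-donor count) follows the abstract.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # columns: SASA (A^2), electron affinity (eV), nHBD; one row per compound
    X = np.array([[420, 1.1, 1], [455, 0.9, 0], [390, 1.4, 2],
                  [470, 0.8, 0], [430, 1.2, 1], [410, 1.3, 2]])
    dG = np.array([-8.1, -9.0, -7.2, -9.4, -8.3, -7.6])  # placeholder kcal/mol

    model = LinearRegression().fit(X, dG)
    print(model.coef_)  # coefficient signs show each descriptor's effect on dG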

Relevance:

10.00%

Publisher:

Abstract:

Similarity measurements between 3D objects and 2D images are useful for the tasks of object recognition and classification. We distinguish between two types of similarity metrics: metrics computed in image space (image metrics) and metrics computed in transformation space (transformation metrics). Existing methods typically use image metrics, which measure the difference between the observed image and the nearest view of the object. An example of such a measure is the Euclidean distance between feature points in the image and corresponding points in the nearest view. (Computing this measure is equivalent to solving the exterior orientation calibration problem.) In this paper we introduce a different type of metric: transformation metrics. These metrics penalize the deformations applied to the object to produce the observed image. We present a transformation metric that optimally penalizes "affine deformations" under weak perspective. A closed-form solution, together with the nearest view according to this metric, is derived. The metric is shown to be equivalent to the Euclidean image metric, in the sense that the two bound each other from above and below. For the Euclidean image metric we offer a sub-optimal closed-form solution and an iterative scheme to compute the exact solution.
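
In schematic form (the notation here is assumed for illustration, not taken from the paper), the two families minimize over different things. With image feature points p_i, model points P_i and a set G of allowed (e.g. rigid weak-perspective) transformations:

    \[ d_{\mathrm{image}}(I, M) = \min_{T \in \mathcal{G}} \sum_i \lVert p_i - T(P_i) \rVert^2 \]

    \[ d_{\mathrm{transf}}(I, M) = \min_{T \,:\, T(P_i) = p_i \ \forall i} \operatorname{dist}(T, \mathcal{G}) \]

The image metric measures the residual left in the image by the best allowed transformation; the transformation metric measures how far outside the allowed set a transformation must stray to reproduce the image exactly. The equivalence result quoted above says that the two quantities bound each other from above and below.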

Relevance:

10.00%

Publisher:

Abstract:

Security policies are increasingly being implemented by organisations. Policies are mapped to device configurations to enforce them, typically manually by network administrators. The development and management of these enforcement policies is a difficult and error-prone task. This thesis describes the development and evaluation of an off-line firewall policy parser and validation tool. This provides the system administrator with a textual interface and the vendor-specific low-level languages they trust and are familiar with, supported by an off-line compiler tool. The tool was created using the Microsoft C#.NET language and the Microsoft Visual Studio Integrated Development Environment (IDE). This provided an object environment for creating a flexible and extensible system, as well as simple Web and Windows prototyping facilities for creating GUI front-end applications for testing and evaluation. A CLI was provided with the tool for more experienced users, but it was also designed to be easily integrated into GUI-based applications for non-expert users. The evaluation of the system was performed from a custom-built GUI application, which can create test firewall rule sets containing synthetic rules to supply a variety of experimental conditions, as well as record various performance metrics. The validation tool was created with a pragmatic outlook regarding the needs of the network administrator. Modularity was an important design goal, owing to the fast-changing nature of the network device languages being processed. An object-oriented approach was taken for maximum changeability and extensibility, and a flexible tool was developed to serve the needs of different types of users: system administrators want low-level, CLI-based tools that they can trust and use easily from scripting languages, while inexperienced users may prefer a more abstract, high-level GUI or wizard with an easier-to-learn process. Built around these ideas, the tool was implemented and proved to be a usable and complementary addition to the many network policy-based systems currently available, in contrast to some other tools that work across multiple vendor languages but do not implement a deep range of options for any of them. It has a flexible design and comprehensive functionality, and complements existing systems such as policy compliance tools and abstract policy analysis systems. Its validation algorithms were evaluated for both completeness and performance; the tool was found to process large firewall policies correctly in just a few seconds. A framework for a policy-based management system, with which the tool would integrate, is also proposed. This is based around a vendor-independent XML-based repository of device configurations, which could be used to bring together existing policy management and analysis systems.
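
As one concrete example of the validation such a tool performs, the sketch below detects shadowed rules, i.e. rules that an earlier, broader rule with a different action prevents from ever firing. It is written in Python with a deliberately simplified rule model, not the thesis's C# implementation.

    from ipaddress import ip_network

    class Rule:
        def __init__(self, src, dst, proto, action):
            self.src, self.dst = ip_network(src), ip_network(dst)
            self.proto, self.action = proto, action

    def covers(a, b):
        """True if rule a matches every packet that rule b matches."""
        return (b.src.subnet_of(a.src) and b.dst.subnet_of(a.dst)
                and a.proto in (b.proto, "any"))

    def shadowed(rules):
        """Yield (earlier, later) index pairs where 'later' is dead."""
        for j, rj in enumerate(rules):
            for i in range(j):
                if covers(rules[i], rj) and rules[i].action != rj.action:
                    yield i, j
                    break

    rules = [Rule("10.0.0.0/8", "0.0.0.0/0", "any", "deny"),
             Rule("10.1.0.0/16", "0.0.0.0/0", "tcp", "allow")]
    print(list(shadowed(rules)))  # [(0, 1)]: rule 1 can never fire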

Relevance:

10.00%

Publisher:

Abstract:

The identification of subject-specific traits extracted from patterns of brain activity still represents an important challenge. The need to detect distinctive brain features, which is relevant for biometric and brain-computer interface systems, has also been emphasized in monitoring the effect of clinical treatments and in evaluating the progression of brain disorders. Graph theory and network science tools have revealed fundamental mechanisms of functional brain organization in resting-state M/EEG analysis. Nevertheless, it is still not clearly understood how several methodological aspects may bias the topology of the reconstructed functional networks. In this context, the literature shows inconsistency in the chosen length of the selected epochs, impeding meaningful comparison between results from different studies. In this study we propose an approach that investigates the existence of a distinctive functional core (sub-network) using an unbiased reconstruction of network topology. Brain signals from a public and freely available EEG dataset were analyzed using a phase-synchronization-based measure, the minimum spanning tree and k-core decomposition. The analysis was performed for each classical brain rhythm separately. Furthermore, we aim to provide a network approach insensitive to the effects that epoch length has on functional connectivity (FC) and network reconstruction. Two different measures, the phase lag index (PLI) and the amplitude envelope correlation (AEC), were applied to EEG resting-state recordings of eighteen healthy volunteers. The weighted clustering coefficient (CCw), weighted characteristic path length (Lw) and minimum spanning tree (MST) parameters were computed to evaluate network topology. The analysis was performed on both scalp and source-space data. Results on the distinctive functional core show the highest classification rates from k-core decomposition in the gamma (EER=0.130, AUC=0.943) and high beta (EER=0.172, AUC=0.905) frequency bands. Results from the scalp analysis of the influence of epoch length show a decrease in both mean PLI and AEC values with increasing epoch length, with a tendency to stabilize at a length of 12 seconds for PLI and 6 seconds for AEC. Moreover, CCw and Lw show very similar behaviour, with metrics based on AEC more reliable in terms of stability. In general, MST parameters stabilize at short epoch lengths, particularly for MSTs based on PLI (1-6 seconds, versus 4-8 seconds for AEC). At the source level the results were even more reliable, with stability already at 1 second duration for PLI-based MSTs. Our results confirm that EEG analysis may represent an effective tool to identify subject-specific characteristics of great potential impact for several bioengineering applications. Regarding epoch length, the present work suggests that both PLI and AEC depend on epoch length, and that this has an impact on the reconstructed network topology, particularly at the scalp level. Source-level MST topology is less sensitive to differences in epoch length, enabling the comparison of brain network topology between different studies.
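
A minimal sketch of two of the core ingredients, the PLI and an MST built from a PLI connectivity matrix, under their standard definitions (Python; band-pass filtering, epoching and the AEC variant are omitted):

    import numpy as np
    from scipy.signal import hilbert
    from scipy.sparse.csgraph import minimum_spanning_tree

    def pli(x, y):
        """PLI = |mean(sign(sin(phase_x - phase_y)))|, in [0, 1]."""
        dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
        return abs(np.mean(np.sign(np.sin(dphi))))

    def mst_from_pli(epoch):
        """epoch: channels x samples; strong links become short distances."""
        n = epoch.shape[0]
        w = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                w[i, j] = 1.0 - pli(epoch[i], epoch[j])   # distance
        return minimum_spanning_tree(w)                    # n-1 edges

    rng = np.random.default_rng(0)
    print(mst_from_pli(rng.standard_normal((4, 512))).toarray().round(2))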