248 results for Kähler-Einstein Metrics


Relevance:

10.00%

Publisher:

Abstract:

To date, available literature mainly discusses Twitter activity patterns in the context of individual case studies, while comparative research on a large number of communicative events, their dynamics and patterns is missing. By conducting a comparative study of more than forty different cases (covering topics such as elections, natural disasters, corporate crises, and televised events) we identify a number of distinct types of discussion which can be observed on Twitter. Drawing on a range of communicative metrics, we show that thematic and contextual factors influence the usage of different communicative tools available to Twitter users, such as original tweets, @replies, retweets, and URLs. Based on this first analysis of the overall metrics of Twitter discussions, we also demonstrate stable patterns in the use of Twitter in the context of major topics and events.
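
The communicative metrics the study draws on can be illustrated with a minimal sketch; the classification rules below (prefix tests for @replies and retweets, a substring test for URLs) are simplified assumptions, not the authors' exact operationalisation.

```python
# Sketch: per-event communicative metrics of the kind the study compares.
# Assumes tweets are dicts with a "text" key; the classification rules are
# simplified illustrations, not the authors' exact operationalisation.

def communicative_metrics(tweets):
    """Return the share of @replies, retweets, and URL-carrying tweets."""
    n = len(tweets)
    if n == 0:
        return {}
    replies  = sum(t["text"].startswith("@") for t in tweets)
    retweets = sum(t["text"].startswith("RT @") for t in tweets)
    with_url = sum("http" in t["text"] for t in tweets)
    return {
        "pct_replies":  replies / n,
        "pct_retweets": retweets / n,
        "pct_urls":     with_url / n,
        "pct_original": (n - replies - retweets) / n,
    }

# Comparing these shares across events (elections, disasters, televised
# events) is what lets distinct discussion types be identified.
event_a = [{"text": "RT @abc: breaking news http://t.co/x"}, {"text": "@abc agreed"}]
print(communicative_metrics(event_a))
```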

Relevance:

10.00%

Publisher:

Abstract:

A graph theoretic approach is developed for accurately computing haulage costs in earthwork projects. This is vital as haulage is a predominant factor in the real cost of earthworks. A variety of metrics can be used in our approach, but a fuel consumption proxy is recommended. This approach is novel as it considers the constantly changing terrain that results from cutting and filling activities and replaces the inaccurate “static” calculations that have been used previously. The approach is also capable of efficiently correcting violations of the top-down cutting and bottom-up filling conditions that can be found in existing earthwork assignments and sequences. This approach assumes that the project site is partitioned into uniform blocks. A directed graph is then utilised to describe the terrain surface. This digraph is altered after each cut and fill in order to reflect the true state of the terrain. A shortest path algorithm is successively applied to calculate the cost of each haul, and these costs are summed to provide a total cost of haulage.
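
A minimal sketch of the costing loop described above, assuming the block digraph is given as an adjacency dict and that the caller supplies the terrain-update rule; the edge weights stand in for the fuel consumption proxy.

```python
import heapq

# Sketch of the graph-theoretic haulage costing: blocks are nodes, edges
# carry a fuel-consumption proxy, and each haul is priced by a shortest path.
# The terrain-update rule is left to the caller, as its details are not
# specified here.

def dijkstra(graph, source, target):
    """Cheapest haul cost from source to target; graph: node -> {nbr: cost}."""
    dist, heap = {source: 0.0}, [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

def total_haulage_cost(graph, hauls, update_terrain):
    """Sum shortest-path costs over successive hauls, re-shaping the digraph
    after each cut/fill so costs always reflect the current terrain."""
    total = 0.0
    for cut_block, fill_block in hauls:
        total += dijkstra(graph, cut_block, fill_block)
        update_terrain(graph, cut_block, fill_block)  # alter edges post-haul
    return total

roads = {"A": {"B": 2.0}, "B": {"C": 1.5}, "C": {}}
print(dijkstra(roads, "A", "C"))  # 3.5
```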

Relevance:

10.00%

Publisher:

Abstract:

Increasing global competition, rapid technological changes, advances in manufacturing and information technology and discerning customers are forcing supply chains to adopt improvement practices that enable them to deliver high quality products at a lower cost and in a shorter period of time. A lean initiative is one of the most effective approaches toward achieving this goal. In the lean improvement process, it is critical to measure current and desired performance levels in order to clearly evaluate the lean implementation efforts. Many attempts have been made to measure supply chain performance incorporating both quantitative and qualitative measures, but they have failed to provide an effective method of measuring improvements in performance for dynamic lean supply chain situations. Therefore, appropriate measurement of lean supply chain performance has become imperative. There are many lean tools available for supply chains; however, the effectiveness of a lean tool depends on the type of the product and supply chain. One tool may be highly effective for a supply chain involved in high volume products but may not be effective for low volume products. There is currently no systematic methodology available for selecting appropriate lean strategies based on the type of supply chain and market strategy. This thesis develops an effective method to measure the performance of supply chains using both quantitative and qualitative metrics, and investigates the effects of product types and lean tool selection on supply chain performance. Supply chain performance metrics and the effects of various lean tools on the performance metrics mentioned in the SCOR framework have been investigated. A lean supply chain model based on the SCOR metric framework is then developed in which non-lean and lean, quantitative and qualitative metrics are incorporated as appropriate. The values of the metrics are converted into triangular fuzzy numbers using similarity rules and heuristic methods. Data have been collected from an apparel manufacturing company for multiple supply chain products, and a fuzzy based method is applied to measure the performance improvements in supply chains. Using the fuzzy TOPSIS method, which chooses an optimum alternative so as to maximise similarity with the positive ideal solution and minimise similarity with the negative ideal solution, the performances of lean and non-lean supply chain situations for three different apparel products have been evaluated. To address the research questions related to an effective performance evaluation method and the effects of lean tools on different types of supply chains, a conceptual framework and two hypotheses are investigated. Empirical results show that implementation of lean tools has significant effects on performance improvements in terms of time, quality and flexibility. The fuzzy TOPSIS based method developed is able to integrate multiple supply chain metrics into a single performance measure, while the lean supply chain model incorporates qualitative and quantitative metrics; it can therefore effectively measure the improvements in a supply chain after implementing lean tools. It is demonstrated that the product types involved in the supply chain and the ability to select the right lean tools have a significant effect on lean supply chain performance. Future work could conduct multiple case studies in different contexts.
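
The fuzzy TOPSIS step can be sketched as follows, assuming benefit criteria already normalised into triangular fuzzy numbers (l, m, u) and using the standard vertex distance; the thesis's exact weighting and similarity rules are not reproduced, and the data is illustrative.

```python
import math

# Minimal fuzzy TOPSIS sketch for ranking supply chain alternatives.
# Triangular fuzzy numbers are (l, m, u) tuples; ideal solutions and the
# vertex distance follow the standard fuzzy TOPSIS formulation.

def vertex_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / 3.0)

def fuzzy_topsis(matrix):
    """matrix[i][j] = TFN score of alternative i on criterion j (benefit
    criteria, already normalised). Returns closeness coefficients."""
    n_crit = len(matrix[0])
    # Fuzzy positive/negative ideal solutions per criterion.
    fpis = [(max(r[j][0] for r in matrix), max(r[j][1] for r in matrix),
             max(r[j][2] for r in matrix)) for j in range(n_crit)]
    fnis = [(min(r[j][0] for r in matrix), min(r[j][1] for r in matrix),
             min(r[j][2] for r in matrix)) for j in range(n_crit)]
    ccs = []
    for row in matrix:
        d_pos = sum(vertex_distance(row[j], fpis[j]) for j in range(n_crit))
        d_neg = sum(vertex_distance(row[j], fnis[j]) for j in range(n_crit))
        ccs.append(d_neg / (d_pos + d_neg))  # closer to 1 = nearer the ideal
    return ccs

# Two alternatives (e.g. lean vs non-lean) scored on time and quality:
lean     = [(0.6, 0.8, 1.0), (0.5, 0.7, 0.9)]
non_lean = [(0.3, 0.5, 0.7), (0.4, 0.6, 0.8)]
print(fuzzy_topsis([lean, non_lean]))
```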

Relevance:

10.00%

Publisher:

Abstract:

Daylight devices are important components of any climate responsive façade system. However, the evolution of parametric CAD systems and digital fabrication has had an impact on architectural form, so that regular forms are shifting to complex geometries. Architectural and engineering integration of daylight devices in envelopes with complex geometries is a challenge in terms of design and performance evaluation. The purpose of this paper is to assess the daylight performance of a building with a climatically responsive envelope of complex geometry that integrates shading devices in the façade. The case study is based on the Esplanade buildings in Singapore. Climate-based daylight metrics such as Daylight Availability and Useful Daylight Illuminance are used. The DIVA (daylight simulation) and Grasshopper (parametric analysis) plug-ins for Rhinoceros have been employed to examine the range of performance possibilities. Parameters such as dimension, inclination of the device, projected shadows and shape have been varied in order to maximize Daylight Availability and Useful Daylight Illuminance while minimizing glare probability. While orientation did not have a great impact on the results, the aperture of the shading devices did, showing that shading devices with a projection of 1.75 m to 2.00 m performed best, achieving target lighting levels without issues of glare.
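
The parametric search can be imagined as a sweep like the one below; `simulate_daylight` is a hypothetical stand-in for a DIVA run, and the glare cap and parameter ranges are illustrative assumptions rather than the study's settings.

```python
from itertools import product

# Illustrative parameter sweep of the kind Grasshopper drives in the study.
# `simulate_daylight` is a hypothetical stand-in for a daylight simulation;
# the 1.50-2.25 m projections echo the paper's reported sweet spot, while
# the inclinations and the glare cap are invented placeholders.

def sweep(simulate_daylight):
    best = None
    for projection, inclination in product(
            [1.50, 1.75, 2.00, 2.25],      # device projection (m)
            [0, 15, 30, 45]):              # device inclination (degrees)
        udi, glare = simulate_daylight(projection, inclination)
        if glare <= 0.4 and (best is None or udi > best[0]):  # illustrative glare cap
            best = (udi, projection, inclination)
    return best

# e.g. sweep(lambda p, i: run_diva_model(p, i))  # hypothetical wrapper
```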

Relevance:

10.00%

Publisher:

Abstract:

We present a mini-review of the development and contemporary applications of diffusion-sensitive nuclear magnetic resonance (NMR) techniques in biomedical sciences. Molecular diffusion is a fundamental physical phenomenon present in all biological systems. Due to the connection between experimentally measured diffusion metrics and the microscopic environment sensed by the diffusing molecules, diffusion measurements can be used for characterisation of molecular size, molecular binding and association, and the morphology of biological tissues. The emergence of magnetic resonance was instrumental to the development of biomedical applications of diffusion. We discuss the fundamental physical principles of diffusion NMR spectroscopy and diffusion MR imaging. The emphasis is placed on conceptual understanding, historical evolution and practical applications rather than complex technical details. Mathematical description of diffusion is presented to the extent that it is required for the basic understanding of the concepts. We present a wide range of spectroscopic and imaging applications of diffusion magnetic resonance, including colloidal drug delivery vehicles; protein association; characterisation of cell morphology; neural fibre tractography; cardiac imaging; and the imaging of load-bearing connective tissues. This paper is intended as an accessible introduction into the exciting and growing field of diffusion magnetic resonance.
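
As one concrete example of the underlying mathematics (a standard result, not quoted from the review itself), the Stejskal-Tanner relation links the measured spin-echo signal attenuation to the diffusion coefficient D:

```latex
\frac{S}{S_0} = \exp(-bD),
\qquad
b = \gamma^{2} g^{2} \delta^{2} \left( \Delta - \frac{\delta}{3} \right),
```

where γ is the gyromagnetic ratio, g and δ are the gradient amplitude and duration, and Δ is the diffusion time. Fitting the attenuation across several b-values yields D, which in turn reports on molecular size, binding and tissue microstructure.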

Relevance:

10.00%

Publisher:

Abstract:

IEEE 802.11p is the new standard for Inter-Vehicular Communications (IVC) using the 5.9 GHz frequency band, as part of the DSRC framework; it will enable applications based on Cooperative Systems. Simulation is widely used to estimate or verify the potential benefits of such cooperative applications, notably in terms of safety for drivers. We have developed a performance model for 802.11p that can be used by simulations of cooperative applications (e.g. collision avoidance) without requiring intricate models of the whole IVC stack. Instead, it provides a straightforward yet realistic model of IVC performance. Our model uses data from extensive field trials to infer the correlation between speed, distance and performance metrics such as maximum range, latency and frame loss. We then improve this model to limit the number of profiles that have to be generated when there are more than a few emitter-receiver pairs in a given location. Our model generates realistic performance figures for rural and suburban environments among small groups of IVC-equipped vehicles and roadside units.
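
A sketch of how such a trial-derived model might be consumed by an application simulator: frame loss is interpolated from distance profiles keyed by speed band. The profile numbers below are invented placeholders, not the authors' field-trial data.

```python
import bisect

# Sketch of a field-trial-driven performance model in the spirit described
# above. Profiles map distance (m) -> frame loss ratio, one per emitter
# speed band (km/h); all numbers are invented placeholders.

PROFILES = {
    50:  [(0, 0.01), (200, 0.05), (400, 0.20), (600, 0.60), (800, 1.00)],
    100: [(0, 0.02), (200, 0.10), (400, 0.35), (600, 0.80), (800, 1.00)],
}

def frame_loss(speed_kmh, distance_m):
    """Linear interpolation inside the closest speed profile."""
    profile = PROFILES[min(PROFILES, key=lambda s: abs(s - speed_kmh))]
    xs = [d for d, _ in profile]
    i = bisect.bisect_left(xs, distance_m)
    if i == 0:
        return profile[0][1]
    if i == len(xs):
        return profile[-1][1]
    (x0, y0), (x1, y1) = profile[i - 1], profile[i]
    return y0 + (y1 - y0) * (distance_m - x0) / (x1 - x0)

print(frame_loss(80, 300))  # loss ratio for an 80 km/h pair 300 m apart
```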

Relevance:

10.00%

Publisher:

Abstract:

In recent years, there has been a significant increase in the popularity of ontological analysis of conceptual modelling techniques. To date, related research explores the ontological deficiencies of classical techniques such as ER or UML modelling, as well as business process modelling techniques such as ARIS or even Web Services standards such as BPEL4WS, BPML, ebXML, BPSS and WSCI. While the ontologies that form the basis of these analyses are reasonably mature, it is the actual process of an ontological analysis that still lacks rigour. The current procedure is prone to individual interpretations and is one reason for criticism of the entire ontological analysis. This paper presents a procedural model for ontological analysis based on the use of meta models, multiple coders and metrics. The model is supported by examples from various ontological analyses.
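
The paper's metrics are not reproduced here; as one example of the kind of measurement a multi-coder procedure enables, inter-coder agreement on ontological mappings is commonly summarised with Cohen's kappa, sketched below for two coders.

```python
from collections import Counter

# Illustration only: Cohen's kappa as a standard agreement metric between
# two coders who independently mapped the same modelling constructs to
# ontological categories. The labels are a toy example.

def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    expected = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

coder1 = ["thing", "property", "thing", "state", "property"]
coder2 = ["thing", "property", "state", "state", "property"]
print(cohens_kappa(coder1, coder2))  # 1.0 would mean perfect agreement
```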

Relevance:

10.00%

Publisher:

Abstract:

Earthwork planning has been considered in this article and a generic block partitioning and modelling approach has been devised to provide strategic plans at various levels of detail. Conceptually this approach is more accurate and comprehensive than others, for instance those that are section based. In response to environmental concerns, the metrics for decision making were fuel consumption and emissions. Haulage distance and gradient are also included as they are important components of these metrics. Advantageously, the fuel consumption metric is generic and captures the physical difficulty of travelling over inclines of different gradients in a way that is consistent across all hauling vehicles. For validation, the proposed models and techniques have been applied to a real world road project. The numerical investigations have demonstrated that the models can be solved with relatively little CPU time. The proposed block models also result in solutions of superior quality, i.e. they have reduced fuel consumption and cost. Furthermore, the plans differ considerably from those based solely upon a distance based metric, thus demonstrating a need for industry to reflect upon its current practices.
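
A minimal sketch of what a gradient-aware fuel proxy could look like; the coefficients are illustrative placeholders, and the point is simply that climbing costs extra fuel in a vehicle-independent way, which a pure distance metric misses.

```python
# Sketch of a gradient-aware fuel-consumption proxy of the kind advocated
# above. Coefficients are illustrative placeholders, not the paper's values.

def haul_fuel_proxy(distance_m, rise_m, rolling_coeff=0.02, grade_coeff=1.0):
    """Relative fuel cost of one haul: rolling resistance over the distance
    plus work against gravity on the climb (downhill assumed free-wheeling)."""
    return rolling_coeff * distance_m + grade_coeff * max(rise_m, 0.0)

flat   = haul_fuel_proxy(500, 0)    # 500 m on the flat
uphill = haul_fuel_proxy(500, 25)   # same distance, 5% upgrade
print(flat, uphill)                 # a distance-only metric treats these equally
```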

Relevance:

10.00%

Publisher:

Abstract:

Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories, and ultimately in clinical laboratory settings, require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of, and need for, an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
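
Three of the stated pass/fail thresholds can be applied mechanically to replicate injections, as in this sketch (peak width CV would be checked the same way); the replicate values are invented.

```python
import statistics

# Sketch: applying the pass/fail criteria quoted above to replicate runs of
# one monitored analyte. Thresholds are the ones stated in the abstract;
# the replicate values are invented.

def system_suitability(peak_areas, rts):
    """Return (pass?, report) for one analyte across replicate injections."""
    area_cv = statistics.stdev(peak_areas) / statistics.mean(peak_areas)
    rt_sd = statistics.stdev(rts)
    rt_drift = max(rts) - min(rts)
    checks = {
        "peak area CV < 0.15": area_cv < 0.15,
        "RT SD < 0.15 min":    rt_sd < 0.15,
        "RT drift < 0.5 min":  rt_drift < 0.5,
    }
    return all(checks.values()), checks

ok, report = system_suitability(
    peak_areas=[1.02e6, 0.97e6, 1.05e6, 0.99e6],
    rts=[21.40, 21.45, 21.38, 21.47],
)
print(ok, report)
```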

Relevance:

10.00%

Publisher:

Abstract:

Novel computer vision techniques have been developed for automatic monitoring of crowded environments such as airports, railway stations and shopping malls. Using video feeds from multiple cameras, the techniques enable crowd counting, crowd flow monitoring, queue monitoring and abnormal event detection. The outcome of the research is useful for surveillance applications and for obtaining operational metrics to improve business efficiency.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a long-term experiment where a mobile robot uses adaptive spherical views to localize itself and navigate inside a non-stationary office environment. The office contains seven members of staff and experiences a continuous change in its appearance over time due to their daily activities. The experiment runs as an episodic navigation task in the office over a period of eight weeks. The spherical views are stored in the nodes of a pose graph and they are updated in response to the changes in the environment. The updating mechanism is inspired by the concepts of long- and short-term memories. The experimental evaluation is done using three performance metrics which evaluate the quality of both the adaptive spherical views and the navigation over time.
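
Purely as a schematic reading of the long-/short-term memory idea (the paper's actual update rule may differ substantially), a per-view feature store might be updated like this:

```python
# Schematic sketch of a long-/short-term-memory update for the features
# stored in one spherical view. Promotion thresholds and the decay rule are
# assumptions for illustration, not the paper's mechanism.

def update_view(view, observed_ids, promote_after=3):
    """view: {feature_id: (store, score)} with store in {"STM", "LTM"}."""
    updated = {}
    for fid, (store, score) in view.items():
        if fid in observed_ids:                    # re-observed: reinforce
            score += 1
            if store == "STM" and score >= promote_after:
                store = "LTM"                      # stable -> long-term memory
        else:                                      # missed: decay
            score -= 1
        if score > 0 or store == "LTM":            # STM features can be forgotten
            updated[fid] = (store, score)
    for fid in observed_ids - view.keys():         # new features enter STM
        updated[fid] = ("STM", 1)
    return updated

view = {1: ("LTM", 5), 2: ("STM", 1)}
print(update_view(view, observed_ids={1, 3}))      # feature 2 decays away
```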

Relevance:

10.00%

Publisher:

Abstract:

Quality of experience (QoE) measures the overall perceived quality of mobile video delivery from subjective user experience and objective system performance. Current QoE computing models have two main limitations: 1) insufficient consideration of the factors influencing QoE; and 2) limited studies on QoE models for acceptability prediction. In this paper, a set of novel acceptability-based QoE models, denoted as A-QoE, is proposed based on the results of comprehensive user studies on subjective quality acceptance assessments. The models are able to predict users’ acceptability and pleasantness in various mobile video usage scenarios. Statistical regression analysis has been used to build the models with a group of influencing factors as independent predictors, including encoding parameters and bitrate, video content characteristics, and mobile device display resolution. The performance of the proposed A-QoE models has been compared with three well-known objective Video Quality Assessment metrics: PSNR, SSIM and VQM. The proposed A-QoE models have high prediction accuracy and usage flexibility. Future user-centred mobile video delivery systems can benefit from applying the proposed QoE-based management to optimize video coding and quality delivery decisions.
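
A hedged sketch of an acceptability predictor in the A-QoE spirit: logistic regression from the named factor families (encoding bitrate, content characteristics, display resolution) to a binary acceptability judgement. All data below is synthetic; the paper's actual model form and coefficients are not reproduced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch only: regression-based acceptability prediction on synthetic data.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.uniform(100, 4000, n),        # bitrate (kbps)
    rng.uniform(0.0, 1.0, n),         # content motion complexity
    rng.choice([480, 720, 1080], n),  # device display height (px)
])
# Synthetic ground truth: higher bitrate and resolution -> more acceptable.
y = (X[:, 0] / 4000 + X[:, 2] / 1080 - X[:, 1] > 0.8).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict_proba([[1500, 0.5, 720]])[0, 1])  # P(acceptable)
```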

Relevance:

10.00%

Publisher:

Abstract:

Interdisciplinary research is often funded by national government initiatives or large corporate sponsorship, and as such, demands periodic reporting on the use of those funds. For reasons of accountability, governance and communication to the taxpayer, knowledge of the outcomes of the research needs to be measured and understood. The interdisciplinary approach to research raises many challenges for impact reporting. This presentation will consider best practice workflow models and methodologies. Novel methodologies that can be added to the usual metrics of academic publications include analysis of percentage share of total publications in a subject or keyword field, calculating the most cited publication in a key phrase category, analysis of who has cited or reviewed the work, and benchmarking of this data against others in the same category. At QUT, interest in how collaborative networking is trending in a research theme has led to the creation of some useful co-authorship graphs that demonstrate the network positions of authors and the strength of their scientific collaborations within a group. The scale of international collaborations is also worth including in the assessment. However, despite all of the tools and techniques available, the most useful way a researcher can help themselves and the process is to set up and maintain their researcher identifier and profile.
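
A small sketch of the co-authorship graph idea, using networkx on a toy publication list: authors become nodes, joint papers accumulate edge weight, and degree centrality gives a crude view of network position.

```python
import networkx as nx
from itertools import combinations

# Toy co-authorship graph: the publication list is invented; in practice it
# would come from a bibliographic database tied to researcher identifiers.
papers = [
    ["Ada", "Ben", "Cho"],
    ["Ada", "Ben"],
    ["Ben", "Cho", "Dia"],
]

G = nx.Graph()
for authors in papers:
    for a, b in combinations(authors, 2):
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)     # strength of the collaboration

print(nx.degree_centrality(G))             # network position of each author
print(G["Ada"]["Ben"]["weight"])           # 2 joint papers
```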

Relevance:

10.00%

Publisher:

Abstract:

The detection and correction of defects remains among the most time consuming and expensive aspects of software development. Extensive automated testing and code inspections may mitigate their effect, but some code fragments are necessarily more likely to be faulty than others, and automated identification of fault prone modules helps to focus testing and inspections, thus limiting wasted effort and potentially improving detection rates. However, software metrics data is often extremely noisy, with enormous imbalances in the size of the positive and negative classes. In this work, we present a new approach to predictive modelling of fault proneness in software modules, introducing a new feature representation to overcome some of these issues. This rank sum representation offers improved or at worst comparable performance to earlier approaches for standard data sets, and readily allows the user to choose an appropriate trade-off between precision and recall to optimise inspection effort to suit different testing environments. The method is evaluated using the NASA Metrics Data Program (MDP) data sets, and performance is compared with existing studies based on the Support Vector Machine (SVM) and Naïve Bayes (NB) Classifiers, and with our own comprehensive evaluation of these methods.
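
The paper's exact rank sum construction is not reproduced here; the sketch below shows one plausible reading, in which each metric value is replaced by its rank across modules and the per-module rank sum orders inspection effort, with the cut-off chosen to trade precision against recall.

```python
import numpy as np
from scipy.stats import rankdata

# One plausible reading of a rank-sum feature representation (not necessarily
# the paper's exact construction): every software metric is converted to its
# rank across all modules, and the per-module rank sum orders modules from
# least to most fault-prone-looking.

metrics = np.array([          # rows = modules; cols = e.g. LOC, complexity, churn
    [120,  4,  2],
    [800, 25, 14],
    [300, 10,  5],
    [ 50,  2,  1],
])

ranks = np.column_stack([rankdata(metrics[:, j]) for j in range(metrics.shape[1])])
rank_sum = ranks.sum(axis=1)
inspect_order = np.argsort(-rank_sum)      # highest rank sum inspected first
print(rank_sum, inspect_order)             # cut off wherever the testing budget ends
```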