881 results for Automated segmentation


Relevance:

20.00%

Publisher:

Abstract:

Accurate calibration of a head-mounted display (HMD) is essential both for research on the visual system and for realistic interaction with virtual objects. Yet existing calibration methods are time-consuming, depend on human judgements (making them error-prone), and are often limited to optical see-through HMDs. Building on our existing approach to HMD calibration (Gilson et al., 2008), we show here how it is possible to calibrate a non-see-through HMD. A camera is placed inside an HMD displaying an image of a regular grid, which is captured by the camera. The HMD is then removed and the camera, which remains fixed in position, is used to capture images of a tracked calibration object in multiple positions. The centroids of the markers on the calibration object are recovered and their locations re-expressed in relation to the HMD grid. This allows established camera calibration techniques to be used to recover estimates of the HMD display's intrinsic parameters (width, height, focal length) and extrinsic parameters (optic centre and orientation of the principal ray). We calibrated an HMD in this manner and report the magnitude of the errors between real image features and reprojected features. Our calibration method produces low reprojection errors without the need for error-prone human judgements.
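As a loose illustration of the "established camera calibration techniques" the abstract refers to (not the authors' actual implementation), the sketch below estimates a pinhole projection matrix from 2D-3D marker correspondences via the Direct Linear Transform and reports the mean reprojection error. All camera parameters and point data here are synthetic:

```python
import numpy as np

def dlt_projection_matrix(pts3d, pts2d):
    """Estimate a 3x4 pinhole projection matrix from 2D-3D
    correspondences with the Direct Linear Transform (DLT)."""
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def reprojection_error(P, pts3d, pts2d):
    """Mean Euclidean distance between observed and reprojected points."""
    homog = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    proj = (P @ homog.T).T
    proj = proj[:, :2] / proj[:, 2:3]
    return float(np.mean(np.linalg.norm(proj - pts2d, axis=1)))

# Synthetic ground truth: a known camera observing noise-free marker centroids.
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]], dtype=float)
Rt = np.hstack([np.eye(3), np.array([[0.1], [0.0], [2.0]])])
P_true = K @ Rt
pts3d = np.random.default_rng(0).uniform(-1, 1, (12, 3))
homog = np.hstack([pts3d, np.ones((12, 1))])
proj = (P_true @ homog.T).T
pts2d = proj[:, :2] / proj[:, 2:3]

P_est = dlt_projection_matrix(pts3d, pts2d)
print(reprojection_error(P_est, pts3d, pts2d))  # ~0 for noise-free data
```

With real, noisy marker centroids the error is non-zero, and it is exactly this residual that the paper reports as its calibration quality measure.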


The application of automatic segmentation methods in lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that low similarity should be found in the regions of lesions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach, which removes enlarged ventricles from detected lesions. The performance investigation using simulated lesions demonstrated that not only were the majority of lesions well detected, but normal tissues were also identified effectively. Tests on images acquired in stroke patients further confirmed the strength of the method in lesion detection. When compared with the previous version, the current approach showed a higher sensitivity in detecting small lesions and produced fewer false positives around the ventricles and the edge of the brain.
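The core idea, flagging voxels where an intensity-based segmentation disagrees with location-based tissue priors, can be sketched as follows. This is a minimal illustration on toy data, using histogram intersection as the per-voxel similarity; the paper's exact similarity measure and threshold selection differ:

```python
import numpy as np

def lesion_candidates(fuzzy_seg, tissue_prior, threshold=0.5):
    """Flag voxels where an intensity-based fuzzy segmentation disagrees
    with a location-based tissue probability map.

    Both inputs have shape (..., n_classes): per-voxel class memberships.
    Per-voxel similarity is the histogram intersection of the two
    distributions, rescaled to [0, 1] over the volume (the "normalized
    similarity"); voxels below `threshold` are returned as candidates.
    """
    sim = np.minimum(fuzzy_seg, tissue_prior).sum(axis=-1)
    sim = (sim - sim.min()) / (sim.max() - sim.min() + 1e-12)
    return sim < threshold

# Toy "volume": 4 voxels x 3 tissue classes (GM, WM, CSF).
fuzzy = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.9, 0.05, 0.05],   # intensity says GM ...
                  [0.1, 0.1, 0.8]])
prior = np.array([[0.7, 0.2, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.05, 0.9, 0.05],   # ... location says WM: suspect voxel
                  [0.1, 0.2, 0.7]])
print(lesion_candidates(fuzzy, prior, 0.5))  # [False False  True False]
```

Only the third voxel, where the intensity-based and location-based distributions conflict, falls below the normalized similarity threshold.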


There is a rising demand for quantitative performance evaluation of automated video surveillance. To advance research in this area, it is essential that comparisons between detection and tracking approaches can be drawn and that improvements in existing methods can be measured. There are a number of challenges related to the proper evaluation of motion segmentation, tracking, event recognition and other components of a video surveillance system that are unique to the video surveillance community. These include the volume of data that must be evaluated, the difficulty in obtaining ground truth data, the definition of appropriate metrics, and achieving meaningful comparison of diverse systems. This chapter provides descriptions of useful benchmark datasets and their availability to the computer vision community. It outlines some ground truth and evaluation techniques, and provides links to useful resources. It concludes by discussing the future direction of benchmark datasets and their associated processes.
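As an example of the kind of metric such benchmarks rely on, the sketch below scores detected bounding boxes against ground truth by greedy IoU matching and reports precision and recall. The boxes and threshold are illustrative, not drawn from any particular benchmark:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def precision_recall(detections, ground_truth, iou_thresh=0.5):
    """Greedily match each detection to its best unmatched ground-truth
    box; a match counts as a true positive if IoU >= iou_thresh."""
    unmatched = list(ground_truth)
    tp = 0
    for d in detections:
        best = max(unmatched, key=lambda g: iou(d, g), default=None)
        if best is not None and iou(d, best) >= iou_thresh:
            tp += 1
            unmatched.remove(best)
    fp = len(detections) - tp
    fn = len(ground_truth) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

gt = [(0, 0, 10, 10), (20, 20, 30, 30)]
det = [(1, 1, 10, 10), (50, 50, 60, 60)]  # one good hit, one false alarm
print(precision_recall(det, gt))  # (0.5, 0.5)
```

Frame-level scores like these are then aggregated across a sequence; the "meaningful comparison of diverse systems" problem arises precisely because different benchmarks aggregate and threshold differently.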


This paper analyses developments in the growth and configuration of the institutional savings markets within the European Union. The paper discusses the changing socio-economic context in which investment services within the EU are being delivered. This is followed by an examination of drivers of market integration, such as the growth and consolidation of the fund management industry, the demographic and fiscal pressures for reform of pensions markets, and the process and effects of the deregulation of investment services markets. There is a review of outstanding sources of market segmentation. Projections for future growth in pensions are outlined and the implications for real estate investment assessed. It is concluded that, although numerous imponderables render reliable quantitative projections problematic, growth and restructuring of the institutional savings market is likely to increase cross-border capital flows to real estate markets.


Social Networking Sites have recently become a mainstream communications technology for many people around the world. Major IT vendors are releasing social software designed for use in a business/commercial context. These Enterprise 2.0 technologies have impressive collaboration and information sharing functionality, but so far they do not have any organizational network analysis (ONA) features that reveal patterns of connectivity within business units. This paper examines the impact of organizational network analysis techniques and social networks on organizational performance, gives an overview of current enterprise social software and, most importantly, highlights how Enterprise 2.0 can help automate organizational network analysis.
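A minimal sketch of the kind of measure an ONA feature might compute is degree centrality over an interaction log mined from an enterprise platform. The names and data below are invented for illustration:

```python
def degree_centrality(edges):
    """Rank people by the share of colleagues they interact with --
    the simplest organizational-network-analysis measure."""
    neighbours = {}
    for a, b in edges:
        neighbours.setdefault(a, set()).add(b)
        neighbours.setdefault(b, set()).add(a)
    n = len(neighbours)
    # Normalise by the maximum possible number of contacts (n - 1).
    return {p: len(nb) / (n - 1) for p, nb in neighbours.items()}

# Hypothetical interaction log (who messaged whom) from a business unit.
log = [("ana", "bo"), ("ana", "cy"), ("ana", "dee"), ("bo", "cy")]
ranks = degree_centrality(log)
print(max(ranks, key=ranks.get))  # "ana" is the best-connected person
```

Production ONA tools would add richer measures (betweenness, brokerage, clustering), but even this simple score already surfaces the connectors that patterns of connectivity are meant to reveal.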


Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can either be because they represent what the author believes a paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work examines two different methods of producing keywords and compares the outcomes across multiple strands in the timeline. The primary method takes n-grams of the source document phrases and examines their synonyms, while the secondary groups outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined, and it is concluded that the more entries a thesaurus has, the better it is likely to perform; neither the age of the thesaurus nor the size of each entry correlates with performance.
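A toy sketch of the synonym/theme idea follows, assuming a tiny hand-made thesaurus in place of the real resources used in the experiments; the actual methods, thesauri and scoring are more involved:

```python
from collections import Counter

# Hypothetical thesaurus: maps surface words to a canonical synonym
# ("theme"). The real work used three freely available thesauri.
THESAURUS = {
    "mining": "knowledge discovery", "extraction": "knowledge discovery",
    "discovery": "knowledge discovery",
    "database": "data", "databases": "data", "data": "data",
}

def theme_keyphrases(text, top_k=2):
    """Replace document words by their thesaurus synonyms and rank the
    resulting themes by frequency (a sketch of the primary method)."""
    tokens = [t.strip(".,").lower() for t in text.split()]
    themes = Counter(THESAURUS[t] for t in tokens if t in THESAURUS)
    return [theme for theme, _ in themes.most_common(top_k)]

doc = "Data mining finds patterns in databases. Knowledge discovery in data."
print(theme_keyphrases(doc))  # ['data', 'knowledge discovery']
```

Because "mining" and "discovery" collapse onto the same theme, the extracted keyphrases reflect what the document is about rather than the specific surface words it happens to use.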


The proteome of Salmonella enterica serovar Typhimurium was characterized by 2-dimensional HPLC mass spectrometry to provide a platform for subsequent proteomic investigations of low-level multiple antibiotic resistance (MAR). Bacteria (2.15 ± 0.23 × 10^10 cfu; mean ± s.d.) were harvested from liquid culture and proteins differentially fractionated, on the basis of solubility, into preparations representative of the cytosol, cell envelope and outer membrane proteins (OMPs). These preparations were digested by treatment with trypsin and peptides separated into fractions (n = 20) by strong cation exchange chromatography (SCX). Tryptic peptides in each SCX fraction were further separated by reversed-phase chromatography and detected by mass spectrometry. Peptides were assigned to proteins and consensus rank listings compiled using SEQUEST. A total of 816 ± 11 individual proteins were identified, which included 371 ± 33, 565 ± 15 and 262 ± 5 from the cytosolic, cell envelope and OMP preparations, respectively. A significant correlation was observed (r² = 0.62 ± 0.10; P < 0.0001) between consensus rank positions for duplicate cell preparations, and an average of 74 ± 5% of proteins were common to both replicates. A total of 34 outer membrane proteins were detected, 20 of these from the OMP preparation. A range of proteins (n = 20) previously associated with the mar locus in E. coli were also found, including the key MAR effectors AcrA, TolC and OmpF.


Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author-selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method's authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym-based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters' News Corpus. The findings show that authors of Reuters' news articles provide good keyphrases, but that more often than not they do not provide any keyphrases at all.
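For reference, the two best-performing baselines can be sketched together as a single TF-IDF score. The corpus below is invented and the weighting is the textbook formulation, not necessarily the exact variants evaluated in the comparison:

```python
import math
from collections import Counter

def tf_idf(corpus):
    """Score each term in each document by term frequency weighted by
    inverse document frequency (sketch of the TF and IDF baselines)."""
    docs = [doc.lower().split() for doc in corpus]
    df = Counter(term for doc in docs for term in set(doc))  # document freq.
    n = len(docs)
    scores = []
    for doc in docs:
        tf = Counter(doc)
        scores.append({t: tf[t] / len(doc) * math.log(n / df[t]) for t in tf})
    return scores

corpus = ["apple banana apple", "banana cherry", "cherry apple cherry"]
scores = tf_idf(corpus)
top = max(scores[0], key=scores[0].get)
print(top)  # 'apple': frequent in doc 0, absent from doc 1
```

Keyphrase candidates for a document are then the terms (or n-grams) with the highest such scores, which is what makes these simple counts a meaningful baseline for more elaborate methods.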


This paper reports on an exploratory study of segmentation practices of organisations with a social media presence. It investigates whether traditional segmentation approaches are still relevant in this new socio-technical environment and identifies emerging practices. The study found that social media are particularly promising in terms of targeting influencers, enabling the cost-effective delivery of personalised messages and engaging with numerous customer segments in a differentiated way. However, some problems previously identified in the segmentation literature still occur in the social media environment, such as the technical challenge of integrating databases, the preference for pragmatic rather than complex solutions and the lack of relevant analytical skills. Overall, a gap has emerged between marketing theory and practice. While segmentation is far from obsolete in the age of the social customer, it needs to adapt to reflect the characteristics of the new media.


Two types of poleward-moving plasma concentration enhancements (PMPCEs) were observed during a sequence of pulsed reconnection events, both in the morning convection cell: Type L (low density) was associated with a cusp flow channel and seems likely to have been produced by ionization associated with particle precipitation, while Type H (high density) appeared to originate from the segmentation of the tongue of ionization by the processes which produced the Type L events. As a result, the Type L and Type H PMPCEs were interspersed, producing a complex density structure. This underlines the importance of cusp flow channels as a mechanism for segmenting and structuring electron density in the cusp, and shows the necessity of differentiating between at least two classes of electron density patches.


More than thirty years ago, Wind's seminal review of research in market segmentation culminated in a research agenda for the subject area. In the intervening period, research has focused on the development of segmentation bases and models, segmentation research techniques and the identification of statistically sound solutions. Practical questions about implementation and the integration of segmentation into marketing strategy have received less attention, even though practitioners are known to struggle with the actual practice of segmentation. This special issue is motivated by this tension between theory and practice, which has shaped and continues to influence the research priorities for the field. Although many years have elapsed since Wind's original research agenda, pressing questions about effectiveness and productivity apparently remain; namely: (i) concerns about the link between segmentation and performance, and its measurement; and (ii) the notion that productivity improvements arising from segmentation are only achievable if the segmentation process is effectively implemented. These concerns formed the central themes of the call for papers for this special issue, which aims to develop our understanding of segmentation value, productivity and strategies, and of managerial issues and implementation.


Purpose – The creation of a target market strategy is integral to developing an effective business strategy. The concept of market segmentation is often cited as pivotal to establishing a target market strategy, yet all too often business-to-business marketers utilise little more than trade sectors or product groups as the basis for their groupings of customers, rather than customers' characteristics and buying behaviour. The purpose of this paper is to offer a solution for managers, focusing on customer purchasing behaviour, which evolves from the organisation's existing criteria used for grouping its customers.

Design/methodology/approach – One of the underlying reasons managers fail to embrace best-practice market segmentation is their inability to manage the transition from how target markets in an organisation are currently described to how they might look when based on customer characteristics, needs, purchasing behaviour and decision-making. Any attempt to develop market segments should recognise that organisations cannot ignore their existing customer group classification schemes and associated customer-facing operational practices, such as distribution channels and sales force allocations.

Findings – A straightforward process has been derived and applied, enabling organisations to practise market segmentation in an evolutionary manner, facilitating the transition to customer-led target market segments. This process also ensures commitment from the managers responsible for implementing the eventual segmentation scheme. The paper outlines the six stages of this process and presents an illustrative example from the agrichemicals sector, supported by other cases.

Research implications – The process presented in this paper for embarking on market segmentation focuses on customer purchasing behaviour rather than business sectors or product group classifications - which is true to the concept of market segmentation - but in a manner that participating managers find non-threatening. The resulting market segments have their basis in the organisation's existing customer classification schemes and are an iteration into which most managers readily buy.

Originality/value – Despite the size of the market segmentation literature, very few papers offer step-by-step guidance for developing customer-focused market segments in business-to-business marketing. The analytical tool for assessing customer purchasing deployed in this paper was originally created to assist in marketing planning programmes, but has since proved its worth as the foundation for creating segmentation schemes in business marketing, as described in this paper.


Despite an extensive market segmentation literature, applied academic studies which bridge segmentation theory and practice remain a priority for researchers. The need for studies which examine the segmentation implementation barriers faced by organisations is particularly acute. We explore segmentation implementation through the eyes of a European utilities business, by following its progress through a major segmentation project. The study reveals the character and impact of implementation barriers occurring at different stages in the segmentation process. By classifying the barriers, we develop implementation "rules" for practitioners which are designed to minimise their occurrence and impact. We further contribute to the literature by developing a deeper understanding of the mechanisms through which these implementation rules can be applied.


This technique paper describes a novel method for quantitatively and routinely identifying auroral breakup following substorm onset using the Time History of Events and Macroscale Interactions During Substorms (THEMIS) all-sky imagers (ASIs). Substorm onset is characterised by a brightening of the aurora that is followed by auroral poleward expansion and auroral breakup. This breakup can be identified by a sharp increase in the auroral intensity i(t) and in its time derivative i'(t). Utilising both i(t) and i'(t), we have developed an algorithm for identifying the time interval and spatial location of auroral breakup during the substorm expansion phase within the field of view of ASI data, based solely on quantifiable characteristics of the optical auroral emissions. We compare the time interval determined by the algorithm to independently identified auroral onset times from three previously published studies. In each case the interval determined by the algorithm is within error of the onset identified by the prior study. We further show the utility of the algorithm by comparing the breakup intervals it determines to an independent list of substorm onset times, and demonstrate that up to 50% of those intervals are within the uncertainty of the times in that list. The quantitative description and routine identification of an interval of auroral brightening during the substorm expansion phase provides a foundation for unbiased statistical analysis of the aurora, and a new scientific tool for identifying the processes leading to auroral substorm onset.
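A minimal sketch of the thresholding idea follows, on synthetic data and with assumed criteria (k-sigma excursions of i(t) and i'(t) above a quiet-time baseline); the published algorithm's exact criteria, baseline definition and error estimates differ:

```python
import numpy as np

def breakup_interval(intensity, times, k=2.0):
    """Return (start, end) of the contiguous span where both i(t) and
    its derivative i'(t) exceed their pre-onset baselines by k standard
    deviations. A sketch of the thresholding idea only."""
    di = np.gradient(intensity, times)
    quiet = slice(0, len(intensity) // 4)          # assume a quiet start
    hot = ((intensity > intensity[quiet].mean() + k * intensity[quiet].std())
           & (di > di[quiet].mean() + k * di[quiet].std()))
    idx = np.flatnonzero(hot)
    return (times[idx[0]], times[idx[-1]]) if idx.size else None

# Synthetic ASI light curve: flat baseline, sharp brightening at t = 60 s.
t = np.linspace(0, 100, 201)
inten = np.where(t < 60, 1.0, 1.0 + (t - 60) * 0.5)
print(breakup_interval(inten, t))  # interval starting just after t = 60
```

Requiring excursions in both i(t) and i'(t), rather than intensity alone, is what separates the sharp breakup brightening from slow background variations.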