31 results for Misalignment


Relevance:

10.00%

Publisher:

Abstract:

For facial expression recognition systems to be applicable in the real world, they need to be able to detect and track a previously unseen person's face and its facial movements accurately in realistic environments. A highly plausible solution involves performing a "dense" form of alignment, where 60-70 fiducial facial points are tracked with high accuracy. The problem is that, in practice, this type of dense alignment had so far been impossible to achieve in a generic sense, mainly due to poor reliability and robustness. Instead, many expression detection methods have opted for a "coarse" form of face alignment, followed by an application of a biologically inspired appearance descriptor such as the histogram of oriented gradients (HOG) or Gabor magnitudes. Encouragingly, recent advances in a number of dense alignment algorithms have demonstrated both high reliability and accuracy for unseen subjects [e.g., constrained local models (CLMs)]. This begs the question: aside from countering illumination variation, what do these appearance descriptors do that standard pixel representations do not? In this paper, we show that, when close to perfect alignment is obtained, there is no real benefit in employing these different appearance-based representations (under consistent illumination conditions). In fact, when misalignment does occur, we show that these appearance descriptors do work well by encoding robustness to alignment error. For this work, we compared two popular methods for dense alignment, subject-dependent active appearance models versus subject-independent CLMs, on the task of action-unit detection. These comparisons were conducted through a battery of experiments across various publicly available data sets (i.e., CK+, Pain, M3, and GEMEP-FERA). We also report our performance in the recent 2011 Facial Expression Recognition and Analysis Challenge for the subject-independent task.
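The abstract's central claim, that descriptors such as HOG buy robustness to alignment error rather than extra discriminative power, can be illustrated with a toy NumPy sketch. This is not the paper's experimental pipeline: the random test image, cell size, and bin count are arbitrary, and the descriptor is a crude HOG-like stand-in. Under a 2-pixel shift, the cell-pooled descriptor remains far more self-similar than the raw pixel representation.

```python
import numpy as np

def hog_like(img, cell=8, bins=9):
    """Toy magnitude-weighted histogram of unsigned gradient orientations,
    computed per cell and concatenated (a crude stand-in for HOG)."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)      # unsigned orientation in [0, pi)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i+cell, j:j+cell].ravel()
            a = ang[i:i+cell, j:j+cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0.0, np.pi), weights=m)
            feats.append(hist)
    f = np.concatenate(feats)
    return f / (np.linalg.norm(f) + 1e-12)

def centred_cosine(u, v):
    u = u - u.mean()
    v = v - v.mean()
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, 2, axis=1)                # simulate a 2-pixel misalignment

pix_sim = centred_cosine(img.ravel(), shifted.ravel())
hog_sim = centred_cosine(hog_like(img), hog_like(shifted))
print(pix_sim, hog_sim)  # descriptor similarity stays much higher than pixel similarity
```

Averaging orientations over 8×8 cells is exactly the mechanism the paper credits: small positional errors move samples between neighbouring pixels but mostly not between cells.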

Relevance:

10.00%

Publisher:

Abstract:

Organizations adopt a Supply Chain Management System (SCMS) expecting benefits to the organization and its functions. However, organizations face mounting challenges in realizing benefits through SCMS. Studies suggest growing dissatisfaction among client organizations due to an increasing gap between expected and realized SCMS benefits. Further, echoing Enterprise System studies such as Seddon et al. (2010), SCMS benefits are expected to flow to the organization throughout its lifecycle rather than being realized all at once. This research therefore proposes a lifecycle-wide understanding of SCMS benefits and their realization, from which a benefit expectation management framework is derived to attain the full potential of an SCMS. The primary research question of this study is: How can client organizations better manage their benefit expectations of SCM systems? The specific research goals of the current study are: (1) to better understand the misalignment between received and expected benefits of SCM systems; (2) to identify the key factors influencing SCM system expectations and to develop a framework to manage SCMS benefits; (3) to explore how organizational satisfaction is influenced by the lack of SCMS benefit confirmation; and (4) to explore how to improve the realization of SCM system benefits. Expectation-Confirmation Theory (ECT) provides the theoretical underpinning for this study. ECT has been widely used in the consumer behavior literature to study customer satisfaction, post-purchase behavior and service marketing in general. Recently, ECT has been extended into Information Systems (IS) research, focusing on individual user satisfaction and IS continuance. However, only a handful of studies have employed ECT to study organizational satisfaction with large-scale IS.
The current study will enrich this research stream by extending ECT to organizational-level analysis and by verifying the preliminary findings of relevant works by Staples et al. (2002), Nevo and Chan (2007) and Nevo and Wade (2007). Moreover, this study goes further by operationalizing the constructs of ECT in the context of SCMS. The empirical findings commence with a content analysis of 41 vendor and academic reports, yielding sixty expected benefits of SCMS. These expected benefits are then compared with the benefits realized at a case organization in the Fast Moving Consumer Goods industry sector that had implemented an SAP Supply Chain Management System seven years earlier. The study develops an SCMS Benefit Expectation Management (SCMS-BEM) Framework. The comparison of benefit expectations and confirmations highlights that, while certain benefits are realized early in the lifecycle, other benefits can take almost a decade to realize. The study also analyzes and discusses how the SCMS-BEM Framework shapes the application of ECT to SCMS. It is recommended that, when establishing their expectations of the SCMS, clients remember that confirmation of these expectations has a long lifecycle, as shown by the different time periods in the SCMS-BEM Framework. Moreover, the SCMS-BEM Framework will allow organizations to maintain high levels of satisfaction through careful management and confirmation of expectations appropriate to each lifecycle phase. In addition, the study reveals that different stakeholder groups have different expectations of the same SCMS. This multiple-stakeholder perspective has significant implications for the application of ECT in the SCMS context. When forming expectations of the SCMS, the collection of organizational benefits of SCMS should represent the perceptions of all stakeholder groups.
The same mechanism should be employed when measuring received SCMS benefits. Moreover, for SCMS there exists interdependence of satisfaction among the various stakeholders. The satisfaction of decision-makers or authorized staff is not only driven by their own expectation confirmation level; it is also influenced by the confirmation level of other stakeholders’ expectations in the organization. Satisfaction of any one particular stakeholder group cannot reflect the true satisfaction of the client organization. Furthermore, it is inferred from the SCMS-BEM Framework that organizations should emphasize the viewpoints of operational and management staff when evaluating the benefits of SCMS in the short and medium term, while paying more attention to the perspectives of strategic staff when evaluating the performance of the SCMS in the long term.

Relevance:

10.00%

Publisher:

Abstract:

Many methods exist at the moment for deformable face fitting. A drawback to nearly all of these approaches is that (i) the landmark positions they produce are noisy, and (ii) the noise is biased across frames (i.e., the misalignment is toward common directions across all frames). In this paper we propose a grouped ℓ1-norm anchored method for simultaneously aligning an ensemble of deformable face images stemming from the same subject, given noisy heterogeneous landmark estimates. Impressive improvement and refinement of alignment performance are obtained using very weak initializations as "anchors".

Relevance:

10.00%

Publisher:

Abstract:

Different types of HTS joints of Bi-2212/Ag tapes and laminates, fabricated by dip-coating and partial-melt processes, have been investigated. All joints are prepared from green single and laminated tapes according to the scheme coating-joining-processing. The heat-treated tapes have critical currents (Ic) between 7 and 27 A, depending on tape thickness and the number of Bi-2212 ceramic layers in the laminated tapes. It is found that the current transport properties of the joints depend on the type of laminate, the joint configuration and the joint treatment. Ic losses in joints of Bi-2212 tapes and laminates are attributed to defects in their structure, such as pores, secondary phases and misalignment of Bi-2212 grains near the Ag edges. By optimizing the joint configuration, current transmission of up to 100% is achieved for both single and laminated tapes.

Relevance:

10.00%

Publisher:

Abstract:

In the field of face recognition, Sparse Representation (SR) has received considerable attention during the past few years. Most of the relevant literature focuses on holistic descriptors in closed-set identification applications. The underlying assumption in SR-based methods is that each class in the gallery has sufficient samples and that the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the more challenging face verification scenario, where an algorithm is required to determine whether two faces (where one or both have not been seen before) belong to the same person. In this paper, we first discuss why previous attempts with SR might not be applicable to verification problems. We then propose an alternative approach to face verification via SR. Specifically, we propose to use explicit SR encoding on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which are then concatenated to form an overall face descriptor. Due to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, we evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN), and an implicit probabilistic technique based on Gaussian Mixture Models. Thorough experiments on the AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the proposed local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, in both verification and closed-set identification problems. The experiments also show that l1-minimisation based encoding has a considerably higher computational cost than the other techniques, but leads to higher recognition rates.
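The patch-encode-pool-concatenate pipeline can be sketched compactly. The sketch below is a simplified illustration, not the paper's implementation: it substitutes greedy matching pursuit for l1-minimisation, uses a random dictionary in place of a learned one, and picks arbitrary patch and grid sizes.

```python
import numpy as np

def sparse_code(x, D, k=3):
    """Greedy matching pursuit: approximate x with k atoms of dictionary D
    (a cheap stand-in for l1-minimisation)."""
    r, code = x.astype(float).copy(), np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ r)))
        a = float(D[:, j] @ r)
        code[j] += a
        r = r - a * D[:, j]
    return code

def face_descriptor(img, D, patch=8, grid=2):
    """Encode patches, average-pool the codes inside each grid region,
    then concatenate the region descriptors into one face descriptor."""
    h, w = img.shape
    rh, rw = h // grid, w // grid
    regions = []
    for gi in range(grid):
        for gj in range(grid):
            codes = []
            for i in range(gi * rh, (gi + 1) * rh - patch + 1, patch):
                for j in range(gj * rw, (gj + 1) * rw - patch + 1, patch):
                    p = img[i:i+patch, j:j+patch].ravel().astype(float)
                    codes.append(sparse_code(p - p.mean(), D))
            regions.append(np.mean(codes, axis=0))  # pooling discards patch positions
    return np.concatenate(regions)

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 32))
D /= np.linalg.norm(D, axis=0)                    # unit-norm random atoms
desc = face_descriptor(rng.random((32, 32)), D)
print(desc.shape)                                  # 4 regions x 32-dim codes
```

Averaging the codes within each region is the deliberate information loss the abstract describes: a patch contributes the same code wherever it sits inside its region, which is why small misalignments barely perturb the descriptor.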

Relevance:

10.00%

Publisher:

Abstract:

In an attempt to deal with the potential problems presented by existing information systems, a shift towards the implementation of ERP packages has been witnessed. The common view, particularly the one espoused by vendors, is that ERP packages are most successfully implemented when the standard model is adopted. Yet, despite this, customisation activity still occurs, reportedly due to misalignment between the functionality of the package and the requirements of those in the implementing organisation. However, it is recognised that systems development and organisational decision-making are activities influenced by the perspectives of the various groups and individuals involved in the process. Thus, as customisation can be seen as part of systems development, and has to be decided upon, it should be thought about in the same way. In this study, two ERP projects are used to examine different reasons why customisation might take place. These reasons are then built upon through reference to the ERP and more general packaged software literature. The study suggests that whilst a common reason for customising ERP packages might be concerned with functionality misfits, it is important to look further into why these may occur, as there are clearly other reasons for customisation stemming from the multiplicity of social groups involved in the process.

Relevance:

10.00%

Publisher:

Abstract:

The parabolic trough concentrator collector is the most mature, proven and widespread technology for large-scale exploitation of solar energy in middle-temperature applications. Assessment of the opportunities and possibilities of the collector system relies on its optical performance. A reliable Monte Carlo ray-tracing model of a parabolic trough collector is developed using Zemax software. The optical performance of an ideal collector depends on the solar spectral distribution and sunshape, and on the spectral selectivity of the associated components. Therefore, each step of the model, including the spectral distribution of solar energy, trough reflectance, the glazing anti-reflection coating and the absorber selective coating, is explained and verified. The radiation flux distribution around the receiver and the optical efficiency, the two basic outputs of the optical simulation, are calculated using the model and verified against a widely accepted analytical profile and measured values, respectively. Very good agreement is obtained. Further investigations analyse the characteristics of the radiation distribution around the receiver tube at different insolation levels, envelope conditions and receiver selective coatings, as well as the impact of light scattered from the receiver surface on the efficiency. The model can also analyse optical performance under variable sunshape, tracking error and collector imperfections, including absorber misalignment with the focal line and de-focusing of the absorber, for different rim angles and geometric concentrations. The current optical model can play a significant role in understanding the optical aspects of a trough collector and can be employed to extract useful information on optical performance.
In the long run, this optical model will pave the way for the construction of low-cost standalone photovoltaic and thermal hybrid collectors in Australia for small-scale domestic hot water and electricity production.
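As a far cruder illustration than the Zemax model described above, a 2-D Monte Carlo ray trace of an ideal parabola already shows how absorber misalignment with the focal line degrades the intercept factor. All geometry values below (focal length, aperture, tube radius, offset) are arbitrary, and the sketch assumes perfectly vertical rays and a perfect mirror.

```python
import numpy as np

def intercept_factor(f=1.0, half_aperture=2.0, tube_r=0.02,
                     absorber_offset=(0.0, 0.0), n=100_000, seed=0):
    """Fraction of vertical rays on the aperture whose reflection off the
    parabola y = x^2/(4f) passes within tube_r of the absorber centre."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-half_aperture, half_aperture, n)
    P = np.stack([x, x**2 / (4.0 * f)], axis=1)   # hit points on the mirror
    F = np.array([0.0, f])                        # ideal focal point
    d = F - P                                     # vertical rays reflect through F
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    C = F + np.asarray(absorber_offset)           # possibly misaligned absorber centre
    v = C - P
    dist = np.abs(v[:, 0] * d[:, 1] - v[:, 1] * d[:, 0])  # point-to-line distance
    return float(np.mean(dist <= tube_r))

aligned = intercept_factor()
misaligned = intercept_factor(absorber_offset=(0.05, 0.0))
print(aligned, misaligned)   # 1.0 when aligned; well below 1 for a lateral offset
```

With perfect geometry every reflected ray passes exactly through the focus, so the intercept factor is 1; a lateral offset larger than the tube radius lets only the steeply inclined rim rays still strike the tube.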

Relevance:

10.00%

Publisher:

Abstract:

Introduction The consistency of measuring small field output factors is greatly increased by reporting the measured dosimetric field size of each factor, as opposed to simply stating the nominal field size [1]; this, however, requires the measurement of cross-axis profiles in a water tank, which makes output factor measurements time consuming. This project establishes the field size above which the accuracy of output factors is not affected by the use of potentially inaccurate nominal field sizes, which we believe provides a practical working definition of a ‘small’ field. The physical components of the radiation beam that contribute to the rapid change in output factor at small field sizes are examined in detail. The physical interaction that dominates the rapid dose reduction is quantified, leading to a theoretical definition of a ‘small’ field. Methods Current recommendations suggest that radiation collimation systems and isocentre-defining lasers should both be calibrated to permit a maximum positioning uncertainty of 1 mm [2]. The proposed practical definition of a small field is as follows: if the output factor changes by ±1.0 % given a change in either field size or detector position of up to ±1 mm, then the field should be considered small. Monte Carlo modelling was used to simulate output factors of a 6 MV photon beam for square fields with side lengths from 4.0 to 20.0 mm in 1.0 mm increments. The dose was scored in a 0.5 mm wide and 2.0 mm deep cylindrical volume of water within a cubic water phantom, at a depth of 5 cm and an SSD of 95 cm. The maximum difference due to a collimator error of ±1 mm was found by comparing the output factors of adjacent field sizes. The output factor simulations were repeated 1 mm off-axis to quantify the effect of detector misalignment. Further simulations separated the total output factor into the collimator scatter factor and the phantom scatter factor. 
The collimator scatter factor was further separated into primary source occlusion effects and ‘traditional’ effects (a combination of flattening filter and jaw scatter, etc.). The phantom scatter was separated into photon scatter and electronic disequilibrium. Each of these factors was plotted as a function of field size in order to quantify how each contributes to the change in output factor at small field sizes. Results The use of our practical definition resulted in field sizes of 15 mm or less being characterised as ‘small’. The change in field size had a greater effect than detector misalignment. For field sizes of 12 mm or less, electronic disequilibrium was found to cause the largest change in dose on the central axis (d = 5 cm). Source occlusion also caused a large change in output factor for field sizes less than 8 mm. Discussion and conclusions The measurement of cross-axis profiles is only required for output factor measurements at field sizes of 15 mm or less (for a 6 MV beam on a Varian iX linear accelerator). This is expected to depend on linear accelerator spot size and photon energy. While some electronic disequilibrium was shown to occur at field sizes as large as 30 mm (the ‘traditional’ definition of a small field [3]), it does not cause a greater change than photon scatter until a field size of 12 mm, at which point it becomes by far the most dominant effect.
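The practical rule above (flag a field as ‘small’ when a ±1 mm field-size error shifts the output factor by more than 1 %) is straightforward to apply to a table of output factors. The sketch below uses an invented, loosely 6 MV-like output factor curve, not the paper's Monte Carlo data, and checks only the field-size criterion, not the detector-position one.

```python
import numpy as np

# invented output factor curve, loosely 6 MV-like (NOT the paper's data)
side = np.arange(4.0, 21.0, 1.0)                # square field side lengths, mm
of = 1.0 - 0.75 * np.exp(-(side - 3.0) / 4.0)   # toy monotone output factors

def is_small(i):
    """Flag field i as 'small' if moving one 1 mm step in field size
    changes the output factor by more than 1% (field-size criterion only)."""
    lo = abs(of[i] - of[i - 1]) / of[i] if i > 0 else np.inf
    hi = abs(of[i + 1] - of[i]) / of[i] if i < len(of) - 1 else 0.0
    return max(lo, hi) > 0.01

small = [float(s) for i, s in enumerate(side) if is_small(i)]
print(max(small))   # largest side length still classified as 'small'
```

On this toy curve the threshold falls at a side length of 15 mm; on real data the cut-off would of course follow the measured or simulated output factors, and the detector-position check would be applied alongside the field-size one.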

Relevance:

10.00%

Publisher:

Abstract:

The literature around Library 2.0 remains largely theoretical with few empirical studies and is particularly limited in developing countries such as Indonesia. This study addresses this gap and aims to provide information about the current state of knowledge on Indonesian LIS professionals’ understanding of Library 2.0. The researchers used qualitative and quantitative approaches for this study, asking thirteen closed- and open-ended questions in an online survey. The researchers used descriptive and in vivo coding to analyze the responses. Through their analysis, they identified three themes: technology, interactivity, and awareness of Library 2.0. Respondents demonstrated awareness of Library 2.0 and a basic understanding of the roles of interactivity and technology in libraries. However, overreliance on the technology used in libraries to conceptualize Library 2.0, without an emphasis on its core characteristics and principles, could lead to the misalignment of limited resources. The study results will potentially strengthen the research base for Library 2.0 practice, as well as inform the LIS curriculum in Indonesia so as to develop practitioners who are able to adapt to users’ changing needs and expectations. It is expected that the preliminary data of this study could be used to design a much larger and more complex future research project in this area.

Relevance:

10.00%

Publisher:

Abstract:

Legacy information systems evolved incrementally in response to changes in business strategy and information technology. Organizations are now being forced to change much more radically and quickly than previously, and this change places new demands on information systems. Legacy information systems are usually considered from a technical perspective, addressing issues such as age, complexity, maintainability, design and technology. We wish to demonstrate that the business dimension of legacy information systems, represented by the organisation structure, business processes and procedures that are bound up in the design and operation of the existing IT systems, is also significant. This paper identifies the important role of legacy information systems in the formation of new strategies. We show that the move away from a stable to an unstable business environment accelerates the rate of change. Furthermore, the gap between what the legacy information systems can deliver and the strategic vision of the organization widens when the legacy information systems are unable to adapt to meet the new requirements. An analysis of fifteen case studies provides evidence that legacy information systems include business and technical dimensions and that the systems can present problems when there is a misalignment between the strategic vision of the business, the IT legacy and the old business model embodied in the legacy.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, research aimed at identifying and relating the antecedents and consequences of diffusing organizational practices/ideas has turned its attention to debating the international adoption and implementation of the Anglo-American model of corporate governance, i.e., a shareholder-value orientation (SVO). While financial economists characterize the adoption of an SVO as necessary and performance-enhancing, behavioral scientists have disputed such claims, invoking institutional contingencies in the appropriateness of an SVO. Our study seeks to provide some resolution to the debate by developing an overarching socio-political perspective that links the antecedents and consequences of the adoption of the contested practice of SVO. We test our framework using extensive longitudinal data from 1992 to 2006 on the largest listed corporations in the Netherlands, and we find a negative relationship between SVO adoption and subsequent firm performance, although this effect is attenuated when accompanied by greater SVO alignment among major owners and a firm’s visible commitment to an SVO. This study extends prior research on the diffusion of contested organizational practices that has taken a socio-political perspective by offering an original contingency perspective that addresses how and why the misaligned preferences of corporate owners will affect (i) a company’s inclination to espouse an SVO, and (ii) the performance consequences of such misalignment. This study suggests that when board members are considering the adoption of new ideas/practices (e.g., SVO), they should consider the contextual fitness of the idea/practice with the firm’s owners and their interests.

Relevance:

10.00%

Publisher:

Abstract:

Achieving high efficiency with improved power transfer range and misalignment tolerance is the major design challenge in realizing Wireless Power Transfer (WPT) systems for industrial applications. Resonant coils must be carefully designed to achieve the highest possible system performance by fully utilizing the available space. High quality factor and enhanced electromagnetic coupling are the key indices that determine system performance. In this paper, design parameter extraction and quality factor optimization of multi-layered helical coils are presented using finite element analysis (FEA) simulations. In addition, a novel Toroidal Shaped Spiral (TSS) coil is proposed to increase power transfer range and misalignment tolerance. The proposed shapes and recommendations can be used to design high-efficiency WPT resonators in a limited space.
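The role of the quality factor and coupling as "key indices" can be made concrete with the standard optimal-load efficiency of a two-coil resonant link, eta = F / (1 + sqrt(1 + F))^2 with figure of merit F = k^2 * Q1 * Q2. The coil values below are hypothetical, not those of the paper's FEA-optimized designs; the falling coupling coefficient k stands in for growing coil misalignment.

```python
import math

def quality_factor(inductance_h, resistance_ohm, freq_hz):
    """Unloaded coil quality factor Q = omega * L / R_ac."""
    return 2.0 * math.pi * freq_hz * inductance_h / resistance_ohm

def max_link_efficiency(k, q1, q2):
    """Optimal-load efficiency of a two-coil resonant link:
    eta = F / (1 + sqrt(1 + F))^2, with figure of merit F = k^2 * Q1 * Q2."""
    fom = k * k * q1 * q2
    return fom / (1.0 + math.sqrt(1.0 + fom)) ** 2

q = quality_factor(24e-6, 0.15, 200e3)   # hypothetical helical coil at 200 kHz
for k in (0.20, 0.10, 0.05):             # coupling falls as the coils misalign
    print(k, round(max_link_efficiency(k, q, q), 3))
```

Because efficiency depends on k and Q only through the product k²Q1Q2, raising the quality factor directly buys back the efficiency lost to misalignment-induced drops in coupling, which is why the paper optimizes both together.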

Relevance:

10.00%

Publisher:

Abstract:

Budgeting is an important means of controlling one's finances and reducing debt. This paper outlines our work towards designing more user-centred technology for individual and household budgeting. Based on an ethnographically informed study with 15 participants, we highlight a misalignment between people's actual budgeting practices and those supported by off-the-shelf budgeting aids. In addressing this misalignment we outline three tenets that may be incorporated into future work in this area: (1) catering for the different phases of engagement with technology; (2) catering for the practices of hiding and limiting access to money; and (3) integrating materiality into technical solutions.

Relevance:

10.00%

Publisher:

Abstract:

In the field of face recognition, sparse representation (SR) has received considerable attention during the past few years, with a focus on holistic descriptors in closed-set identification applications. The underlying assumption in such SR-based methods is that each class in the gallery has sufficient samples and the query lies on the subspace spanned by the gallery of the same class. Unfortunately, such an assumption is easily violated in the face verification scenario, where the task is to determine if two faces (where one or both have not been seen before) belong to the same person. In this study, the authors propose an alternative approach to SR-based face verification, where SR encoding is performed on local image patches rather than the entire face. The obtained sparse signals are pooled via averaging to form multiple region descriptors, which then form an overall face descriptor. Owing to the deliberate loss of spatial relations within each region (caused by averaging), the resulting descriptor is robust to misalignment and various image deformations. Within the proposed framework, they evaluate several SR encoding techniques: l1-minimisation, Sparse Autoencoder Neural Network (SANN) and an implicit probabilistic technique based on Gaussian mixture models. Thorough experiments on AR, FERET, exYaleB, BANCA and ChokePoint datasets show that the local SR approach obtains considerably better and more robust performance than several previous state-of-the-art holistic SR methods, on both the traditional closed-set identification task and the more applicable face verification task. The experiments also show that l1-minimisation-based encoding has a considerably higher computational cost when compared with SANN-based and probabilistic encoding, but leads to higher recognition rates.

Relevance:

10.00%

Publisher:

Abstract:

The literature around Library 2.0 remains largely theoretical with few empirical studies and is particularly limited in developing countries such as Indonesia. This study addresses this gap and aims to provide information about the current state of knowledge on Indonesian LIS professionals’ understanding of Library 2.0. The researchers used qualitative and quantitative approaches for this study, asking thirteen closed- and open-ended questions in an online survey. The researchers used descriptive and in vivo coding to analyze the responses. Through their analysis, they identified three themes: technology, interactivity, and awareness of Library 2.0. Respondents demonstrated awareness of Library 2.0 and a basic understanding of the roles of interactivity and technology in libraries. However, overreliance on technology used in libraries to conceptualize Library 2.0 without an emphasis on its core characteristics and principles could lead to the misalignment of limited resources. The study results will potentially strengthen the research base for Library 2.0 practice as well as inform LIS curriculum in Indonesia so as to develop practitioners who are able to adapt to users’ changing needs and expectations. It is expected that the preliminary data from this study could be used to design a much larger and more complex future research project in this area.