829 results for outsourcing computation
Abstract:
This paper discusses the principal domains of auto- and cross-trispectra. It is shown that the cumulant and moment based trispectra are identical except on certain planes in trifrequency space. If these planes are avoided, their principal domains can be derived by considering the regions of symmetry of the fourth order spectral moment. The fourth order averaged periodogram will then serve as an estimate for both cumulant and moment trispectra. Statistics of estimates of normalised trispectra or tricoherence are also discussed.
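The moment/cumulant distinction the abstract turns on follows from the standard fourth-order relation for zero-mean random variables (notation assumed here, not taken from the paper). Writing $X_i = X(f_i)$ for the Fourier components:

```latex
\operatorname{cum}\{X_1, X_2, X_3, X_4\}
  = E[X_1 X_2 X_3 X_4]
  - E[X_1 X_2]\,E[X_3 X_4]
  - E[X_1 X_3]\,E[X_2 X_4]
  - E[X_1 X_4]\,E[X_2 X_3]
```

The subtracted second-order products are non-zero only when the corresponding frequency pairs sum to zero, so the moment and cumulant trispectra can differ only on those planes (e.g. $f_1 + f_2 = 0$) in trifrequency space, which is why avoiding them makes the two identical.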
Abstract:
Purpose – To determine whether or not clockspeed is an important variable in outsourcing strategies throughout the development of radical innovations. Design/methodology/approach – An internet-based survey of manufacturing firms from all over the world. Findings – An industry's clockspeed does not play a significant role in the success or failure of a particular outsourcing strategy for a radical innovation. Research limitations/implications – Conclusions from earlier research in this area are not necessarily industry-specific. Practical implications – Lessons learned via previous investigations about the computer industry need not be confined to that sector. Vertical integration may be a more robust outsourcing strategy when developing a radical innovation in industries of all clockspeeds. Originality/value – Previous research efforts in this field focused on a single technology jump, but this approach may have overlooked a potentially important variable: industry clockspeed. Thus, this investigation explores whether clockspeed is an important factor.
Abstract:
Significant empirical data from the fields of management and business strategy suggest that it is a good idea for a company to make in-house the components and processes underpinning a new technology. Other evidence suggests exactly the opposite, saying that firms would be better off buying components and processes from outside suppliers. One possible explanation for this lack of convergence is that earlier research in this area has overlooked two important aspects of the problem: reputation and trust. To gain insight into how these variables may impact make-buy decisions throughout the innovation process, the Sporas algorithm for measuring reputation was added to an existing agent-based model of how firms interact with each other throughout the development of new technologies. The model's results suggest that reputation and trust do not play a significant role in the long-term fortunes of an individual firm as it contends with technological change in the marketplace. Accordingly, this model serves as a cue for management researchers to investigate more thoroughly the temporal limitations and contingencies that determine how the trust between firms may affect the R&D process.
Practical improvements to simultaneous computation of multi-view geometry and radial lens distortion
Abstract:
This paper discusses practical issues related to the use of the division model for lens distortion in multi-view geometry computation. A data normalisation strategy is presented, which has been absent from previous discussions on the topic. The convergence properties of the Rectangular Quadric Eigenvalue Problem solution for computing division model distortion are examined. It is shown that the existing method can require more than 1000 iterations when dealing with severe distortion. A method is presented for accelerating convergence to less than 10 iterations for any amount of distortion. The new method is shown to produce equivalent or better results than the existing method with up to two orders of magnitude reduction in iterations. Through detailed simulation it is found that the number of data points used to compute geometry and lens distortion has a strong influence on convergence speed and solution accuracy. It is recommended that more than the minimal number of data points be used when computing geometry using a robust estimator such as RANSAC. Adding two to four extra samples improves the convergence rate and accuracy sufficiently to compensate for the increased number of samples required by the RANSAC process.
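As a concrete reference for the model under discussion, the sketch below implements the one-parameter division model and a fixed-point inversion; it illustrates why iteration counts matter for severe distortion, but the function names, the fixed-point scheme, and the test values are assumptions for illustration, not the paper's RQEP method.

```python
import numpy as np

def undistort_division(points, lam):
    """Map distorted image points to undistorted points under the
    one-parameter division model: x_u = x_d / (1 + lam * r_d^2),
    with r_d the distance from the distortion centre (origin here)."""
    r2 = np.sum(points ** 2, axis=1, keepdims=True)
    return points / (1.0 + lam * r2)

def distort_division(points, lam, iters=20):
    """Invert the division model by fixed-point iteration: find x_d
    satisfying x_d = x_u * (1 + lam * |x_d|^2). Iterative schemes of
    this kind slow down as distortion becomes severe, which is the
    convergence issue the paper addresses for its own solver."""
    x_d = points.copy()
    for _ in range(iters):
        r2 = np.sum(x_d ** 2, axis=1, keepdims=True)
        x_d = points * (1.0 + lam * r2)
    return x_d

# Round trip: distorting and then undistorting should recover the input.
pts = np.array([[0.2, 0.1], [0.5, -0.4]])
lam = -0.3  # moderate barrel distortion (sign and magnitude illustrative)
u = undistort_division(distort_division(pts, lam), lam)
```

With lam = 0 both functions reduce to the identity, which is a quick sanity check on the model.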
Abstract:
Modelling an environmental process involves creating a model structure and parameterising the model with appropriate values to accurately represent the process. Determining accurate parameter values for environmental systems can be challenging. Existing methods for parameter estimation typically make assumptions regarding the form of the likelihood, and will often ignore any uncertainty around estimated values. This can be problematic, particularly in complex problems where likelihoods may be intractable. In this paper we demonstrate an Approximate Bayesian Computation (ABC) method for the estimation of parameters of a stochastic cellular automaton (CA). We use as an example a CA constructed to simulate a range expansion such as might occur after a biological invasion, making parameter estimates using only count data such as could be gathered from field observations. We demonstrate that ABC is a highly useful method for parameter estimation, giving accurate estimates of parameters that are important for the management of invasive species, such as the intrinsic rate of increase and the point in a landscape where a species has invaded. We also show that the method is capable of estimating the probability of long-distance dispersal, a characteristic of biological invasions that is very influential in determining spread rates but has until now proved difficult to estimate accurately.
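The ABC idea described above can be sketched as a minimal rejection sampler. The toy growth simulator below stands in for the stochastic CA, and the prior, summary statistic, and tolerance are illustrative assumptions, not the paper's choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_counts(r, n_steps=10, n0=20):
    """Toy stochastic growth model standing in for the CA: Poisson
    counts whose mean grows at intrinsic rate r each step."""
    counts, n = [], n0
    for _ in range(n_steps):
        n = rng.poisson(n * np.exp(r))
        counts.append(n)
    return np.array(counts)

# "Observed" field counts, generated here with a known rate r_true = 0.3
observed = simulate_counts(0.3)

def abc_rejection(obs, n_sims=5000, tol=0.1):
    """ABC rejection: draw r from the prior, simulate count data, and
    keep draws whose summary statistic (log total count) lies within
    tol of the observed value -- no likelihood evaluation needed."""
    s_obs = np.log1p(obs.sum())
    accepted = []
    for _ in range(n_sims):
        r = rng.uniform(0.0, 1.0)  # flat prior on the intrinsic rate
        if abs(np.log1p(simulate_counts(r).sum()) - s_obs) < tol:
            accepted.append(r)
    return np.array(accepted)

posterior = abc_rejection(observed)  # samples approximating p(r | data)
```

The accepted draws concentrate around the true rate; unlike a point estimate, their spread directly quantifies the parameter uncertainty the abstract says is often ignored.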
Abstract:
In this paper we analyse the outsourcing of accounting services. The extent to which firms are currently outsourcing, or considering outsourcing, such services, and the motivations and barriers associated with outsourcing are identified. Empirical data from a random sample of accounting firms are used in this analysis. The data indicate that the majority of accounting firms are either currently outsourcing or considering outsourcing, and that they expect the volume of outsourced services to increase. In contrast to the scholarly literature advocating labor arbitrage as the primary driver for organizations choosing to outsource, this study found that the main factors underpinning the decision to outsource were expediting service delivery to clients and enabling the firm to focus on its core competencies.
Abstract:
In this paper, we apply a simulation-based approach for estimating transmission rates of nosocomial pathogens. In particular, the objective is to infer the transmission rate between colonised health-care practitioners and uncolonised patients (and vice versa) solely from routinely collected incidence data. The method, using approximate Bayesian computation, is substantially less computationally intensive and easier to implement than the likelihood-based approaches we refer to here. We find that, by replacing the likelihood with a comparison of an efficient summary statistic between observed and simulated data, little is lost in the precision of the estimated transmission rates. Furthermore, we investigate the impact of incorporating uncertainty in previously fixed parameters on the precision of the estimated transmission rates.
Abstract:
Patient satisfaction with foodservices is multidimensional. It is well recognised that food and other aspects of foodservice delivery are important elements of patients' overall perception of the hospital experience. This study aimed to determine whether menu changes in 2008 at an acute private hospital, considered negative by the dietetic staff, would affect patient satisfaction with the foodservice. Changes to the menu, secondary to the refurbishment of the foodservice facilities, decreased the number of choices at breakfast from six to four and altered the dessert menu to include a larger proportion of commercially produced products. The Acute Care Hospital Foodservice Patient Satisfaction Questionnaire (ACHFPSQ) was used to assess patient satisfaction with the menu changes, as it has proven accuracy and reliability in measuring patient satisfaction. Results of the survey (n=306) were compared with data from previous ACHFPSQ surveys conducted annually since 2003. Data analysed included overall foodservice satisfaction and four dimensions of foodservice satisfaction: food quality, meal service quality, staff/service issues and the physical environment. Satisfaction targets were set at 4 (scale 1–5) for each foodservice dimension. Analysis showed that despite the changes to the menu, overall foodservice satisfaction rated highly, with a score of 4.3. Eighty-six percent of patients rated the foodservice as either 'very good' or 'good'. The four foodservice dimensions were rated highly (4.2–4.8). Findings were consistent with previous survey results, demonstrating a high level of patient satisfaction across all dimensions of the foodservice, despite the changes to the menu. The annual ACHFPSQ proved valuable in addressing this practice question.
Abstract:
Appearance-based localization is increasingly used for loop closure detection in metric SLAM systems. Since it relies only upon the appearance-based similarity between images from two locations, it can perform loop closure regardless of accumulated metric error. However, the computation time and memory requirements of current appearance-based methods scale linearly not only with the size of the environment but also with the operation time of the platform. These properties impose severe restrictions on long-term autonomy for mobile robots, as loop closure performance will inevitably degrade with increased operation time. We present a set of improvements to the appearance-based SLAM algorithm CAT-SLAM to constrain computation scaling and memory usage with minimal degradation in performance over time. The appearance-based comparison stage is accelerated by exploiting properties of the particle observation update, and nodes in the continuous trajectory map are removed according to minimal information loss criteria. We demonstrate constant time and space loop closure detection in a large urban environment with recall performance exceeding FAB-MAP by a factor of 3 at 100% precision, and investigate the minimum computational and memory requirements for maintaining mapping performance.
Abstract:
Premature convergence to locally optimal solutions is one of the main difficulties when using evolutionary algorithms in real-world optimization problems. To prevent premature convergence and the degeneration phenomenon, this paper proposes a new optimization computation approach, the human-simulated immune evolutionary algorithm (HSIEA). Considering that the premature convergence problem is due to a lack of diversity in the population, the HSIEA employs the clonal selection principle from artificial immune system theory to preserve the diversity of solutions during the search process. Mathematical descriptions and procedures of the HSIEA are given, and four new evolutionary operators are formulated: clone, variation, recombination, and selection. Two benchmark optimization functions are investigated to demonstrate the effectiveness of the proposed HSIEA.
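A generic clonal-selection loop of the kind the abstract describes (clone, variation, selection, plus random newcomers for diversity) can be sketched as follows on a standard benchmark function. This is a simplified illustration under assumed parameters; it omits the paper's recombination operator and other HSIEA specifics.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    """Benchmark objective to minimise (global optimum 0 at the origin)."""
    return np.sum(x ** 2)

def clonal_selection(obj, dim=5, pop=20, n_clones=5, gens=200,
                     bounds=(-5.0, 5.0)):
    lo, hi = bounds
    P = rng.uniform(lo, hi, size=(pop, dim))
    for _ in range(gens):
        fit = np.array([obj(x) for x in P])
        P = P[np.argsort(fit)]  # selection: best candidates first
        clones = []
        for rank, x in enumerate(P):
            # Variation by hypermutation: better-ranked solutions
            # receive smaller perturbations.
            scale = 0.5 * (rank + 1) / pop
            for _ in range(n_clones):
                clones.append(np.clip(x + rng.normal(0, scale, dim), lo, hi))
        C = np.array(clones)
        cfit = np.array([obj(x) for x in C])
        best = C[np.argsort(cfit)][:pop - 2]
        # Random newcomers preserve diversity against premature convergence.
        fresh = rng.uniform(lo, hi, size=(2, dim))
        P = np.vstack([best, fresh])
    fit = np.array([obj(x) for x in P])
    return P[np.argmin(fit)], fit.min()

x_best, f_best = clonal_selection(sphere)
```

The rank-dependent mutation scale is the clonal-selection signature: intensive local search around good solutions, broad exploration elsewhere.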
Abstract:
This paper presents a novel evolutionary computation approach to three-dimensional path planning for unmanned aerial vehicles (UAVs) with tactical and kinematic constraints. A genetic algorithm (GA) is modified and extended for path planning. Two GAs are seeded at the initial and final positions with a common objective: to minimise their distance apart under the given UAV constraints. This is accomplished by the synchronous optimisation of subsequent control vectors. The proposed approach is called the synchronous genetic algorithm (SGA). The sequence of control vectors generated by the SGA constitutes a near-optimal path plan. The resulting path plan exhibits no discontinuity when transitioning from curved to straight trajectories. Experiments show that the paths generated by the SGA are within 2% of the optimal solution. Such a path planner, when implemented on a hardware accelerator such as a field-programmable gate array, can be used in the UAV as an on-board replanner, as well as in ground-station systems to assist in high-precision planning and modelling of mission scenarios.
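The bidirectional idea, i.e. two populations seeded at the start and goal and evolved with the shared objective of closing the gap between their path endpoints under a bounded turn rate, can be sketched in 2-D. This mutation-only toy stands in for, and does not reproduce, the SGA; all constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

START, GOAL = np.array([0.0, 0.0]), np.array([10.0, 4.0])
STEP, MAX_TURN, N_SEG = 1.0, np.pi / 6, 8  # kinematic-style limits

def rollout(origin, heading, turns):
    """Integrate a sequence of bounded turn commands into a 2-D path."""
    pos, h, path = origin.copy(), heading, [origin.copy()]
    for t in turns:
        h += t
        pos = pos + STEP * np.array([np.cos(h), np.sin(h)])
        path.append(pos.copy())
    return np.array(path)

def evolve(pop_size=40, gens=150):
    # Two populations of turn sequences, one rooted at each endpoint,
    # co-evolved pairwise (by index) to minimise the endpoint gap.
    fwd = rng.uniform(-MAX_TURN, MAX_TURN, (pop_size, N_SEG))
    bwd = rng.uniform(-MAX_TURN, MAX_TURN, (pop_size, N_SEG))
    h_f = np.arctan2(*(GOAL - START)[::-1])  # initial headings face
    h_b = np.arctan2(*(START - GOAL)[::-1])  # each other
    for _ in range(gens):
        ends_f = np.array([rollout(START, h_f, t)[-1] for t in fwd])
        ends_b = np.array([rollout(GOAL, h_b, t)[-1] for t in bwd])
        gaps = np.linalg.norm(ends_f - ends_b, axis=1)
        order = np.argsort(gaps)
        fwd, bwd = fwd[order], bwd[order]
        # Keep the best half of each population; refill with mutants.
        half = pop_size // 2
        mut = lambda T: np.clip(
            T + rng.normal(0, 0.05, T.shape), -MAX_TURN, MAX_TURN)
        fwd = np.vstack([fwd[:half], mut(fwd[:half])])
        bwd = np.vstack([bwd[:half], mut(bwd[:half])])
    return gaps.min()

gap = evolve()
```

Because both halves of the best pair survive each generation, the best gap never worsens; joining the two paths at their meeting point yields a plan built entirely from bounded-turn segments, which is what avoids discontinuities.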
Abstract:
Security of RFID authentication protocols has received considerable interest recently. However, an important aspect of such protocols that has not received as much attention is the efficiency of their communication. In this paper we investigate the efficiency benefits of pre-computation for time-constrained applications in small-to-medium RFID networks. We also outline a protocol utilizing this mechanism in order to demonstrate the benefits and drawbacks of the approach. The proposed protocol shows promising results, as it is able to offer the security of untraceable protocols whilst requiring time comparable to that of more efficient but traceable protocols.
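The time/storage trade-off behind pre-computation can be illustrated with a generic keyed-hash identification step. This is not the paper's protocol; every name and primitive below is invented for illustration.

```python
import hashlib
import secrets

def h(key: bytes, nonce: bytes) -> bytes:
    """Keyed hash a tag would compute as its response (illustrative)."""
    return hashlib.sha256(key + nonce).digest()

# Back-end server: pre-compute every known tag's expected response to
# the next reader nonce, trading storage for query-time speed.
tag_keys = {f"tag{i}": secrets.token_bytes(16) for i in range(1000)}
next_nonce = secrets.token_bytes(16)
lookup = {h(key, next_nonce): tag_id for tag_id, key in tag_keys.items()}

# Query time: the reader broadcasts next_nonce and a tag replies with
# h(key, nonce). Identification is now an O(1) table lookup instead of
# an O(N) scan hashing every candidate key, yet the response reveals
# nothing to eavesdroppers who do not hold the keys.
response = h(tag_keys["tag42"], next_nonce)
tag_id = lookup[response]
```

The drawback this makes visible is the one the abstract weighs: the table must be rebuilt per nonce and grows with the number of tags, which is why the benefit is framed for small-to-medium networks.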
Multi-level knowledge transfer in software development outsourcing projects: the agency theory view
Abstract:
In recent years, software development outsourcing has become even more complex. Outsourcing partners have begun 're-outsourcing' components of their projects to other outsourcing companies to minimize cost and gain efficiencies, creating a multi-level hierarchy of outsourcing. This research-in-progress paper presents preliminary findings of a study designed to understand the knowledge transfer effectiveness of multi-level software development outsourcing projects. We conceptualize the SD-outsourcing entities using Agency Theory. This study conceptualizes, operationalises and validates the concept of Knowledge Transfer as a three-phase multidimensional formative index of 1) domain knowledge, 2) communication behaviors, and 3) clarity of requirements. Data analysis identified substantial, significant differences between the Principal and the Agent on two of the three constructs. Using Agency Theory, supported by the preliminary findings, the paper also provides prescriptive guidelines for reducing the friction between the Principal and the Agent in multi-level software outsourcing.
Abstract:
Approximate Bayesian computation has become an essential tool for the analysis of complex stochastic models when the likelihood function is numerically unavailable. However, the well-established statistical method of empirical likelihood provides another route to such settings that bypasses simulations from the model and the choices of the approximate Bayesian computation parameters (summary statistics, distance, tolerance), while being convergent in the number of observations. Furthermore, bypassing model simulations may lead to significant time savings in complex models, for instance those found in population genetics. The Bayesian computation with empirical likelihood algorithm we develop in this paper also provides an evaluation of its own performance through an associated effective sample size. The method is illustrated using several examples, including estimation of standard distributions, time series, and population genetics models.