814 results for Calculus of operations.
Abstract:
Purpose: Environmental turbulence including rapid changes in technology and markets has resulted in the need for new approaches to performance measurement and benchmarking. There is a need for studies that attempt to measure and benchmark upstream, leading or developmental aspects of organizations. Therefore, the aim of this paper is twofold. The first is to conduct an in-depth case analysis of lead performance measurement and benchmarking leading to the further development of a conceptual model derived from the extant literature and initial survey data. The second is to outline future research agendas that could further develop the framework and the subject area.
Design/methodology/approach: A multiple case analysis involving repeated in-depth interviews with managers in organisational areas of upstream influence in the case organisations.
Findings: It was found that the effect of external drivers for lead performance measurement and benchmarking was mediated by organisational context factors such as the level of progression in business improvement methods. Moreover, although the business improvement methods legitimated for this purpose were typical ones, their use had been extended beyond their original purpose through the development of bespoke sets of lead measures.
Practical implications: Examples of methods and lead measures are given that can be used by organizations in developing a programme of lead performance measurement and benchmarking.
Originality/value: There is a paucity of in-depth studies relating to the theory and practice of lead performance measurement and benchmarking in organisations.
Abstract:
Purpose – This paper explores the factors which determine the degree of knowledge transfer in inter-firm new product development projects. We test a theoretical model exploring how inter-firm knowledge transfer is enabled or hindered by a buyer's learning intent, the degree of supplier protectiveness, inter-firm knowledge ambiguity, and absorptive capacity. Design/methodology/approach – A sample of 153 R&D-intensive manufacturing firms in the UK automotive, aerospace, pharmaceutical, electrical, chemical, and general manufacturing industries was used to test the framework. Two-step structural equation modelling in AMOS 7.0 was used to analyse the data. Findings – Our results indicate that a buyer's learning intent increases inter-firm knowledge transfer, but also acts as an incentive for suppliers to protect their knowledge. Such defensive measures increase the degree of inter-firm knowledge ambiguity, encouraging buyer firms to invest in absorptive capacity as a means to interpret supplier knowledge, which in turn also increases the degree of knowledge transfer. Practical implications – Our paper illustrates the effects of focusing on acquiring, rather than accessing, supplier technological knowledge. We show that an overt learning strategy can be detrimental to knowledge transfer between buyer and supplier, as suppliers react by restricting the flow of information. Organisations are encouraged to consider this dynamic when engaging in multi-organisational new product development projects. Originality/value – This paper examines the dynamics of knowledge transfer within inter-firm NPD projects, showing how transfer is influenced by the buyer firm's learning intention, the supplier's response, and the characteristics of the relationship and of the knowledge to be transferred.
Abstract:
Keeping a record of operator experience remains a challenge for operations management and a major source of inefficiency in information management. The objective is to develop a framework that enables an explicit representation of experience based on information use. A purposive sampling method is used to select four small and medium-sized enterprises as case studies. The unit of analysis is the production process in the machine shop. Data are collected by structured interview, observation and documentation. A comparative case analysis is applied. The findings suggest that experience is an accumulation of tacit information feedback, which can be made explicit in an information use interoperability matrix. The matrix is conditioned upon an information use typology, which is strategic in waste reduction. The limitations include the difficulty of participant anonymity where the organisation nominates a participant. Areas for further research include application of the concepts to knowledge management and shop-floor resource management.
Abstract:
Today there is a growing interest in integrating health monitoring applications into portable devices, necessitating the development of methods that improve the energy efficiency of such systems. In this paper, we present a systematic approach that enables energy-quality trade-offs in spectral analysis systems for bio-signals, which are useful in monitoring various health conditions such as those associated with heart rate. To enable such trade-offs, the processed signals are initially expressed in a basis in which the significant components that carry most of the relevant information can be easily distinguished from the parts that influence the output to a lesser extent. Such a classification allows the pruning of operations associated with the less significant signal components, leading to power savings with minor quality loss, since only the less useful parts are pruned under the given requirements. To exploit the attributes of the modified spectral analysis system, thresholding rules are determined and adopted at design- and run-time, allowing the static or dynamic pruning of less useful operations based on the accuracy and energy requirements. The proposed algorithm is implemented on a typical sensor node simulator, and results show up to 82% energy savings when static pruning is combined with voltage and frequency scaling, compared to the conventional algorithm in which such trade-offs were not available. In addition, experiments with numerous cardiac samples from various patients show that such energy savings come with a 4.9% average accuracy loss, which does not affect the system's ability to detect sinus arrhythmia, which was used as a test case.
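The pruning idea described in this abstract can be sketched in a few lines. This is an illustrative sketch only: the paper's exact basis, thresholding rules and hardware model are not reproduced here, so the FFT, the `keep_fraction` knob, the `prune_spectrum` name and the test signal are all assumptions.

```python
import numpy as np

def prune_spectrum(x, keep_fraction=0.1):
    """Keep only the largest-magnitude spectral coefficients.

    Zeroing small coefficients stands in for pruning the operations that
    would otherwise compute and process them; the retained fraction acts
    as the energy-quality knob.
    """
    X = np.fft.rfft(x)
    k = max(1, int(keep_fraction * len(X)))
    # Indices of the k largest-magnitude coefficients (the "significant" parts).
    keep = np.argsort(np.abs(X))[-k:]
    X_pruned = np.zeros_like(X)
    X_pruned[keep] = X[keep]
    return np.fft.irfft(X_pruned, n=len(x))

# A heart-rate-like test signal: a dominant ~1.2 Hz component plus noise.
fs = 100.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)

x_hat = prune_spectrum(x, keep_fraction=0.05)
rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
```

Because the signal's energy is concentrated in a few coefficients, keeping only 5% of the spectrum reconstructs it with a small relative error, mirroring the paper's observation that aggressive pruning costs little accuracy on sparse bio-signals.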
Abstract:
It is well known that the absolute magnitudes (H) in the MPCORB and ASTORB orbital element catalogs suffer from a systematic offset. Jurić et al. (2002) found a 0.4 mag offset in the SDSS data, and detailed light curve studies of WISE asteroids by Pravec et al. (2012) revealed size-dependent offsets of up to 0.5 mag. The offsets are thought to be caused by systematic errors introduced by earlier surveys using different photometric catalogs and filters. The next generation of asteroid surveys provides an order of magnitude more asteroids and well-defined, calibrated magnitudes. The Pan-STARRS 1 telescope (PS1) has observed hundreds of thousands of asteroids, submitted more than 2 million detections to the Minor Planet Center (MPC) and discovered almost 300 NEOs since the beginning of operations in late 2010. We transformed the observed apparent magnitudes of PS1-detected asteroids from the gP1, rP1, iP1, yP1, zP1 and wP1 bands into the Johnson photometric system by assuming the mean S- and C-type asteroid colors (Fitzsimmons 2011 - personal communication, Schlafly et al. 2012, Magnier et al. 2012 - in preparation) and calculated the absolute magnitude (H) in the V-band and its uncertainty (Bowell et al., 1989) for more than 200,000 known asteroids having on average 6.7 detections per object. The H residuals with respect to the MPCORB catalog revealed a mean offset of -0.49 ± 0.30 mag, in good agreement with published values. We will also discuss the statistical and systematic errors in H and the slope parameter G.
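For reference, the H computation cited in this abstract (the H,G system of Bowell et al., 1989) can be sketched as follows. The phase-function constants are the standard published ones; the magnitude and geometry values in the example are invented for illustration and are not PS1 data.

```python
import math

def absolute_magnitude(V, r_au, delta_au, alpha_deg, G=0.15):
    """Absolute magnitude H from an apparent V magnitude, H,G system
    (Bowell et al. 1989).

    V         -- apparent V-band magnitude
    r_au      -- heliocentric distance [AU]
    delta_au  -- geocentric distance [AU]
    alpha_deg -- solar phase angle [deg]
    G         -- slope parameter (0.15 is the conventional default)
    """
    a = math.radians(alpha_deg)
    # Standard H,G phase functions.
    phi1 = math.exp(-3.33 * math.tan(a / 2) ** 0.63)
    phi2 = math.exp(-1.87 * math.tan(a / 2) ** 1.22)
    return (V - 5 * math.log10(r_au * delta_au)
            + 2.5 * math.log10((1 - G) * phi1 + G * phi2))

# At zero phase angle the phase correction vanishes and
# H reduces to V - 5 log10(r * delta).
H0 = absolute_magnitude(V=15.0, r_au=2.0, delta_au=1.0, alpha_deg=0.0)
```

A catalog-vs-survey offset like the -0.49 mag found above would show up as a constant shift between H values computed this way and the H values listed in MPCORB.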
Abstract:
As pressure for companies to improve their environmental performance has intensified in recent years, research attention has shifted away from establishing a link between environmental practices and performance towards consideration of other factors that might facilitate performance improvements. This paper has two key purposes: first, to investigate whether internal support processes interact with pollution prevention by positively moderating the relationship between pollution prevention and environmental performance; and second, to assess whether the relationship between pollution prevention and cost performance is mediated by environmental performance.
Design/methodology/approach
The study uses a cross-sectional survey of 1,200 UK-based food processing firms to gather information on environmental practices and performance. Regression analysis was conducted on a sample of 149 responding firms to assess the hypothesised relationships.
Findings
Support was found for two of the four hypothesised moderated relationships, suggesting that internal support processes enhance the environmental performance of some pollution prevention practices. The results provided strong support for a mediated relationship between pollution prevention, environmental performance and cost performance.
Originality/value
This study provides an original contribution to the literature on the performance outcomes of environmental practices by considering a number of indirect relationships between environmental practices and performance. This has implications for the interpretation of the relationship between environmental practices and performance.
Abstract:
We introduce a fractional calculus of variations on the time scales ℤ and (hℤ). We establish the first and second necessary optimality conditions. Some numerical examples are given that illustrate the use of both the new Euler–Lagrange condition and the new Legendre-type condition. We also introduce new definitions of fractional derivative and fractional integral on a time scale via the inverse generalized Laplace transform.
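For orientation only (the abstract does not reproduce its definitions), the non-fractional delta Euler–Lagrange equation on the time scale hℤ takes the familiar discrete form below; the fractional conditions established in the work generalize the difference operator involved.

```latex
% Delta derivative on h\mathbb{Z}:
\Delta_h x(t) = \frac{x(t+h) - x(t)}{h},
% Euler--Lagrange equation for a Lagrangian
% L = L\bigl(t, x^{\sigma}(t), \Delta_h x(t)\bigr):
\frac{\partial L}{\partial x^{\sigma}}\bigl(t, x^{\sigma}(t), \Delta_h x(t)\bigr)
  - \Delta_h \frac{\partial L}{\partial (\Delta_h x)}\bigl(t, x^{\sigma}(t), \Delta_h x(t)\bigr) = 0.
```

As h → 0 this recovers the classical Euler–Lagrange equation, and h = 1 gives the case ℤ mentioned in the abstract.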
Abstract:
We generalize the Hahn variational calculus to problems of the calculus of variations involving higher-order derivatives. We study the symmetric quantum calculus, namely the alpha,beta-symmetric, q-symmetric and Hahn-symmetric calculus. We introduce the symmetric quantum variational calculus and derive Euler–Lagrange-type equations for the q-symmetric and Hahn-symmetric calculus. We define the symmetric derivative on time scales and derive some of its properties. Finally, we introduce and study the diamond integral, which generalizes the diamond-alpha integral of time scales.
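As a point of reference for the operators named above, the standard alpha,beta-symmetric and q-symmetric derivatives from the symmetric quantum calculus literature are reproduced below; the abstract itself does not state its definitions, so these are given for orientation only.

```latex
% \alpha,\beta-symmetric derivative (\alpha, \beta > 0):
D_{\alpha,\beta} f(t) = \frac{f(t+\alpha) - f(t-\beta)}{\alpha + \beta},
% q-symmetric derivative (0 < q < 1, \; t \neq 0):
\tilde{D}_q f(t) = \frac{f(qt) - f\!\left(q^{-1}t\right)}{\left(q - q^{-1}\right) t}.
```

Both reduce to the ordinary derivative in the appropriate limit (alpha, beta → 0 and q → 1, respectively), which is what makes them natural building blocks for a symmetric variational calculus.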
Abstract:
Field lab: Entrepreneurial and innovative ventures
Abstract:
The study examines international cooperation in product development in software development organisations. The software industry is known for its global nature and knowledge-intensity, which makes it an interesting setting in which to examine international cooperation. Software development processes are increasingly distributed worldwide, but for the small and even medium-sized enterprises typical of the software industry, such distribution of operations is often possible only in association with crossing the company's boundaries. The strategic decision-making of companies is likely to be affected by the characteristics of the industry, and this includes decisions about cooperation or sourcing. The objective of this thesis is to provide a holistic view of the factors affecting decisions about offshore sourcing in software development. Offshore sourcing refers to a cooperative mode of offshoring, in which a firm does not establish its own presence in a foreign country but utilises a local supplier. The study examines product development activities that are distributed across organisational and geographical boundaries. The objective can be divided into two subtopics: general reasons for international cooperation in product development, and particular reasons for cooperation between Finnish and Russian companies. The focus is on the strategic rationale at the company level, in particular in small and medium-sized enterprises. The theoretical discourse of the study builds upon the literature on international cooperation and networking, with particular focus on cooperation with foreign suppliers and within product development activities. The resource-based view is also discussed, as the heterogeneity and interdependency of the resources possessed by different firms are seen as factors motivating international cooperation.
Strategically, sourcing can be used to access resources possessed by an industrial network, to enhance the product development of a firm, or to optimise its cost structure. In order to investigate the issues raised by the theoretical review, two empirical studies on international cooperation in software product development were conducted. The emphasis of the empirical part of the study is on cooperation between Finnish and Russian companies. The data were gathered through four case studies of Finnish software development organisations and four case studies of Russian offshore suppliers. Based on the material from the case studies, a framework clarifying and grouping the factors that influence offshore sourcing decisions was built. The findings indicate that decisions regarding offshore sourcing in software development are far more complex than generally assumed. The framework provides a holistic view of the factors affecting decisions about offshore sourcing in software development, capturing the multidimensionality of the motives for entering offshore cooperation. Four groups of factors emerged from the data: A) strategy-related aspects, B) aspects related to resources and capabilities, C) organisation-related aspects, and D) aspects related to the entrepreneur or management. By developing a holistic framework of decision factors, the research offers an in-depth theoretical understanding of the offshore sourcing rationale in product development. From the managerial point of view, the proposed framework sums up the issues that a firm should pay attention to when contemplating product development cooperation with foreign suppliers. Understanding the different components of sourcing decisions can lead to improved preconditions for strategising and engaging in offshore cooperation. A thorough decision-making process should carefully consider all the possible benefits and risks of product development cooperation.
Abstract:
My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first chapter, we develop a computationally efficient procedure for state smoothing in linear Gaussian state-space models. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyse the computational efficiency of methods based on the Kalman filter, the Cholesky-factor algorithm, and our new method, using operation counts and computational experiments. We show that for many important cases our method is the most efficient. The gains are particularly large when the dimension of the observed variables is large, or when repeated draws of the states are required for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, which is used to analyse transaction count data from financial markets. In the second chapter, we propose a new technique for analysing multivariate stochastic volatility models. The proposed method is based on drawing the volatility efficiently from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the return equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, permitting different Student-t marginals with asset-specific degrees of freedom to capture the heterogeneity of returns.
We draw the volatility as a block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of the volatility of one return given the volatilities of the other returns, the parameters and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and two multivariate models. In the third chapter, we evaluate the information contributed by realized volatility measures to the estimation and forecasting of volatility when prices are measured with and without error. We use stochastic volatility models. We take the viewpoint of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity that contains information about it. We employ Bayesian Markov chain Monte Carlo methods to estimate the models, which allow the formulation not only of posterior densities of the volatility, but also of predictive densities of future volatility. We compare volatility forecasts, and the hit rates of forecasts, that do and do not use the information contained in realized volatility. This approach differs from existing ones in the empirical literature, which are mostly limited to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns of stock indices and exchange rates. The competing models are applied to the second half of 2008, a notable period in the recent financial crisis.
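The Kalman filter that the first chapter's operation counts are benchmarked against can be sketched, under simplifying assumptions, for a scalar linear Gaussian state-space model; the model, the function name `kalman_filter` and the values below are illustrative and not taken from the thesis.

```python
import numpy as np

def kalman_filter(y, T, Z, Q, H, a1, P1):
    """Filtered state means and variances for the linear Gaussian model

        alpha_{t+1} = T * alpha_t + eta_t,   eta_t ~ N(0, Q)
        y_t         = Z * alpha_t + eps_t,   eps_t ~ N(0, H)

    written with scalar state and observation for clarity."""
    a, P = a1, P1
    means, variances = [], []
    for yt in y:
        # Update step: condition on observation y_t.
        F = Z * P * Z + H          # innovation variance
        K = P * Z / F              # Kalman gain
        a = a + K * (yt - Z * a)
        P = P - K * Z * P
        means.append(a)
        variances.append(P)
        # Prediction step: propagate to alpha_{t+1}.
        a = T * a
        P = T * P * T + Q
    return np.array(means), np.array(variances)

# Sanity check: with near-noiseless observations (tiny H), the
# filtered mean should track the observations almost exactly.
y = np.array([1.0, 2.0, 3.0])
m, v = kalman_filter(y, T=1.0, Z=1.0, Q=0.1, H=1e-8, a1=0.0, P1=10.0)
```

Smoothing and simulation-smoothing methods, including the Cholesky-factor approach compared in the chapter, build on these same forward recursions with an additional backward pass, which is where the operation-count differences the chapter analyses arise.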
Abstract:
Information and communication technologies are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place. But ICT has radically changed the magnitude of this exchange, and thus factors such as the timeliness of information and information dissemination patterns have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and the institutions that manage these resources are indeed invaluable. The library and information centres therefore have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, the library provides services based on different types of documents, such as manuscripts and printed and digital materials. At the same time, the acquisition, access, processing and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in providing assistance in navigating and analysing tremendous amounts of knowledge with a variety of digital tools. Thus, modern libraries are increasingly being re-defined as places to get unrestricted access to information in many formats and from many sources. The research was conducted in the university libraries of Kerala State, India. It was identified that, even though information resources are flooding in the world over and several technologies have emerged to manage the situation and provide effective services to clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum level. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the services provided.
There are many good examples the world over of the application of ICTs in libraries for the maximisation of services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study looked into how effectively modern ICTs have been adopted in our libraries for maximising the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data were collected from library users (students as well as faculty), library professionals and university librarians, using structured questionnaires. This was supplemented by observation of the working of the libraries, discussions and interviews with the different types of users and staff, a review of the literature, etc. Personal observations were made of the organisational set-up, management practices, functions, facilities, resources, and the utilisation of information resources and facilities by the users of the university libraries in Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation and trend analysis were used to analyse the data. All the libraries could exploit only a very few of the possibilities of modern ICTs, and hence they could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services. Because of this, the users as well as the professionals are dissatisfied.
Functional effectiveness in the acquisition, access and processing of information resources in various formats; development and maintenance of OPAC and WebOPAC; digital document delivery to remote users; web-based clearing of library counter services and resources; development of full-text databases, digital libraries and institutional repositories; consortia-based operations for e-journals and databases; user education and information literacy; professional development with stress on ICTs; network administration and website maintenance; and the marketing of information are the major areas needing special attention to improve the situation. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are some of the major hurdles to reaping the maximum possibilities of ICTs in the university libraries of Kerala. The principles of Business Process Re-engineering were found suitable for effectively re-structuring and re-defining the operations and service systems of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management systems were too rigid to adopt the principles of change management. Hence, a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main and off campuses of the universities, in affiliated colleges and at remote stations. A uniform staff policy, similar to that prevailing in CSIR, DRDO, ISRO, etc., has been proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education, and the integrated and planned development of school, college, research and public library systems, were also justified for reaping the maximum benefits of modern ICTs.