74 results for Component-based systems


Relevance: 90.00%

Abstract:

Over the past decade or so, a number of changes have been observed in traditional Japanese employment relations (ERs) systems, such as an increase in non-regular workers, a move towards performance-based systems and a continuous decline in union membership. There is a large body of Anglo-Saxon and Japanese literature providing evidence that national factors, such as national institutions, national culture, and the business and economic environment, have significantly influenced what were hitherto three ‘sacred’ aspects of Japanese ERs systems (ERSs). However, no research has been undertaken until now at the firm level regarding the extent to which changes in national factors influence ERSs across firms. This article develops a model to examine the impact of national factors on ER systems, and analyses the impact of national factors on ER systems at the firm level. Based on information collected from two different groups of companies, namely Mitsubishi Chemical Group (MCG) and Federation of Shinkin Bank (FSB), the research finds that, except for a few similarities, the impact of national factors on Japanese ER systems differs at the firm level. This indicates that the impact of national factors varies in the implementation of employment relations factors. In the case of MCG, national culture has little to do with the seniority-based system. The study also reveals that national culture factors have little influence on the enterprise-based system in the case of FSB. This analysis is useful for domestic and international organizations, as it helps to better understand the role of national factors in determining Japanese ERSs.

Relevance: 90.00%

Abstract:

The project consists of an experimental and numerical modelling study of the application of ultra-long Raman fibre laser (URFL) based amplification techniques to high-speed multi-wavelength optical communications systems. The research focuses on telecommunications C-band transmission at 40 Gb/s data rates with direct and coherent detection. The optical transmission performance of URFL-based systems in terms of optical noise, gain bandwidth and gain flatness is evaluated for different system configurations. Systems with different overall span lengths, transmission fibre types and data modulation formats are investigated. Performance is compared with conventional Erbium-doped fibre amplifier based systems to identify configurations where URFL-based amplification provides performance or commercial advantages.

Relevance: 90.00%

Abstract:

The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems together with Enterprise 2.0. ERP is well known for efficient, highly structured business process management, whereas Enterprise 2.0 supports flexible business process management and informal, less structured interactions. Traditional studies suggest that efficiency and flexibility may be incompatible, because they are different business objectives that may exist in different organizational environments. The chapter, however, breaks with this traditional view by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficiency and flexibility of operations simultaneously. Based on multiple case studies, the chapter analyzes the benefits and risks of combining ERP with Enterprise 2.0 from the process, organization, and people paradigms. © 2013 by IGI Global.

Relevance: 90.00%

Abstract:

Conventional tools for the measurement of laser spectra (e.g. optical spectrum analysers) capture data averaged over a considerable time period. However, the generation spectrum of many laser types may involve spectral dynamics on a relatively fast time scale determined by the cavity round-trip period, calling for instrumentation featuring both high temporal and high spectral resolution. Such real-time spectral characterisation becomes particularly challenging if the laser pulses are long, or if they have continuous or quasi-continuous wave radiation components. Here we combine optical heterodyning with a technique of spatio-temporal intensity measurements that allows the characterisation of such complex sources. Fast, round-trip-resolved spectral dynamics of cavity-based systems are obtained in real time, with a temporal resolution of one cavity round trip and a frequency resolution defined by its inverse (85 ns and 24 MHz respectively are demonstrated). We also show how, under certain conditions for quasi-continuous wave sources, the spectral resolution can be further increased by a factor of 100 by direct extraction of phase information from the heterodyned dynamics or by using double time scales within the spectrogram approach.
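The round-trip-resolved measurement described in this abstract can be sketched numerically: cut the heterodyne beat trace into one-round-trip segments and Fourier transform each segment, giving one spectral slice per round trip with frequency resolution set by the inverse of the segment length. This is a hedged illustration, not the authors' implementation; the sampling rate and the noise stand-in for the beat signal are assumptions, and only the 85 ns round-trip time comes from the abstract.

```python
import numpy as np

# Sketch of round-trip-resolved spectral measurement via heterodyning.
# Assumed parameters: detector sampling rate fs; the beat signal is a
# random stand-in for a real photodetector trace.
fs = 10e9              # sampling rate of the photodetector trace, Hz (assumed)
T_rt = 85e-9           # cavity round-trip time quoted in the abstract, s
n_rt = round(fs * T_rt)  # samples per round trip

rng = np.random.default_rng(0)
beat = rng.standard_normal(n_rt * 200)   # stand-in for the heterodyne beat

# One FFT per round trip -> spectrogram of the intracavity dynamics.
slices = beat[: (len(beat) // n_rt) * n_rt].reshape(-1, n_rt)
spectrogram = np.abs(np.fft.rfft(slices, axis=1)) ** 2

# Frequency resolution is the inverse of the round-trip time,
# on the order of 10 MHz for these assumed numbers.
df = fs / n_rt
```

Each row of `spectrogram` is the optical spectrum (offset by the local-oscillator frequency) during one cavity round trip, which is the real-time picture the abstract refers to.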

Relevance: 90.00%

Abstract:

The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems with Enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. It is claimed that Enterprise 2.0 can support flexible business process management and so incorporate informal and less structured interactions. A traditional view, however, is that efficiency and flexibility are incompatible, as they are different business objectives which are pursued separately in different organizational environments. Thus an ERP system with a primary objective of improving efficiency and an Enterprise 2.0 system with a primary aim of improving flexibility may represent a contradiction and lead to a high risk of failure if adopted simultaneously. This chapter uses case study analysis to investigate the combination of ERP and Enterprise 2.0 in a single enterprise, with the aim of improving both efficiency and flexibility in operations. The chapter provides an in-depth analysis of the combination of ERP with Enterprise 2.0 based on socio-technical information systems management theory. It also summarizes the benefits of combining ERP systems and Enterprise 2.0, and how they could contribute to the development of a new generation of business management that combines both formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, while retaining the flexibility to react with agility to internal and external events.

Relevance: 80.00%

Abstract:

A plethora of techniques for the imaging of liposomes and other bilayer vesicles are available. However, sample preparation and the choice of technique should be carefully considered in conjunction with the information required. For example, larger vesicles such as multilamellar and giant unilamellar vesicles can be viewed using light microscopy, and whilst vesicle confirmation and sizing can be undertaken prior to additional physical characterisation or more detailed microscopy, the technique is limited in terms of resolution. To consider the options available for visualising liposome-based systems, a wide range of microscopy techniques is described and discussed here: these include light, fluorescence and confocal microscopy, and various electron microscopy techniques such as transmission, cryo, freeze-fracture and environmental scanning electron microscopy. Their application, advantages and disadvantages are reviewed with regard to their use in the analysis of lipid vesicles.

Relevance: 80.00%

Abstract:

This second issue of Knowledge Management Research & Practice (KMRP) continues the international nature of the first issue, with papers from authors based in four different continents. There are five regular papers, plus the first of what is intended to be an occasional series of 'position papers' from respected figures in the knowledge management field, who have specific issues they wish to raise from a personal standpoint. The first two regular papers are both based on case studies. The first is 'Aggressively pursuing knowledge management over two years: a case study at a US government organization' by Jay Liebowitz. Liebowitz is well known to both academics and practitioners as an author on knowledge management and knowledge-based systems. Government departments in many Western countries must soon face up to the problems that will occur as the 'baby boomer' generation reaches retirement age over the next decade. This paper describes how one particular US government organization has attempted to address this situation (and others) through the introduction of a knowledge management initiative. The second case study paper is 'Knowledge creation through the synthesizing capability of networked strategic communities: case study on new product development in Japan' by Mitsuru Kodama. This paper looks at the importance of strategic communities - communities that have strategic relevance and support - in knowledge management. Here, the case study organization is Nippon Telegraph and Telephone Corporation (NTT), a Japanese telecommunication firm. The third paper is 'Knowledge management and intellectual capital: an empirical examination of current practice in Australia' by Albert Zhou and Dieter Fink. This paper reports the results of a survey carried out in 2001, exploring the practices relating to knowledge management and intellectual capital in Australia and the relationship between them. The remaining two regular papers are conceptual in nature.
The fourth is 'The enterprise knowledge dictionary' by Stuart Galup, Ronald Dattero and Richard Hicks. Galup, Dattero and Hicks propose the concept of an enterprise knowledge dictionary and its associated knowledge management system architecture as offering the appropriate form of information technology to support various different types of knowledge sources, while behaving as a single source from the user's viewpoint. The fifth and final regular paper is 'Community of practice and metacapabilities' by Geri Furlong and Leslie Johnson. This paper looks at the role of communities of practice in learning in organizations. Its emphasis is on metacapabilities - the properties required to learn, develop and apply skills. This discussion takes work on learning and core competences to a higher level. Finally, this issue includes a position paper, 'Innovation as an objective of knowledge management. Part I: the landscape of management' by Dave Snowden. Snowden has been highly visible in the knowledge management community thanks to his role as the Director of IBM Global Services' Canolfan Cynefin Centre. He has helped many government and private sector organizations to consider their knowledge management problems and strategies. This, the first part of a two-part paper, is inspired by the notion of complexity. In it, Snowden calls for what he sees as a 20th century emphasis on designed systems for knowledge management to be consigned to history, and replaced by a 21st century emphasis on emergence. Letters to the editor on this, or any other topic related to knowledge management research and practice, are welcome. We trust that you will find the contributions stimulating, and again invite you to contribute your own paper(s) to future issues of KMRP.

Relevance: 80.00%

Abstract:

Knowledge elicitation is a well-known bottleneck in the production of knowledge-based systems (KBS). Past research has shown that visual interactive simulation (VIS) could effectively be used to elicit episodic knowledge that is appropriate for machine learning purposes, with a view to building a KBS. Nonetheless, the VIS-based elicitation process still has much room for improvement. Based in the Ford Dagenham Engine Assembly Plant, a research project is being undertaken to investigate the individual/joint effects of visual display level and mode of problem case generation on the elicitation process. This paper looks at the methodology employed and some issues that have been encountered to date. Copyright © 2007 Inderscience Enterprises Ltd.

Relevance: 80.00%

Abstract:

In this position paper we present the developing Fluid framework, which we believe offers considerable advantages in maintaining software stability in dynamic or evolving application settings. The Fluid framework facilitates the development of component software via the selection, composition and configuration of components. Fluid's composition language incorporates a high-level type system supporting object-oriented principles such as type description, type inheritance, and type instantiation. Object-oriented relationships are represented via the dynamic composition of component instances. This representation allows the software structure, as specified by type and instance descriptions, to change dynamically at runtime as existing types are modified and new types and instances are introduced. We therefore move from static software structure descriptions to more dynamic representations, while maintaining the expressiveness of object-oriented semantics. We show how the Fluid framework relates to existing, largely component-based, software frameworks and conclude with suggestions for future enhancements. © 2007 IEEE.
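The type description / inheritance / instantiation idea summarized in this abstract can be illustrated with a minimal sketch. This is a hypothetical analogue only, not the actual Fluid composition language or API; the names `TypeRegistry`, `define` and `instantiate` are invented for illustration. The key point it demonstrates is that types are data, so redefining a type at runtime changes the structure of subsequently created instances.

```python
# Hypothetical sketch of Fluid-style composition (invented names, not the
# real Fluid API): types are data records that can inherit from one another,
# be instantiated, and be redefined at runtime.

class TypeRegistry:
    def __init__(self):
        self._types = {}

    def define(self, name, base=None, **slots):
        """Describe (or redefine) a type, optionally inheriting from a base."""
        inherited = dict(self._types[base]) if base else {}
        inherited.update(slots)
        self._types[name] = inherited

    def instantiate(self, name, **overrides):
        """Create an instance of a type, with per-instance configuration."""
        instance = dict(self._types[name])
        instance.update(overrides)
        return instance

reg = TypeRegistry()
reg.define("Component", host="localhost")
reg.define("Renderer", base="Component", backend="gl")  # type inheritance
renderer = reg.instantiate("Renderer", backend="vulkan")  # instance config
```

Because the registry holds plain data rather than compiled classes, calling `define` again for an existing name changes the software structure without restarting, which is the dynamic-evolution property the abstract emphasises.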

Relevance: 80.00%

Abstract:

The inclusion of high-level scripting functionality in state-of-the-art rendering APIs indicates a movement toward data-driven methodologies for structuring next generation rendering pipelines. A similar theme can be seen in the use of composition languages to deploy component software using selection and configuration of collaborating component implementations. In this paper we introduce the Fluid framework, which places particular emphasis on the use of high-level data manipulations in order to develop component-based software that is flexible, extensible, and expressive. We introduce a data-driven, object-oriented programming methodology for component-based software development, and demonstrate how a rendering system with a similar focus on abstract manipulations can be incorporated, in order to develop a visualization application for geospatial data. In particular we describe a novel SAS script integration layer that provides access to vertex and fragment programs, producing a very controllable, responsive rendering system. The proposed system is very similar to developments speculatively planned for DirectX 10, but uses open standards and has cross-platform applicability. © The Eurographics Association 2007.

Relevance: 80.00%

Abstract:

Very large spatially-referenced datasets, for example, those derived from satellite-based sensors which sample across the globe or large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real-time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful when analysing data in less time critical applications, for example when interacting directly with the data for exploratory analysis, that the algorithms are responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly in the case where maximum likelihood estimation is employed. Although the storage requirements only scale linearly with the number of observations in the dataset, the computational complexity in terms of memory and speed scales quadratically and cubically, respectively. Most modern commodity hardware has at least 2 processor cores if not more. Other mechanisms for allowing parallel computation, such as Grid-based systems, are also becoming increasingly commonly available. However, currently there seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics.
By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms we show that computational time can be significantly reduced. We demonstrate this with both sparsely sampled data and densely sampled data on a variety of architectures, ranging from the common dual core processor, found in many modern desktop computers, to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic data sets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake data set.
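The block-wise likelihood-approximation idea behind this kind of parallel geostatistics can be sketched as follows. This is a hedged illustration in the spirit of the Vecchia [1988] / Tresp [2000] style approximations the abstract cites, not the authors' implementation; the exponential covariance, block count and thread pool are all assumptions. Splitting n observations into b independent blocks reduces the O(n³) Cholesky cost to roughly b·O((n/b)³), and each block's likelihood can be evaluated in parallel.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def gauss_loglik(y, cov):
    """Exact zero-mean Gaussian log-likelihood of one block of data."""
    L = np.linalg.cholesky(cov)
    alpha = np.linalg.solve(L, y)
    return (-0.5 * (alpha @ alpha)
            - np.log(np.diag(L)).sum()
            - 0.5 * len(y) * np.log(2 * np.pi))

def blocked_loglik(y, cov_fn, x, n_blocks=4):
    """Approximate log-likelihood: sum of exact block likelihoods,
    treating blocks as independent. Each block is embarrassingly parallel."""
    idx = np.array_split(np.arange(len(y)), n_blocks)
    tasks = [(y[i], cov_fn(x[i])) for i in idx]
    with ThreadPoolExecutor() as pool:
        return sum(pool.map(lambda t: gauss_loglik(*t), tasks))

# Toy example: exponential covariance on a 1-D transect (assumed model).
x = np.linspace(0.0, 10.0, 400)
cov_fn = lambda xi: (np.exp(-np.abs(xi[:, None] - xi[None, :]))
                     + 1e-8 * np.eye(len(xi)))
rng = np.random.default_rng(1)
y = rng.standard_normal(400)
ll = blocked_loglik(y, cov_fn, x)
```

With `n_blocks=1` this reduces to the exact likelihood; increasing the block count trades accuracy (ignored cross-block covariances) for cubic savings and parallel speed-up, which is the trade-off the paper exploits.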

Relevance: 80.00%

Abstract:

This paper identifies the important limiting processes in transmission capacity for amplified soliton systems. Some novel control techniques are described for optimizing this capacity. In particular, dispersion compensation and phase conjugation are identified as offering good control of jitter without the need for many new components in the system. An advanced average soliton model is described and demonstrated to permit large amplifier spacing. The potential for solitons in high-dispersion land-based systems is discussed and results are presented showing 10 Gbit s$^{-1}$ transmission over 1000 km with significant amplifier spacing.

Relevance: 80.00%

Abstract:

In practical terms, any result obtained using an ordered weighted averaging (OWA) operator depends heavily upon the method used to determine the weighting vector. Several approaches for obtaining the associated weights have been suggested in the literature, but none of them takes into account the preferences of the alternatives. This paper presents a method for determining the OWA weights when the preferences of the alternatives across all the criteria are considered. An example is given to illustrate the method, and an application to internet search engines shows the use of this new OWA operator.
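For reference, the generic OWA operator that this abstract builds on can be sketched in a few lines; this shows the standard definition only, not the paper's preference-aware method for determining the weights. The arguments are sorted in descending order before weighting, so the weights attach to rank positions rather than to particular criteria.

```python
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: weights apply to the sorted values.
    weights must be non-negative and sum to 1."""
    ordered = np.sort(np.asarray(values, dtype=float))[::-1]  # descending
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0)
    return float(ordered @ w)

# Special cases of the weighting vector:
#   (1, 0, 0) -> max;  (0, 0, 1) -> min;  uniform -> arithmetic mean
owa([0.2, 0.9, 0.5], [1, 0, 0])   # -> 0.9 (the maximum)
```

The behaviour of the operator (optimistic, pessimistic, or averaging) is therefore governed entirely by the weighting vector, which is why its determination, the subject of the paper, matters so much in practice.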

Relevance: 80.00%

Abstract:

Liposomes have been imaged using a plethora of techniques. However, few of these methods offer the ability to study these systems in their natural hydrated state, without drying, staining, and fixation of the vesicles. Yet imaging a liposome in its hydrated state is the ideal scenario for visualising these dynamic lipid structures, and environmental scanning electron microscopy (ESEM), with its ability to image wet systems without prior sample preparation, offers potential advantages over the above methods. In our studies, we have used ESEM not only to investigate the morphology of liposomes and niosomes but also to dynamically follow the changes in structure of lipid films and liposome suspensions as water condenses on to or evaporates from the sample. In particular, changes in liposome morphology were studied using ESEM in real time to investigate the resistance of liposomes to coalescence during dehydration, thereby providing an alternative assay of liposome formulation and stability. Based on this protocol, we have also studied niosome-based systems and cationic liposome/DNA complexes. Copyright © Informa Healthcare.

Relevance: 80.00%

Abstract:

This book challenges the accepted notion that the transition from the command economy to market-based systems is complete across the post-Soviet space. While it is noted that different political economies have developed in such states, such as Russia’s ‘managed democracy’, events such as Ukraine gaining ‘market economy status’ from the European Union and acceding to the World Trade Organisation in 2008 are taken as evidence that the reform period is over. Such thinking is based on numerous assumptions: specifically, that economic transition has defined start and end points, that the formal economy now has primacy over other forms of economic practice, and that national economic growth leads to the ‘trickle down’ of wealth to those marginalised by the transition process. Based on extensive ethnographic and quantitative research conducted in Ukraine and Russia between 2004 and 2007, this book questions these assumptions, arguing that the economies operating across post-Soviet spaces are far from the textbook idea of a market economy. Through this, the whole notion of ‘transition’ is problematised and the importance of informal economies to everyday life is demonstrated. Using case studies of various sectors, such as entrepreneurial behaviour and the higher education system, it is also shown how corruption has invaded almost all sectors of post-Soviet everyday life.