856 results for DYNAMICAL-SYSTEMS APPROACH
Abstract:
Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is increasing similarity in product design and manufacturing processes. The papers in this special issue of the International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference, held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organisation. Achieving the desired level of commitment from top management can, however, be difficult. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution, in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues.
It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed. This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. 
Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long-cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers, they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints, and that the approach is consistent with, and can be instrumental in, moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based.
Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars, who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern. Together, however, they also argue for, or demonstrate, the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.
Abstract:
In his writings on cybernetics, Ashby described a Law that attempts to resolve difficulties arising in complex situations: he suggested using variety to combat complexity. In this paper, we note that the delegates to the UN Framework Convention on Climate Change (UNFCCC) meeting in Kyoto, 1997, were offered a ‘simplifying solution’ to cope with the complexity of discussing multiple pollutants allegedly contributing to ‘climate change’. We assert that the adoption of CO2eq has resulted in imprecise thinking regarding the ‘carbon footprint’ – that is, ‘CO2’ – to the exclusion of other pollutants. We propose, as Ashby might have done, that the CO2eq and other factors within the ‘climate change’ negotiations be disaggregated to allow careful and specific individual solutions to be agreed on each factor. We propose that a new permanent and transparent ‘action group’ be put in charge of agenda setting and of managing the messy annual meetings. This body would be responsible for achieving accords at these annual meetings, rather than forcing this task on national hosts. We acknowledge that the task is daunting, and we recommend moving on from Ashby's Law to Beer's Viable Systems approach.
Abstract:
A range of physical and engineering systems exhibit irregular complex dynamics featuring the alternation of quiet and burst time intervals, called intermittency. The form of intermittency most popular in laser science is on-off intermittency [1], which can be understood as the conversion of noise, in a system close to an instability threshold, into effective time-dependent fluctuations that result in the alternation of stable and unstable periods. On-off intermittency has recently been demonstrated in semiconductor, Erbium-doped and Raman lasers [2-5]. The recently demonstrated random distributed feedback (random DFB) fiber laser has irregular dynamics near the generation threshold [6,7]. Here we show intermittency in a cascaded random DFB fiber laser. We study intensity fluctuations in a random DFB fiber laser based on nitrogen-doped fiber. Under appropriate pumping, the laser generates first and second Stokes components at 1120 nm and 1180 nm respectively. We study the intermittency in the radiation of the second Stokes wave. A typical time trace near the generation threshold of the second Stokes wave (Pth) is shown in Fig. 1a. From a number of sufficiently long time traces we calculate the statistical distribution of intervals between major spikes in the time dynamics, Fig. 1b. To eliminate the contribution of high-frequency components of the spikes we use a low-pass filter along with a reference value of the output power. The experimental data are fitted by a power law.
Abstract:
We discuss some main points of computer-assisted proofs based on reliable numerical computations. Such so-called self-validating numerical methods, in combination with exact symbolic manipulations, result in very powerful mathematical software tools. These tools allow mathematical statements (the existence of a fixed point, of a solution of an ODE, of a zero of a continuous function, of a global minimum within a given range, etc.) to be proved using a digital computer. To validate the assertions of the underlying theorems, fast finite-precision arithmetic is used. The results are absolutely rigorous. To demonstrate the power of reliable symbolic-numeric computations, we investigate in some detail the verification of very long periodic orbits of chaotic dynamical systems. The verification is done directly in Maple, e.g. using the Maple Power Tool intpakX or, more efficiently, using the C++ class library C-XSC.
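As a concrete illustration of such a self-validating computation, here is a toy interval Newton step in Python. The sketch is ours, not the paper's: it uses neither intpakX nor C-XSC, and it omits the directed (outward) rounding a fully rigorous implementation requires. The underlying theorem: if N(X) = m - f(m)/F'(X) is strictly contained in X = [a, b], where m is the midpoint of X and F'(X) encloses f' over all of X, then f has exactly one zero in X.

```python
# Toy interval Newton step: prove that f(x) = cos(x) - x has exactly one zero
# in X = [0.7, 0.8].  Directed rounding is omitted for brevity; a production
# tool such as C-XSC rounds every bound outward to keep the proof rigorous.
import math

a, b = 0.7, 0.8
m = 0.5 * (a + b)
fm = math.cos(m) - m

# f'(x) = -sin(x) - 1; sin is increasing on [0.7, 0.8], so an enclosure is:
d_lo = -math.sin(b) - 1.0
d_hi = -math.sin(a) - 1.0        # F'(X) = [d_lo, d_hi], entirely negative

# Interval division fm / [d_lo, d_hi] (valid because 0 is not in the divisor):
q = sorted([fm / d_lo, fm / d_hi])
n_lo, n_hi = m - q[1], m - q[0]  # N(X) = m - fm / F'(X)

contained = a < n_lo and n_hi < b
print(contained, n_lo, n_hi)     # True: a unique zero of cos(x) - x lies in X
```

Verifying a long periodic orbit works the same way in principle, with the map iterated in interval arithmetic instead of a single scalar function.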
Abstract:
2000 Mathematics Subject Classification: 35J70, 35P15.
Abstract:
The traditional use of global and centralised control methods fails for large, complex, noisy and highly connected systems, which typify many real-world industrial and commercial systems. This paper provides an efficient bottom-up design of distributed control in which many simple components communicate and cooperate to achieve a joint system goal. Each component acts individually so as to maximise its personal utility while obtaining probabilistic information on the global system merely through local message-passing. This leads to a scalable and collective control strategy for complex dynamical systems, without the problems of global centralised control. Robustness is addressed by employing a fully probabilistic design, which can cope with inherent uncertainties, can be implemented adaptively and opens a systematic, rich way to information sharing. The paper outlines this direction and tests the proposed design on a linearised version of a coupled map lattice with spatiotemporal chaos. A version close to linear-quadratic design gives an initial insight into the possible behaviours of such networks.
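A coupled map lattice of the kind used as the test bed can be sketched in a few lines. The example below is only illustrative and much simpler than the paper's fully probabilistic, message-passing design: each site of a logistic coupled map lattice applies its own proportional correction toward the homogeneous fixed point x* = 0.75, with the gain value an assumption made for illustration.

```python
# Illustrative sketch only (the paper's probabilistic design is far richer):
# a coupled map lattice (CML) of logistic maps exhibiting spatiotemporal
# chaos, plus a simple *local* feedback in which each site independently
# nudges itself toward the fixed point x* = 0.75.
import random

def step(x, eps, gain, target=0.75):
    """One synchronous CML update followed by per-site proportional feedback."""
    n = len(x)
    f = [4.0 * v * (1.0 - v) for v in x]                   # chaotic logistic map
    y = [(1.0 - eps) * f[i]
         + 0.5 * eps * (f[(i - 1) % n] + f[(i + 1) % n])   # diffusive coupling
         for i in range(n)]
    # Local control: each component acts individually on its own state only.
    return [v + gain * (target - v) for v in y]

def spread(x, target=0.75):
    """Mean squared deviation of the lattice from the target state."""
    return sum((v - target) ** 2 for v in x) / len(x)

random.seed(0)
x0 = [random.random() for _ in range(64)]

free, ctrl = x0[:], x0[:]
for _ in range(200):
    free = step(free, eps=0.3, gain=0.0)   # uncontrolled: stays chaotic
    ctrl = step(ctrl, eps=0.3, gain=0.8)   # controlled: contracts toward x*

print(spread(free), spread(ctrl))          # controlled spread is far smaller
```

The gain 0.8 suffices here because it shrinks the local expansion rate of the logistic map below one; a probabilistic or linear-quadratic design would instead derive such gains from the model and the uncertainty description.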
Abstract:
In October 2008, the 5th Environmental Management for Sustainable Universities (EMSU) international conference was held in Barcelona, Spain. It dealt with the need to rethink how our higher educational institutions are facing sustainability. This special issue has been primarily derived from contributions to that conference. It builds upon related academic international publications which have analysed how to use the critical position of universities to accelerate the transition to truly sustainable societies. The focus of this issue is on the ‘softer’ issues, such as changes in values, attitudes and motivations, as well as in curricula, societal interactions and assessments of the impacts of research. Insights derived from the interplay of the ‘softer’ issues with the ‘harder’ issues are empowering academic leaders to use leverage points effectively to make changes in operations, courses, curricula and research. Those changes are being designed to help their students and faculty build resilient and sustainable societies within the context of climate change, the Decade of Education for Sustainable Development (DESD) and the UN Millennium Development Goals (MDGs). The overall systems approach presented by Stephens and Graham provides a structured framework for systematizing change for sustainability in higher education, by stressing, on the one hand, the need for “learning to learn” and, on the other, the integration of leadership and cultural aspects. The “niche” level they propose for innovative interactions between practitioners, such as EMSU, is developed in exemplary fashion by all of the other papers in this special issue.
To highlight some of the key elements of the articles in this issue: there are proposals for new educational methods based on sustainability science, a set of inspirational criteria for SD research activities, new course-ranking and assessment methods, and results of psychological studies providing evidence that participatory approaches are the most effective way to change values among university members and thereby facilitate the development and sharing of new sustainability norms.
Abstract:
Taking a behavioral systems approach to autism, this paper introduces early hidden communicative deficits as precursors of autistic development. It argues that early identification of communication (language and cognition) impairments, followed by intensive behavioral interventions as early as infancy, may have the greatest preventive effect on the development of autism.
Abstract:
Traditional methods of financing infrastructure, which include gas taxation, tax-exempt bonds, and reserve funds, have not been able to meet the growing demand for infrastructure. Innovative financing systems have emerged to close the gap between the available and the needed financing sources. The objective of the study presented in this paper is to assess the determinants of innovative financing in U.S. transportation infrastructure using a systemic approach. An Innovation System of Systems approach is adopted for the systemic assessment, and a case-based research approach is utilized to explore the constituents of innovative financing for U.S. transportation infrastructure. The findings, which include constructs regarding the players, practices, and activities, are used to create a model that enables understanding of the dynamics of the drivers and inhibitors of innovation and, thus, to derive implications for practice. The model, along with the constructs, provides an analytical tool for practitioners in U.S. transportation infrastructure.
Abstract:
The great interest in nonlinear system identification is mainly due to the fact that a large number of real systems are complex and need to have their nonlinearities considered so that their models can be successfully used in applications of control, prediction and inference, among others. This work evaluates the application of Fuzzy Wavelet Neural Networks (FWNN) to identify nonlinear dynamical systems subjected to noise and outliers. Generally, these elements cause negative effects on the identification procedure, resulting in erroneous interpretations regarding the dynamical behavior of the system. The FWNN combines in a single structure the ability of fuzzy logic to deal with uncertainties, the multiresolution characteristics of wavelet theory, and the learning and generalization abilities of artificial neural networks. Usually, the learning procedure of these neural networks is realized by a gradient-based method which uses the mean squared error as its cost function. This work proposes the replacement of this traditional function by an Information Theoretic Learning similarity measure called correntropy. With the use of this similarity measure, higher-order statistics can be considered during the FWNN training process. For this reason, this measure is more suitable for non-Gaussian error distributions and makes the training less sensitive to the presence of outliers. In order to evaluate this replacement, FWNN models are obtained in two identification case studies: a real nonlinear system, consisting of a multisection tank, and a simulated system based on a model of the human knee joint. The results demonstrate that using correntropy as the cost function of the error backpropagation algorithm makes the identification procedure using FWNN models more robust to outliers. However, this is only achieved if the Gaussian kernel width of correntropy is properly adjusted.
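The robustness argument can be made concrete with a small numerical sketch (hypothetical error values; this is not the paper's FWNN). Under the mean squared error, each sample's contribution to the gradient grows linearly with its error, so a single outlier dominates the update; under the correntropy criterion V(e) = (1/N) sum_i exp(-e_i^2 / (2 sigma^2)), the contribution is weighted by a Gaussian kernel and vanishes for large errors.

```python
# Why correntropy is robust to outliers: compare the per-sample gradient
# weights of MSE and of the (maximised) correntropy criterion.
import math

def mse(errors):
    return sum(e * e for e in errors) / len(errors)

def correntropy(errors, sigma=1.0):
    """Similarity measure V(e): mean Gaussian kernel of the errors."""
    return sum(math.exp(-e * e / (2.0 * sigma * sigma))
               for e in errors) / len(errors)

def grad_weight_mse(e):
    # d(e^2)/de = 2e: the weight grows with the error, so outliers dominate.
    return 2.0 * e

def grad_weight_mcc(e, sigma=1.0):
    # d(exp(-e^2/2s^2))/de ~ e * exp(-e^2/2s^2): vanishes for large errors.
    return e * math.exp(-e * e / (2.0 * sigma * sigma)) / (sigma * sigma)

errors = [0.1, -0.2, 0.05, 8.0]   # hypothetical residuals; last one an outlier
print(grad_weight_mse(8.0))        # 16.0: dominates the MSE update
print(grad_weight_mcc(8.0))        # ~1e-13: the outlier is effectively ignored
print(mse(errors), correntropy(errors))
```

As the abstract notes, this behaviour hinges on the kernel width sigma: too large and correntropy behaves like MSE, too small and even ordinary residuals are suppressed.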
Abstract:
Characterization of the genomic basis underlying schistosome biology is an important strategy for the development of future treatments and interventions. Genomic sequence is now available for the three major clinically relevant schistosome species, Schistosoma mansoni, S. japonicum and S. haematobium, and this information represents an invaluable resource for the future control of human schistosomiasis. The identification of a schistosome gene product that is biologically important yet distinct from the host is the ultimate goal for many research groups. While the initial elucidation of the genome of an organism is critical for most biological research, continued improvement, or curation, of the genome construction should be an ongoing priority. In this review we discuss prominent recent findings utilizing a systems approach to schistosome biology, as well as the increased use of RNA interference (RNAi). Both of these research strategies aim to place parasite genes into a more meaningful biological perspective.
Abstract:
Active Vision Systems can be considered as dynamical systems which close the loop around artificial visual perception, controlling camera parameters and motion, and also controlling processing so as to simplify, accelerate and make visual perception more robust. Research and development in Active Vision Systems [Aloi87], [Bajc88] is a main area of interest in Computer Vision, mainly because of its potential application in scenarios where real-time performance is needed, such as robot navigation, surveillance and visual inspection, among many others. Several systems have been developed in recent years using robotic heads for this purpose...