933 results for Optimal fusion performance
Abstract:
The amplification of demand variation up a supply chain, widely termed 'the Bullwhip Effect', is disruptive, costly and something that supply chain management generally seeks to minimise. It was originally attributed to poor system design: deficiencies in policies, organisational structure and delays in material and information flow all lead to sub-optimal reorder point calculation. It has since been attributed to exogenous random factors such as uncertainties in demand, supply and distribution lead time, but these causes are not exhaustive, as subsequent academic and operational studies have shown that orders and/or inventories can exhibit significant variability even when customer demand and lead time are deterministic. This widening range of possible causes of dynamic behaviour indicates that our understanding of the phenomenon is far from complete. One possible, yet previously unexplored, factor that may influence dynamic behaviour in supply chains is the application and operation of supply chain performance measures. Organisations monitoring and responding to their adopted key performance metrics will make operational changes, and this action may influence the level of dynamics within the supply chain, possibly degrading the performance of the very system the measures were intended to assess. To explore this, a plausible abstraction of the operational responses to the Supply Chain Council's SCOR® (Supply Chain Operations Reference) model was incorporated into a classic Beer Game distribution representation, using the dynamic discrete event simulation software Simul8. During the simulation the five SCOR Supply Chain Performance Attributes (Reliability, Responsiveness, Flexibility, Cost and Utilisation) were continuously monitored and compared with established targets. Operational adjustments to the reorder point, transportation modes and production capacity (where appropriate) were made for three independent supply chain roles, and the degree of dynamic behaviour in the supply chain was measured using the ratio of the standard deviation of upstream demand to the standard deviation of downstream demand. Factors employed to build the detailed model include variable retail demand, order transmission, transportation delays, production delays, capacity constraints, demand multipliers and demand averaging periods. Five dimensions of supply chain performance were monitored independently in three autonomous supply chain roles and operational settings adjusted accordingly. The uniqueness of this research stems from the application of the five SCOR performance attributes with modelled operational responses in a dynamic discrete event simulation model. The project's primary contribution to knowledge is a measurement of the impact of applying a representative performance measurement system on supply chain dynamics.
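As a concrete illustration of the dynamics measure described above, the following minimal Python sketch computes the bullwhip ratio as the standard deviation of upstream orders divided by that of downstream demand. The series used are hypothetical and are not output from the Simul8 model.

```python
import numpy as np

def bullwhip_ratio(upstream_orders, downstream_demand):
    """Ratio of upstream order variability to downstream demand variability.

    A value above 1 indicates amplification of demand variation
    (the Bullwhip Effect); a value below 1 indicates smoothing.
    """
    return np.std(upstream_orders, ddof=1) / np.std(downstream_demand, ddof=1)

# Illustrative series (hypothetical values, not simulation output):
retail_demand = np.array([100, 104, 98, 101, 97, 105, 99, 102])
factory_orders = np.array([100, 112, 85, 110, 80, 120, 90, 108])
print(bullwhip_ratio(factory_orders, retail_demand))  # > 1 => amplification
```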
Abstract:
Proper maintenance of plant items is crucial for the safe and profitable operation of process plants. The relevant maintenance policies fall into the following four categories: (i) preventive/opportunistic/breakdown replacement policies, (ii) inspection/inspection-repair-replacement policies, (iii) restorative maintenance policies, and (iv) condition-based maintenance policies. For correlating failure times of component equipment and complete systems, the Weibull failure distribution has been used. A new powerful method, SEQLIM, has been proposed for the estimation of the Weibull parameters, particularly when maintenance records contain very few failures and many successful operation times. When a system consists of a number of replaceable, ageing components, an opportunistic replacement policy has been found to be cost-effective, and a simple opportunistic model has been developed. Inspection models with various objective functions have been investigated. It was found that, on the assumption of a negative exponential failure distribution, all models converge to the same optimal inspection interval, provided the safety components are very reliable and the demand rate is low. When deterioration becomes a contributory factor to some failures, periodic inspections calculated from the above models are too frequent; a case of safety trip systems has been studied. A highly effective restorative maintenance policy can be developed if the performance of the equipment in this category can be related to some predictive modelling, and a novel fouling model has been proposed to determine cleaning strategies for condensers. Condition-based maintenance policies have been investigated, and a simple gauge has been designed for condition monitoring of relief valve springs. A typical case of an exothermic inert gas generation plant has been studied to demonstrate how the various policies can be applied to devise overall maintenance actions.
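The SEQLIM procedure itself is not reproduced here, but the following sketch shows a standard maximum-likelihood treatment of the situation the abstract describes: a Weibull fit to maintenance records containing a few failures and many successful (right-censored) operation times. All numbers are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def weibull_neg_loglik(params, failures, suspensions):
    """Negative log-likelihood for right-censored Weibull data.

    failures    -- times at which components actually failed
    suspensions -- times at which components were still running (censored)
    """
    beta, eta = params
    if beta <= 0 or eta <= 0:
        return np.inf
    f = np.asarray(failures, dtype=float)
    s = np.asarray(suspensions, dtype=float)
    ll_fail = np.sum(np.log(beta / eta) + (beta - 1) * np.log(f / eta)
                     - (f / eta) ** beta)
    ll_susp = -np.sum((s / eta) ** beta)
    return -(ll_fail + ll_susp)

# Hypothetical maintenance record: 3 failures, many successful run times.
failures = [1200.0, 2100.0, 3050.0]          # hours to failure
suspensions = [2500.0] * 10 + [4000.0] * 5   # hours still running

result = minimize(weibull_neg_loglik, x0=[1.5, 2500.0],
                  args=(failures, suspensions), method="Nelder-Mead")
beta_hat, eta_hat = result.x
print(f"shape (beta) = {beta_hat:.2f}, scale (eta) = {eta_hat:.0f} h")
```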
Abstract:
We address the question of how to obtain effective fusion of identification information such that it is robust to the quality of this information. As well as technical issues, data fusion is encumbered with a collection of (potentially confusing) practical considerations. These considerations are described in the early chapters, in which a framework for data fusion is developed. Following this process of diversification it becomes clear that the original question is not well posed and requires more precise specification. We use the framework to focus on some of the technical issues relevant to the question being addressed. We show that fusion of hard decisions through an adaptive version of the maximum a posteriori decision rule yields acceptable performance. Better performance is possible using probability-level fusion, as long as the probabilities are accurate. Of particular interest is the prevalence of overconfidence and the effect it has on fused performance. The production of accurate probabilities from poor quality data forms the latter part of the thesis. Two approaches are taken: first, the probabilities may be moderated at source (either analytically or numerically); second, they may be transformed at the fusion centre. In each case an improvement in fused performance is demonstrated. We therefore conclude that, to obtain robust fusion, care should be taken to model the probabilities accurately, either at the source or centrally.
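As a simplified, hypothetical illustration of the two fusion levels contrasted above (not the thesis's adaptive MAP rule), the sketch below fuses class posteriors from several sources with the product rule under a conditional-independence assumption, and compares this with hard-decision fusion in which each source reports only its MAP class and the centre takes a majority vote.

```python
import numpy as np

def fuse_probabilities(posteriors, prior):
    """Probability-level fusion under a conditional-independence assumption.

    posteriors -- array of shape (n_sources, n_classes), one posterior per row
    prior      -- class prior, shape (n_classes,)
    Combines sources with the product rule (in log space) and renormalises.
    """
    posteriors = np.asarray(posteriors)
    n_sources = posteriors.shape[0]
    log_fused = np.sum(np.log(posteriors), axis=0) - (n_sources - 1) * np.log(prior)
    fused = np.exp(log_fused - log_fused.max())
    return fused / fused.sum()

def fuse_hard_decisions(posteriors):
    """Hard-decision fusion: each source reports only its MAP class; majority vote."""
    posteriors = np.asarray(posteriors)
    votes = np.argmax(posteriors, axis=1)
    return np.bincount(votes, minlength=posteriors.shape[1]).argmax()

prior = np.array([0.5, 0.5])
posteriors = np.array([[0.80, 0.20],   # confident source
                       [0.55, 0.45],   # weak source
                       [0.40, 0.60]])  # weak, disagreeing source
print(fuse_probabilities(posteriors, prior))   # fused posterior over classes
print(fuse_hard_decisions(posteriors))         # class index from majority vote
```

Overconfident sources pull the product-rule fusion strongly towards their own decision, which is why the accuracy (moderation) of the reported probabilities matters for fused performance.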
Abstract:
A number of researchers have investigated the impact of network architecture on the performance of artificial neural networks. Particular attention has been paid to the architectural issues affecting the performance of the multi-layer perceptron, and to the use of various strategies to attain an optimal network structure. However, there are still perceived limitations of the multi-layer perceptron, and networks that employ a different architecture have gained in popularity in recent years, particularly networks that implement a more localised solution, where the solution in one area of the problem space does not impact, or has minimal impact on, other areas of the space. In this study, we discuss the major architectural issues affecting the performance of a multi-layer perceptron, before examining in detail the performance of a new localised network, the bumptree. The work presented here examines the impact on performance of employing alternatives to the long-established multi-layer perceptron, in particular networks in which each parameter of the final architecture has a localised impact on the problem space being modelled. The alternatives examined are the radial basis function and bumptree neural networks, and the impact of architectural issues on the performance of these networks is examined. Particular attention is paid to the bumptree, with new techniques examined both for developing the bumptree structure and for employing this structure to classify patterns.
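A minimal sketch of the kind of localised network discussed above, a radial basis function model with fixed Gaussian centres and a linear output layer fitted by least squares, is given below; the task, centre placement and basis width are hypothetical architectural choices, and the bumptree itself is not reproduced.

```python
import numpy as np

def rbf_design_matrix(x, centres, width):
    """Gaussian basis functions: each column responds only near its centre."""
    return np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * width ** 2))

# Hypothetical 1-D regression task.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

centres = np.linspace(0, 1, 10)   # architectural choices: number and
width = 0.08                      # spread of the localised units
Phi = rbf_design_matrix(x, centres, width)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # linear output layer
y_hat = Phi @ weights
print(f"RMSE = {np.sqrt(np.mean((y - y_hat) ** 2)):.3f}")
```

Because each Gaussian unit responds only near its centre, adjusting one weight changes the fit mainly in that local region, in contrast to the global effect of a multi-layer perceptron weight.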
Abstract:
The IEEE 802.11 standard has achieved huge success in the past decade and is still being developed to provide higher physical data rates and better quality of service (QoS). An important problem for the development and optimization of IEEE 802.11 networks is the modeling of the MAC layer channel access protocol. Although there are already many theoretical analyses of the 802.11 MAC protocol in the literature, most models focus on saturated traffic and assume an infinite buffer at the MAC layer. In this paper we develop a unified analytical model for the IEEE 802.11 MAC protocol in ad hoc networks. The impacts of channel access parameters, traffic rate and buffer size at the MAC layer are modeled with the assistance of a generalized Markov chain and an M/G/1/K queue model. The throughput, packet delivery delay and dropping probability can then be obtained. Extensive simulations show the analytical model is highly accurate. The model shows that, for practical buffer configurations (e.g. buffer size larger than one), we can maximize the total throughput and reduce the packet blocking probability (due to limited buffer size) and the average queuing delay to zero by effectively controlling the offered load. The average MAC layer service delay, as well as its standard deviation, is also much lower than in saturated conditions and has an upper bound. It is also observed that the optimal load is very close to the maximum achievable throughput regardless of the number of stations or buffer size. Moreover, the model is scalable for performance analysis of 802.11e in unsaturated conditions and of 802.11 ad hoc networks with heterogeneous traffic flows. © 2012 KSI.
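The paper's model is an M/G/1/K queue coupled to a MAC-layer Markov chain; as a much simpler stand-in, the sketch below evaluates the closed-form blocking probability of an M/M/1/K queue (exponential service assumed) to show qualitatively how the dropping probability depends on offered load and buffer size.

```python
def mm1k_blocking_probability(rho, K):
    """Blocking probability of an M/M/1/K queue (K packets held at most,
    including the one in service).  A simplified stand-in for the M/G/1/K
    model: it illustrates how dropping probability falls when the offered
    load is kept below the MAC service capacity."""
    if abs(rho - 1.0) < 1e-12:
        return 1.0 / (K + 1)
    return (1 - rho) * rho ** K / (1 - rho ** (K + 1))

for rho in (0.6, 0.9, 1.1):          # offered load / MAC service rate
    for K in (1, 5, 20):             # MAC-layer buffer size
        p = mm1k_blocking_probability(rho, K)
        print(f"rho={rho:>4}, K={K:>2}: P_block={p:.4f}")
```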
Abstract:
The use of high intensity femtosecond laser sources for inscribing fibre gratings has attracted significant interest. The principal advantage of high-energy pulses is their ability to inscribe gratings in any material type without preprocessing or special core doping. In the field of fibre optical sensing, LPGs written in photonic crystal fibre have a distinct advantage over gratings written in conventional fibre: low temperature sensitivity and thus minimal temperature cross-sensitivity. Previous studies have indicated that LPGs written by a point-by-point inscription scheme using a low repetition rate femtosecond laser exhibit post-fabrication evolution, leading to temporal instabilities at room temperature in the spectral location, strength and birefringence of the attenuation bands. These spectral instabilities are studied here for LPGs in photonic crystal fibres (endlessly single mode microstructured fibre) at moderately high temperatures (100°C to 200°C), and their performance is compared to a fusion-arc fabricated LPG. Initial results suggest that the fusion-arc fabricated LPG demonstrates less spectral instability at a given constant, moderate temperature, similar to the results obtained when inscribed in a standard single mode fibre.
Abstract:
Using unique firm-level data, this paper analyses the role of political connections in the post-entry performance of private start-up companies in China. It documents robust evidence that political affiliation enhances firms' survival and growth prospects. Interestingly, however, politically neutral start-ups enjoy faster productivity improvements conditional on survival. In addition, the benefits of political connections are largely confined to firms associated with local or top-level governments, and they are more pronounced in capital-intensive industries. We conclude that the close association between the state and a segment of the business community is leading to sub-optimal resource allocation in the economy by interfering with the process of market selection.
Abstract:
In this paper, we use plant-level data from two Indian industries, namely, electrical machinery and textiles, to examine the empirical relationship between structural reforms like abandonment of entry restrictions to the product market, competition and firm-level productivity and efficiency. These industries have faced different sets of policies since Independence but both were restricted in the adoption of technology and in the development of optimal scales of production. They also belonged to the first set of industries that benefited from the liberalization process started in the 1980s. Our results suggest that both the industries have improved their efficiency and scales of operation by the turn of the century. However, the process of adjustment seems to have been worked out more fully for electrical machinery. We also find evidence of spatial fragmentation of the market as late as 2000–2001. Gains in labour productivity were much more evident in states that either have a strong history of industrial activity or those that have experienced significant improvements in business environment since 1991.
Abstract:
With careful calculation of signal forwarding weights, relay nodes can work collaboratively to enhance downlink transmission performance by forming a virtual multiple-input multiple-output beamforming system. Although collaborative relay beamforming schemes for a single user have been widely investigated for cellular systems in the previous literature, there are few studies on relay beamforming for multiple users. In this paper, we study collaborative downlink signal transmission with multiple amplify-and-forward relay nodes for multiple users in cellular systems. We propose two new algorithms to determine the beamforming weights, both with the objective of minimizing the power consumption of the relay nodes. In the first algorithm, we aim to guarantee the received signal-to-noise ratio at the users for relay beamforming with orthogonal channels, and we prove that the solution obtained by a semidefinite relaxation technique is optimal. In the second algorithm, we propose an iterative procedure that jointly selects the base station antennas and optimizes the relay beamforming weights to reach the target signal-to-interference-and-noise ratio at the users with nonorthogonal channels. Numerical results validate our theoretical analysis and demonstrate that the proposed optimal schemes can effectively reduce relay power consumption compared with several other beamforming approaches. © 2012 John Wiley & Sons, Ltd.
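A minimal sketch of how a semidefinite relaxation of such a relay power minimisation can be posed (not the paper's exact formulation) is shown below, assuming the convex optimisation package cvxpy; the channel vectors, relay power weighting matrix and SNR targets are hypothetical, and the rank of the returned matrix is checked numerically rather than proved optimal.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n_relays, n_users = 4, 2
sigma2 = 1.0                              # receiver noise power
gamma = np.array([2.0, 1.5])              # target SNR per user

# Hypothetical relay-to-user channel vectors and relay power weighting.
h = [rng.standard_normal(n_relays) + 1j * rng.standard_normal(n_relays)
     for _ in range(n_users)]
D = np.eye(n_relays)

# Relax w w^H to a Hermitian PSD matrix W (semidefinite relaxation).
W = cp.Variable((n_relays, n_relays), hermitian=True)
constraints = [W >> 0]
for k in range(n_users):
    A_k = np.outer(h[k], h[k].conj())     # useful signal power matrix for user k
    constraints.append(cp.real(cp.trace(A_k @ W)) >= gamma[k] * sigma2)

problem = cp.Problem(cp.Minimize(cp.real(cp.trace(D @ W))), constraints)
problem.solve()

eigvals = np.linalg.eigvalsh(W.value)
print("minimum relay power:", problem.value)
print("approximately rank-1 solution?", eigvals[-1] / eigvals.sum() > 0.999)
```

If the solution is (numerically) rank one, the beamforming vector is recovered from the dominant eigenvector of W; otherwise a randomisation step would normally be applied.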
Abstract:
We examine how the parameters of ultra-narrow off-centred filtering and the pulse width jointly affect the performance of a wavelength-paired Nx40Gbit/s DWDM transmission consisting of carrier-suppressed return-to-zero signals with 0.64 bit/s/Hz spectral efficiency (without polarization-division multiplexing).
Abstract:
We investigate the transmission performance of advanced modulation formats in nonlinear regenerative channels based on cascaded phase-sensitive amplifiers. We identify the impact of amplitude and phase noise dynamics along the transmission line and show that, after a cascade of regenerators, densely packed single-ring PSK constellations outperform multi-ring constellations. The results of this study will greatly simplify the design of future nonlinear regenerative channels for ultra-high capacity transmission.
Abstract:
Cognitive Radio has been proposed as a key technology to significantly improve spectrum usage in wireless networks by enabling unlicensed users to access unused resources. We present new algorithms that are needed for the implementation of opportunistic scheduling policies that maximize the throughput utilization of resources by secondary users, under maximum interference constraints imposed by existing primary users. Our approach is based on the Belief Propagation (BP) algorithm, which is advantageous due to its simplicity and potential for distributed implementation. We examine convergence properties and evaluate the performance of the proposed BP algorithms via simulations, and demonstrate that the results compare favorably with a benchmark greedy strategy. © 2013 IEEE.
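A generic sketch of a BP-based scheduler in this spirit (not the paper's algorithms) appears below: max-product message passing selects secondary users to maximise total throughput utility subject to hard pairwise interference conflicts. The utilities and conflict pairs are hypothetical.

```python
import numpy as np

def bp_schedule(weights, conflicts, n_iter=50):
    """Loopy max-product BP for opportunistic secondary-user selection.

    weights   -- per-user throughput utility of granting channel access
    conflicts -- pairs (i, j) that would violate the interference constraint
                 if both users transmit simultaneously
    Returns a binary activation vector (1 = user scheduled).
    """
    n = len(weights)
    neighbours = [[] for _ in range(n)]
    for i, j in conflicts:
        neighbours[i].append(j)
        neighbours[j].append(i)

    # Node potentials: utility w_i for transmitting, 0 for staying silent.
    phi = [np.array([0.0, w]) for w in weights]
    # Messages msg[(i, j)][x_j]: node i's opinion about node j's state.
    msg = {}
    for i, j in conflicts:
        msg[(i, j)] = np.zeros(2)
        msg[(j, i)] = np.zeros(2)

    for _ in range(n_iter):
        new_msg = {}
        for (i, j) in msg:
            incoming = phi[i].copy()
            for k in neighbours[i]:
                if k != j:
                    incoming += msg[(k, i)]
            # Hard conflict: i and j may not both be active (x_i = x_j = 1).
            m = np.array([incoming.max(),   # x_j = 0: x_i unconstrained
                          incoming[0]])     # x_j = 1: x_i forced to 0
            new_msg[(i, j)] = m - m.max()   # normalise for numerical stability
        msg = new_msg

    schedule = []
    for j in range(n):
        belief = phi[j].copy()
        for k in neighbours[j]:
            belief += msg[(k, j)]
        schedule.append(int(belief.argmax()))
    return schedule

# Hypothetical example: 4 secondary users; pairs (0,1) and (1,2) interfere.
print(bp_schedule([1.0, 5.0, 2.0, 1.5], [(0, 1), (1, 2)]))
# -> [0, 1, 0, 1]: the highest-utility user and the conflict-free user transmit
```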
Abstract:
We show, using nonlinearity management, that the optimal performance in high-bit-rate dispersion-managed fiber systems with hybrid amplification is achieved for a specific amplifier spacing that is different from the asymptotically vanishing length corresponding to ideally distributed amplification [Opt. Lett. 15, 1064 (1990)]. In particular, we prove the existence of a nontrivial optimal span length for 40-Gbit/s wavelength-division transmission systems with Raman-erbium-doped fiber amplification. Optimal amplifier lengths are obtained for several dispersion maps based on commonly used transmission fibers. © 2005 Optical Society of America.
Abstract:
With its low-power operation and flexible networking capabilities, IEEE 802.15.4 has been widely regarded as a strong candidate communication technology for wireless sensor networks (WSNs). It is expected that, with an increasing number of deployments of 802.15.4-based WSNs, multiple WSNs could coexist with full or partial overlap in residential or enterprise areas. As WSNs are usually deployed without coordination, communication can suffer significant degradation under the 802.15.4 channel access scheme, which has a large impact on system performance. In this thesis we investigate the effectiveness of 802.15.4 networks supporting WSN applications in various environments, especially when hidden terminals are present due to the uncoordinated coexistence problem. Both analytical models and system-level simulators are developed to analyse the performance of the random access scheme specified by the IEEE 802.15.4 medium access control (MAC) standard for several network scenarios. The first part of the thesis investigates the effectiveness of a single 802.15.4 network supporting WSN applications. A Markov chain based analytical model is applied to model the MAC behaviour of the IEEE 802.15.4 standard, and a discrete event simulator is also developed to analyse the performance and verify the proposed analytical model. It is observed that 802.15.4 networks can sufficiently support most WSN applications with their various functionalities. Following the investigation of a single network, the uncoordinated coexistence problem of multiple 802.15.4 networks deployed with fully or partially overlapping communication ranges is investigated in the next part of the thesis. Both non-sleep and sleep modes are investigated under different channel conditions, using analytical and simulation methods, to obtain a comprehensive performance evaluation. It is found that the uncoordinated coexistence problem can significantly degrade the performance of 802.15.4 networks, which is then unlikely to satisfy the QoS requirements of many WSN applications. The proposed analytical model is validated by simulations and can be used to obtain optimal parameter settings before WSN deployment to eliminate the interference risks.
Abstract:
New sol-gel functionalized poly-ethylene glycol (PEGM)/SiO