900 results for Operational and network efficiency


Relevance: 100.00%

Publisher:

Abstract:

It is desirable that energy performance improvement is not realized at the expense of other network performance parameters. This paper investigates the trade-off between energy efficiency, spectral efficiency and user QoS performance for a multi-cell, multi-user radio access network. Specifically, the energy consumption ratio (ECR) and the spectral efficiency of several common frequency-domain packet schedulers in a cellular E-UTRAN downlink are compared for both the SISO transmission mode and the 2x2 Alamouti Space Frequency Block Code (SFBC) MIMO transmission mode. It is well known that the 2x2 SFBC MIMO transmission mode is more spectrally efficient than the SISO transmission mode; however, the relationship between energy efficiency and spectral efficiency is less settled. It is shown that, for the E-UTRAN downlink with fixed transmission power, spectral efficiency improvement results in energy efficiency improvement. The effect of SFBC MIMO versus SISO on user QoS performance is also studied. © 2011 IEEE.
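The paper's core relationship can be illustrated with a toy calculation, assuming the usual definitions: spectral efficiency as throughput per unit bandwidth, and ECR as energy spent per delivered bit. The power, bandwidth and throughput figures below are hypothetical, not taken from the paper.

```python
# Toy illustration: with fixed transmit power, any gain in spectral
# efficiency lowers the energy consumption ratio (ECR, joules per bit).
# All numbers are illustrative assumptions, not results from the paper.

def spectral_efficiency(throughput_bps: float, bandwidth_hz: float) -> float:
    """Spectral efficiency in bit/s/Hz."""
    return throughput_bps / bandwidth_hz

def energy_consumption_ratio(power_w: float, throughput_bps: float) -> float:
    """ECR in joules per bit: energy spent per delivered bit."""
    return power_w / throughput_bps

POWER_W = 40.0        # fixed downlink transmission power (assumed)
BANDWIDTH_HZ = 10e6   # 10 MHz carrier (assumed)

# Hypothetical cell throughputs for the two transmission modes.
siso_tput = 18e6      # bit/s
sfbc_tput = 25e6      # bit/s (2x2 SFBC MIMO, more spectrally efficient)

for name, tput in [("SISO", siso_tput), ("2x2 SFBC", sfbc_tput)]:
    se = spectral_efficiency(tput, BANDWIDTH_HZ)
    ecr = energy_consumption_ratio(POWER_W, tput)
    print(f"{name}: SE = {se:.2f} bit/s/Hz, ECR = {ecr * 1e9:.2f} nJ/bit")
```

Because the transmit power is held fixed, the mode with the higher spectral efficiency necessarily has the lower ECR, which is the direction of the result the abstract reports.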


This paper studies the impact that a change from a dealer system to a market-maker-supported auction system has on market quality. We study the impact that the introduction of SETSmm on the London Stock Exchange had on firm value, pricing efficiency and liquidity. We find a small SETSmm return premium associated with the announcement that securities are to migrate to the new trading system. Moreover, securities that migrate to SETSmm show improvements in liquidity and pricing efficiency. We find that these changes are related to the return premium.


Academia has followed companies' interest in establishing industrial networks by studying aspects such as social interaction and contractual relationships. But what patterns underlie the emergence of industrial networks, and what support should research provide for practitioners? Firstly, manufacturing appears to be becoming a commodity rather than a unique capability, particularly for low-technology activities in downstream parts of the network, for example in assembly operations. Secondly, the increased tendency to specialize forces other parts of industrial networks to introduce advanced manufacturing technologies for niche markets. Thirdly, the capital market for investments in capacity and the trade in manufacturing as a commodity dominate resource allocation to a greater extent. Fourthly, there will be a continuous move toward more loosely connected entities forming manufacturing networks. More traditional concepts, such as keiretsu and chaebol networks, do not sufficiently support this transition. Research should address these fundamental challenges to prepare for the industrial networks of 2020 and beyond.


This paper explores demand and production management challenges in the food processing industry. The goal is, first, to identify the main production planning constraints and, second, to explore how each of these constraints affects a company's performance in terms of costs and customer service level. A single case study methodology was chosen since it enabled the collection of in-depth data. Findings suggest that product shelf life, carcass utilization and production lead time are the main constraints affecting supply chain efficiency; hence, a single planning approach is not appropriate when different products have different technological and processing characteristics.


In this paper we examine the impact that the new trading system SETSmm had on market quality measures such as firm value, liquidity and pricing efficiency. This system was introduced for mid-cap securities on the London Stock Exchange in 2003. We show that there is a small SETSmm return premium associated with the announcement that securities are to migrate to the new trading system. We find that migration to SETSmm also improves liquidity and pricing efficiency, and that these changes are related to the return premium. We also find that these gains are stronger for firms with high pre-SETSmm liquidity and weaker for firms with low pre-SETSmm liquidity. © 2013 John Wiley & Sons Ltd.


Smart cameras allow video data to be pre-processed on the camera instead of being sent to a remote server for further analysis. A network of smart cameras allows various vision tasks to be processed in a distributed fashion. While cameras may have different tasks, we concentrate on distributed tracking in smart camera networks. This application raises several highly interesting problems. Firstly, how can conflicting goals be satisfied, such as tracking objects across the network while keeping communication overhead low? Secondly, how can cameras in the network self-adapt in response to the behaviour of objects and changes in scenarios, to ensure continued efficient performance? Thirdly, how can cameras organise themselves to improve the overall network's performance and efficiency? This paper presents a simulation environment, called CamSim, that allows distributed self-adaptation and self-organisation algorithms to be tested without setting up a physical smart camera network. The simulation tool is written in Java and hence is highly portable across operating systems. Abstracting away various computer vision and network communication problems enables a focus on implementing and testing new self-adaptation and self-organisation algorithms for the cameras.
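The tension between tracking quality and communication overhead can be sketched with a toy handover rule. This is my own minimal illustration, not CamSim's actual algorithm: each camera scores how well it can see an object, and ownership of the tracking task moves to a better-placed camera only when the gain exceeds a handover cost, which stands in for communication overhead.

```python
# Toy self-organised handover in a smart-camera network (illustrative
# sketch, not CamSim's algorithm). Cameras live on a 1-D line; the
# handover cost discourages chatty ownership changes.

from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    position: float   # field-of-view centre (toy 1-D model)
    fov: float        # half-width of the field of view

    def visibility(self, obj_pos: float) -> float:
        """Score in [0, 1]: 1 at the FOV centre, 0 outside the FOV."""
        return max(0.0, 1.0 - abs(obj_pos - self.position) / self.fov)

def handover(cameras, owner, obj_pos, cost=0.2):
    """Transfer ownership only if another camera sees the object
    sufficiently better than the current owner (hysteresis via cost)."""
    best = max(cameras, key=lambda c: c.visibility(obj_pos))
    if best is not owner and best.visibility(obj_pos) > owner.visibility(obj_pos) + cost:
        return best
    return owner

cams = [Camera("A", 0.0, 5.0), Camera("B", 8.0, 5.0)]
owner = cams[0]
for obj_pos in [0.0, 3.0, 6.0, 9.0]:   # object moving right
    owner = handover(cams, owner, obj_pos)
    print(f"object at {obj_pos}: tracked by camera {owner.name}")
```

The `cost` parameter captures the first conflict the abstract names: with `cost=0`, ownership would flap between cameras near the FOV boundary, maximising tracking quality but also communication.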


The paper presents a new network-flow interpretation of Łukasiewicz's logic based on models with increased effectiveness. The results obtained show that the presented network-flow models can, in principle, work for multivalued logics with more than three states of the variables, i.e., with a finite set of states in the interval from 0 to 1. The described models make it possible to formulate various logical functions. If the results of a given model, contained in the obtained values of the arc flow functions, are used as input data for other models, then other sophisticated logical structures in Łukasiewicz's logic can be interpreted successfully. The obtained models allow Łukasiewicz's logic to be studied with the specific, effective methods of network-flow programming. In particular, the specific peculiarities of, and results pertaining to, the 'traffic capacity of the network arcs' function can be put to use. Based on the introduced network-flow approach, it is possible to interpret other multivalued logics, such as those of E. Post, L. Brauer and Kolmogorov.
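For reference alongside the network-flow interpretation, the standard truth functions of Łukasiewicz's logic over [0, 1] can be written down directly; with values restricted to {0, 0.5, 1} they reduce to his three-valued logic. This is the textbook definition of the connectives, not the paper's flow models.

```python
# Truth functions of Lukasiewicz logic over [0, 1] (textbook
# definitions). Restricting values to {0, 0.5, 1} gives the
# three-valued case the abstract's models generalise beyond.

def neg(a: float) -> float:
    return 1.0 - a

def conj(a: float, b: float) -> float:   # strong conjunction
    return max(0.0, a + b - 1.0)

def disj(a: float, b: float) -> float:   # strong disjunction
    return min(1.0, a + b)

def implies(a: float, b: float) -> float:
    return min(1.0, 1.0 - a + b)

# Three-valued fragment: 0 (false), 0.5 (unknown), 1 (true).
print(implies(0.5, 0.5))   # 1.0 -- unlike in classical logic, U -> U is true
print(conj(0.5, 0.5))      # 0.0
print(disj(0.5, 0.5))      # 1.0
```

The bounded sums and differences in these definitions are what makes a flow interpretation natural: clipping at 0 and 1 behaves like capacity constraints on network arcs.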


Incorporating the Material Balance Principle (MBP) into industrial and agricultural performance measurement systems with pollutant factors has been on the rise in recent years. Many conventional methods of performance measurement have proven incompatible with material flow conditions. This study addresses the issue of eco-efficiency measurement adjusted for pollution, taking into account material flow conditions and the MBP requirements, in order to provide 'real' measures of performance that can serve as guides when making policies. We develop a new approach by integrating a slacks-based measure to enhance the Malmquist-Luenberger index with a material balance condition that reflects the conservation of matter. This model is compared with a similar model, which incorporates the MBP using the trade-off approach, to measure productivity and eco-efficiency trends of power plants. Results reveal similar findings for both models, substantiating the robustness and applicability of the model proposed in this paper.
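The material balance condition the study builds in can be stated very simply: matter entering a production unit must leave it either embodied in outputs or released as pollutants. The sketch below only illustrates that conservation check; the plant, coefficients and quantities are hypothetical, and the actual productivity model in the paper is a DEA-style index, not shown here.

```python
# Illustration of the Material Balance Principle (MBP): material in
# inputs = material in outputs + material in pollutants. All names
# and coefficients below are hypothetical.

def material_balance_ok(inputs, outputs, pollutants, tol=1e-6):
    """Check conservation of matter. Each argument maps an item name
    to (quantity, material content per unit), e.g. tonnes of carbon."""
    material = lambda d: sum(q * c for q, c in d.values())
    return abs(material(inputs) - material(outputs) - material(pollutants)) <= tol

# A hypothetical coal-fired power plant, accounted in tonnes of carbon:
# carbon in the fuel ends up either bound in by-products or emitted.
inputs     = {"coal":       (100.0, 0.70)}   # 100 t coal, 70% carbon
outputs    = {"ash":        (10.0,  0.50)}   # 10 t ash, 50% carbon
pollutants = {"co2_carbon": (65.0,  1.00)}   # 65 t carbon emitted

print(material_balance_ok(inputs, outputs, pollutants))   # True: 70 = 5 + 65
```

A performance model that violated this identity, for example by projecting lower emissions without changing fuel use or by-products, would be exactly the kind of MBP-incompatible measure the abstract criticises.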


The paper focuses on business networks. Building on the decades of research results and the conceptual framework of the IMP (Industrial Marketing and Purchasing) Group on business relationships and business networks, the authors review the basic approaches to the topic and then, using data from the 2009 survey of the Competitiveness research programme, examine the relationship between the network position perceived by company executives and competitiveness. Based on the executives' assessments, they analyse the companies that play a central, influential role (i.e., hold a dominant network position) in their industry network, with particular regard to their business performance and competitiveness characteristics.


Our study focuses on business networks. Building on the decades of research results and the conceptual framework of the IMP (Industrial Marketing and Purchasing) Group on business relationships and business networks, we review the basic approaches to the topic and then, using data from the 2009 survey of the Competitiveness research programme, examine the relationship between the perceived network image and competitiveness. Based on company executives' assessments, we characterise the companies that play a central, influential role (i.e., hold a dominant network position) in their industry network, with particular regard to their business performance and competitiveness characteristics.


The development of 3G (third-generation telecommunication) value-added services brings higher requirements for Quality of Service (QoS). Wideband Code Division Multiple Access (WCDMA) is one of the three 3G standards, and enhancing QoS in the WCDMA Core Network (CN) is becoming increasingly important for users and carriers. This dissertation focuses on enhancing QoS for the WCDMA CN; the purpose is to realize the DiffServ (Differentiated Services) model of QoS for the WCDMA CN. Based on the parallelism characteristic of Network Processors (NPs), NP programming models are classified as Pool of Threads (POTs) and Hyper Task Chaining (HTC). In this study, an integrated programming model that combines the two was designed. This model is highly efficient and flexible, and also solves the problems of sharing conflicts and packet ordering. We used it as the programming model with which to realize DiffServ QoS for the WCDMA CN.
The realization mechanism of the DiffServ model mainly consists of buffer management, packet scheduling and packet classification algorithms based on NPs. First, we proposed an adaptive buffer management algorithm called Packet Adaptive Fair Dropping (PAFD), which takes both fairness and throughput into consideration and has smooth service curves. Then, an improved packet scheduling algorithm called Priority-based Weighted Fair Queuing (PWFQ) was introduced to ensure the fairness of packet scheduling and reduce the queuing time of data packets, while keeping delay and jitter within a small range. Thirdly, a multi-dimensional packet classification algorithm called Classification Based on Network Processors (CBNPs) was designed; it effectively reduces memory accesses and storage space, and has lower time and space complexity. Lastly, an integrated hardware and software system implementing the DiffServ model of QoS for the WCDMA CN was proposed and implemented on the IXP2400 NP. According to the experimental results, the proposed system significantly enhances QoS for the WCDMA CN: it improves response-time consistency, display distortion and sound-image synchronization, and thus increases network efficiency and saves network resources.
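The abstract does not spell out PWFQ's internals; the sketch below is plain textbook weighted fair queuing using virtual finish times, which is the base mechanism a priority-based variant like PWFQ would extend. All flow names and weights are illustrative.

```python
# Textbook weighted fair queuing via virtual finish times (not the
# dissertation's PWFQ). Heavier flows get earlier finish times and
# therefore a proportionally larger share of the link.

import heapq

class WFQScheduler:
    def __init__(self):
        self.finish = {}   # flow -> last assigned virtual finish time
        self.heap = []     # (finish_time, seq, flow, packet)
        self.seq = 0       # tie-breaker for equal finish times

    def enqueue(self, flow: str, weight: float, size: int, packet):
        """Assign a virtual finish time: cost of a packet is its size
        divided by the flow's weight. Simplified: per-flow clocks only;
        full WFQ also tracks a system-wide virtual time."""
        start = self.finish.get(flow, 0.0)
        ftime = start + size / weight
        self.finish[flow] = ftime
        heapq.heappush(self.heap, (ftime, self.seq, flow, packet))
        self.seq += 1

    def dequeue(self):
        """Send the packet with the smallest virtual finish time."""
        _, _, flow, packet = heapq.heappop(self.heap)
        return flow, packet

sched = WFQScheduler()
for i in range(3):
    sched.enqueue("voice", weight=2.0, size=100, packet=f"v{i}")
    sched.enqueue("data",  weight=1.0, size=100, packet=f"d{i}")

order = [sched.dequeue()[0] for _ in range(6)]
print(order)   # voice (double weight) is served first and ahead of data
```

A priority-based extension in the spirit of PWFQ could, for instance, drain a strict-priority queue before consulting the WFQ heap; the abstract gives no further detail, so that step is left out here.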


Since the 1950s the global consumption of natural resources has skyrocketed, both in magnitude and in the range of resources used. Closely coupled with emissions of greenhouse gases, land consumption, pollution of environmental media, and degradation of ecosystems, as well as with economic development, increasing resource use is a key issue to be addressed in order to keep the planet Earth in a safe and just operating space. This requires thinking about absolute reductions in resource use and associated environmental impacts, and, when put in the context of current re-focusing on economic growth at the European level, absolute decoupling, i.e., maintaining economic development while absolutely reducing resource use and associated environmental impacts. Changing behavioural, institutional and organisational structures that lock-in unsustainable resource use is, thus, a formidable challenge as existing world views, social practices, infrastructures, as well as power structures, make initiating change difficult. Hence, policy mixes are needed that will target different drivers in a systematic way. When designing policy mixes for decoupling, the effect of individual instruments on other drivers and on other instruments in a mix should be considered and potential negative effects be mitigated. This requires smart and time-dynamic policy packaging. This Special Issue investigates the following research questions: What is decoupling and how does it relate to resource efficiency and environmental policy? How can we develop and realize policy mixes for decoupling economic development from resource use and associated environmental impacts? And how can we do this in a systemic way, so that all relevant dimensions and linkages—including across economic and social issues, such as production, consumption, transport, growth and wellbeing—are taken into account?
In addressing these questions, the overarching goals of this Special Issue are to: address the challenges related to more sustainable resource-use; contribute to the development of successful policy tools and practices for sustainable development and resource efficiency (particularly through the exploration of socio-economic, scientific, and integrated aspects of sustainable development); and inform policy debates and policy-making. The Special Issue draws on findings from the EU and other countries to offer lessons of international relevance for policy mixes for more sustainable resource-use, with findings of interest to policy makers in central and local government and NGOs, decision makers in business, academics, researchers, and scientists.


Based on an original and comprehensive database of all fiction feature films produced in Mercosur between 2004 and 2012, the paper analyses whether the Mercosur film industry has evolved towards an integrated and culturally more diverse market. It provides a summary of policy opportunities in terms of integration and diversity, emphasizing the limited role played by regional policies. It then shows that although the Mercosur film industry remains rather disintegrated, it is tending to become more integrated and culturally more diverse. From a methodological point of view, the combination of Social Network Analysis and the Stirling Model opens up interesting research avenues for analysing creative industries in terms of their market integration and cultural diversity.
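The Stirling Model mentioned above scores diversity along variety, balance and disparity; a common operationalisation is the Rao-Stirling index, the sum over category pairs of p_i * p_j * d_ij. The sketch below computes it for hypothetical market shares and a deliberately trivial disparity measure; the paper's actual categories and distances are not given in the abstract.

```python
# Rao-Stirling diversity: higher when activity is spread evenly
# (balance) across many (variety) dissimilar (disparity) categories.
# Shares and distances below are hypothetical.

def rao_stirling(proportions, distance):
    """Sum of p_i * p_j * d(i, j) over all ordered pairs i != j."""
    cats = list(proportions)
    return sum(
        proportions[i] * proportions[j] * distance(i, j)
        for i in cats for j in cats if i != j
    )

# Hypothetical shares of films by country of origin in a regional market.
concentrated = {"AR": 0.9, "BR": 0.05, "UY": 0.05}
balanced     = {"AR": 0.4, "BR": 0.4,  "UY": 0.2}

# Toy disparity: treat all country pairs as equally distant; a real
# study would use a substantive cultural-distance measure here.
unit_distance = lambda i, j: 1.0

print(rao_stirling(concentrated, unit_distance))   # 0.185
print(rao_stirling(balanced, unit_distance))       # 0.64: more diverse
```

With unit distances the index collapses to 1 minus the Herfindahl concentration index, which is why the balanced market scores higher.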


Network security monitoring remains a challenge. As global networks scale up in traffic volume and speed, effective attribution of cyber attacks is increasingly difficult. The problem is compounded by a combination of other factors, including the architecture of the Internet, multi-stage attacks and increasing volumes of non-productive traffic. This paper proposes to shift the focus of security monitoring from the source to the target. Simply put, resources devoted to detection and attribution should be redeployed to efficiently monitor for targeting and prevention of attacks. The effort of detection should aim to determine whether a node is under attack and, if so, effectively prevent the attack. This paper contributes by systematically reviewing the structural, operational and legal reasons underlying this argument, and presents empirical evidence to support a shift away from attribution in favour of a target-centric monitoring approach. A carefully designed set of experiments is presented, and a detailed analysis of the results is provided.
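To make the proposed shift concrete, here is a toy illustration of my own (not the paper's system): rather than attributing traffic to sources, the monitor watches each local node's inbound event rate and flags nodes whose rate deviates far enough from their baseline to suggest they are being targeted.

```python
# Toy target-centric monitor (illustrative assumption, not the paper's
# system): flag local nodes whose inbound event count in a window
# exceeds a multiple of their historical baseline.

from collections import Counter

def nodes_under_attack(events, baseline, factor=3.0):
    """events: iterable of target-node names observed in one window.
    baseline: typical per-window count for each node.
    Returns the sorted list of nodes exceeding factor x baseline."""
    counts = Counter(events)
    return sorted(
        node for node, n in counts.items()
        if n > factor * baseline.get(node, 1.0)
    )

baseline = {"web01": 100, "db01": 20}        # typical per-window counts
window = ["web01"] * 120 + ["db01"] * 90     # db01 is being hammered
print(nodes_under_attack(window, baseline))  # ['db01']
```

Note what the detector does not need: any knowledge of where the traffic came from, which is the attribution cost the paper argues is increasingly not worth paying.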


Many-core systems are emerging from the need for more computational power and power efficiency. However, many issues still surround many-core systems. These systems need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computational systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than those used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip-based processors the network might get congested and the cores might work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to 45X speedup compared to a serial fault simulation approach. Many-core systems can draw enormous amounts of power, and if this power is not controlled properly, the system might get damaged. One way to manage power is to set a power budget for the system. But if this power is drawn by just a few of the many cores, those few cores get extremely hot and might get damaged. Due to the increase in power density, multiple thermal sensors are deployed across the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors lead to a situation where thermal sensor values drift from their nominal values.
This necessitates efficient calibration techniques to be applied before the sensor values are used. In addition, cores in modern many-core systems support dynamic voltage and frequency scaling. Thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. In this thesis, a general-purpose software-based auto-calibration approach is also proposed to calibrate thermal sensors across a range of voltages.
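The kind of dynamic load balancing the thesis applies can be sketched generically: workers pull fault-simulation tasks from a shared queue as they become free, so faster or less congested cores naturally take on more work than slower ones. This is a minimal thread-based illustration, not the Single-Chip Cloud Computer code, and squaring a number stands in for simulating one fault.

```python
# Minimal dynamic load balancing via a shared work queue (generic
# illustration, not the SCC implementation). Idle workers pull the
# next task, so load adapts to each worker's actual speed.

import queue
import threading
from collections import defaultdict

def run_dynamic(tasks, n_workers=4):
    work = queue.Queue()
    for t in tasks:
        work.put(t)

    done = defaultdict(list)   # worker id -> results it produced

    def worker(wid):
        while True:
            try:
                task = work.get_nowait()
            except queue.Empty:
                return                      # queue drained: worker exits
            done[wid].append(task * task)   # stand-in for one fault simulation
            work.task_done()

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return done

results = run_dynamic(range(100))
print(sum(len(v) for v in results.values()))   # 100: every task done exactly once
```

The contrast is with static partitioning, which splits the task list up front: there, one slow core stalls the whole run, which is precisely the bottleneck a runtime balancing approach avoids.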