984 results for Commercial distribution


Relevance:

30.00%

Publisher:

Abstract:

Logistics distribution network design is one of the major decision problems arising in contemporary supply chain management. The decision involves many quantitative and qualitative factors that may be conflicting in nature. This paper applies an integrated multiple-criteria decision-making approach to design an optimal distribution network. In the approach, the analytic hierarchy process (AHP) is used first to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. Then, a goal programming (GP) model incorporating system, resource and AHP-priority constraints is formulated to select the best set of warehouses without exceeding the limited available resources. In this paper, two commercial packages are used: Expert Choice for determining the AHP priorities of the warehouses, and LINDO for solving the GP model. © 2007 IEEE.
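The AHP step described above derives warehouse priorities from pairwise comparison matrices. Below is a minimal sketch of that prioritisation step, assuming a hypothetical 3×3 Saaty-scale comparison matrix (not the paper's data) and using the standard principal-eigenvector method with Saaty's consistency check; in the paper's workflow this computation is performed by Expert Choice, and the resulting weights enter the GP model as priority coefficients.

```python
# A minimal sketch of the AHP prioritisation step: derive priority weights for
# candidate warehouses from a pairwise comparison matrix via its principal
# eigenvector. The 3x3 matrix below is hypothetical, not the paper's data.
import numpy as np

# Saaty-scale pairwise comparisons of three candidate warehouses on one criterion.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalised priority weights

# Saaty consistency check: CI = (lambda_max - n) / (n - 1), compared with the
# random index RI (0.58 for n = 3).
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58
print("priorities:", np.round(w, 3), " consistency ratio:", round(CR, 3))
```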

Relevance:

30.00%

Publisher:

Abstract:

Financial institutions are an integral part of any modern economy. In the 1970s and 1980s, Gulf Cooperation Council (GCC) countries made significant progress in financial deepening and in building a modern financial infrastructure. This study aims to evaluate the performance (efficiency) of the banking sector in GCC countries. Because the selected variables include negative data for some banks and positive data for others, and the available evaluation methods cannot handle this case, we developed a Semi-Oriented Radial Model (SORM) to perform the evaluation. Furthermore, since the SORM results alone provide limited information for decision makers (bankers, investors, etc.), we propose a second-stage analysis using the classification and regression (C&R) method, combining the SORM results with other environmental data (financial, economic and political) to derive rules for efficient banks; the results are therefore useful to bankers seeking to improve their banks' performance and to investors seeking to maximize their returns. There are two main approaches to evaluating the performance of Decision Making Units (DMUs), each comprising several methods with different assumptions. The parametric approach is based on econometric regression theory, while the nonparametric approach is based on mathematical linear programming. The nonparametric approach includes two methods, Data Envelopment Analysis (DEA) and Free Disposal Hull (FDH), while the parametric approach includes three: Stochastic Frontier Analysis (SFA), Thick Frontier Analysis (TFA) and Distribution-Free Analysis (DFA). The literature shows that DEA and SFA are the most widely applied methods in the banking sector, with DEA the most popular among researchers. However, both DEA and SFA still face challenges; one of them is how to deal with negative data, since DEA requires all input and output values to be non-negative, whereas in many applications negative outputs can appear, e.g. losses in contrast with profits. Although a few DEA models have been developed to deal with negative data, each has its own limitations; we therefore developed the Semi-Oriented Radial Model (SORM) to handle negativity in DEA. The application results using SORM show that the overall efficiency of GCC banking is relatively high (85.6%). Although the efficiency score fluctuated over the study period (1998-2007) because of the second Gulf War and the international financial crisis, it remained higher than that of counterpart banks in other countries. Banks operating in Saudi Arabia were the most efficient, followed by UAE, Omani and Bahraini banks, while banks operating in Qatar and Kuwait were the least efficient, these two countries being the most affected by the second Gulf War. The results also show no statistically significant relationship between operating style (Islamic or Conventional) and bank efficiency. Even so, Islamic banks appear somewhat more efficient than Conventional banks, with an average efficiency score of 86.33% compared with 85.38%. Furthermore, Islamic banks appear to have been more affected by the political crisis (the second Gulf War), whereas Conventional banks appear to have been more affected by the financial crisis.
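The abstract does not spell out the SORM formulation, so the sketch below shows only the standard input-oriented CCR DEA envelopment model that SORM extends, solved with scipy.optimize.linprog on hypothetical bank data. It assumes non-negative inputs and outputs, which is precisely the limitation SORM was designed to remove.

```python
# A minimal sketch of the standard input-oriented CCR DEA model (envelopment
# form), solved with scipy.optimize.linprog. This is the conventional DEA
# formulation that assumes non-negative data; it is NOT the authors' SORM
# extension. All data values are hypothetical.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 15.0, 30.0],     # inputs:  rows = inputs, cols = banks (DMUs)
              [300.0, 200.0, 450.0]])
Y = np.array([[40.0, 35.0, 50.0]])    # outputs: rows = outputs, cols = banks

def ccr_efficiency(o: int) -> float:
    """Efficiency score of DMU `o` under constant returns to scale."""
    m, n = X.shape          # m inputs, n DMUs
    s = Y.shape[0]          # s outputs
    # Decision vector z = [theta, lambda_1 ... lambda_n]; minimise theta.
    c = np.r_[1.0, np.zeros(n)]
    # Input rows:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Output rows: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

for o in range(X.shape[1]):
    print(f"Bank {o}: efficiency = {ccr_efficiency(o):.3f}")
```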

Relevance:

30.00%

Publisher:

Abstract:

In the contemporary customer-driven supply chain, maximization of customer service plays as important a role as minimization of costs for a company to retain and increase its competitiveness. This article develops a multiple-criteria optimization approach, combining the analytic hierarchy process (AHP) and an integer linear programming (ILP) model, to aid the design of an optimal logistics distribution network. The proposed approach outperforms traditional cost-based optimization techniques because it considers both quantitative and qualitative factors and aims at maximizing the benefits to both the deliverer and the customers. In the approach, the AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to some critical customer-oriented criteria. The results of the AHP prioritization are used as input to the ILP model, whose objective is to select the best warehouses at the lowest possible cost. In this article, two commercial packages are used: Expert Choice and LINDO.
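As a rough illustration of how AHP priorities can feed a warehouse-selection ILP, the sketch below maximises total AHP-derived benefit under a single resource (budget) constraint using the open-source PuLP library. The data, the single-constraint structure and the solver choice are all assumptions for illustration, not the article's actual model (which is solved with LINDO).

```python
# A minimal sketch (not the article's exact model) of selecting warehouses by
# maximising AHP-derived priorities subject to a cost budget, using PuLP.
# Priorities, costs and the budget below are hypothetical.
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, value

warehouses = ["W1", "W2", "W3", "W4"]
priority = {"W1": 0.40, "W2": 0.25, "W3": 0.20, "W4": 0.15}   # AHP weights
cost     = {"W1": 90,   "W2": 60,   "W3": 55,   "W4": 40}     # opening cost
budget   = 150

prob = LpProblem("warehouse_selection", LpMaximize)
x = {w: LpVariable(f"x_{w}", cat=LpBinary) for w in warehouses}

prob += lpSum(priority[w] * x[w] for w in warehouses)          # total AHP benefit
prob += lpSum(cost[w] * x[w] for w in warehouses) <= budget    # resource limit
prob.solve()

selected = [w for w in warehouses if value(x[w]) > 0.5]
print("Selected warehouses:", selected)
```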

Relevance:

30.00%

Publisher:

Abstract:

Current technology permits connecting local networks via high-bandwidth telephone lines. Central coordinator nodes may use Intelligent Networks to manage data flow over dialed data lines, e.g. ISDN, and to establish connections between LANs. This dissertation focuses on cost minimization and on establishing operational policies for query distribution over heterogeneous, geographically distributed databases. Based on our study of query distribution strategies, public network tariff policies, and database interface standards, we propose methods for communication cost estimation, strategies for the reduction of bandwidth allocation, and guidelines for central-to-node communication protocols. Our conclusion is that dialed data lines offer a cost-effective alternative for the implementation of distributed database query systems, and that existing commercial software may be adapted to support query processing in heterogeneous distributed database systems.

Relevance:

30.00%

Publisher:

Abstract:

Date of Acceptance: 02/03/2015

Relevance:

30.00%

Publisher:

Abstract:

Date of Acceptance: 02/03/2015

Relevance:

30.00%

Publisher:

Abstract:

The authors would like to thank the College of Life Sciences of Aberdeen University and Marine Scotland Science, which funded CP's PhD project. Skate tagging experiments were undertaken as part of Scottish Government project SP004. We thank Ian Burrett for help in catching the fish, and the other fishermen and anglers who returned tags. We thank José Manuel Gonzalez-Irusta for extracting and making available the environmental layers used as covariates in the environmental suitability modelling procedure. We also thank Jason Matthiopoulos for insightful suggestions on habitat utilization metrics, as well as Stephen C.F. Palmer and three anonymous reviewers for useful suggestions that improved the clarity and quality of the manuscript.

Relevance:

30.00%

Publisher:

Abstract:

Grant support This study was supported by an award (Ref: WHMSB-AU119) from the Translational Medicine Research Collaboration – a consortium made up of the Universities of Aberdeen, Dundee, Edinburgh and Glasgow, the four associated NHS Health Boards (Grampian, Tayside, Lothian and Greater Glasgow & Clyde), Scottish Enterprise and Wyeth. The funder played no part in the design, execution, analysis or publication of this paper.

Relevance:

30.00%

Publisher:

Abstract:

Acknowledgements. This study was supported by the FP7-PEOPLE-2013-IEF Marie Curie Action SPATFOREST. Tree data from BCI were provided by the Center for Tropical Forest Science of the Smithsonian Tropical Research Institute and the primary granting agencies that have supported the BCI plot tree census. The liana censuses were supported by US National Science Foundation grants DEB-0613666, DEB-0845071 and DEB-1019436 (to SAS). Soil data collection was funded by National Science Foundation grants DEB021104, DEB021115, DEB0212284 and DEB0212818, which supported soils mapping in the BCI plot. We thank Helene Muller-Landau for providing tree-height data for some BCI trees. We also thank all the people who contributed to obtaining the data.

Relevance:

30.00%

Publisher:

Abstract:

The Sahara Desert is the largest source of mineral dust in the world. Emissions of African dust increased sharply in the early 1970s, a change that has been attributed mainly to drought in the Sahara/Sahel region caused by changes in the global distribution of sea surface temperature. The human contribution to land degradation and dust mobilization in this region remains poorly understood, owing to the paucity of data that would allow the identification of long-term trends in desertification. Direct measurements of airborne African dust concentrations only became available in the mid-1960s from a station on Barbados and subsequently from satellite imagery since the late 1970s: they do not cover the onset of commercial agriculture in the Sahel region ~170 years ago. Here we construct a 3,200-year record of dust deposition off northwest Africa by investigating the chemistry and grain-size distribution of terrigenous sediments deposited at a marine site located directly under the West African dust plume. With the help of our dust record and a proxy record for West African precipitation we find that, on the century scale, dust deposition is related to precipitation in tropical West Africa until the seventeenth century. At the beginning of the nineteenth century, a sharp increase in dust deposition parallels the advent of commercial agriculture in the Sahel region. Our findings suggest that human-induced dust emissions from the Sahel region have contributed to the atmospheric dust load for about 200 years.

Relevance:

30.00%

Publisher:

Abstract:

The European CloudSME project, which incorporated 24 European SMEs besides five academic partners, finished its funded phase in March 2016. This presentation will provide a summary of the results of the project and will analyse the challenges and differences of developing "SME Gateways" compared with "Science Gateways". CloudSME started in 2013 with the aim of developing a cloud-based simulation platform for manufacturing and engineering SMEs. The project was built around industry use-cases, five of which were incorporated from the start and seven of which were added as an outcome of an open call in January 2015. CloudSME utilized science-gateway technologies, such as the commercial CloudBroker Platform and the WS-PGRADE/gUSE Gateway Framework, that were developed in the preceding SCI-BUS project. As its most important outcome, the project successfully implemented 12 industry-quality demonstrators that showcase how SMEs in the manufacturing and engineering sector can utilize cloud-based simulation services. Some of these solutions are already market-ready and are currently being rolled out by the software vendor companies; others require further fine-tuning and the implementation of commercial interfaces before being put on the market. The CloudSME use-cases came from a very wide application spectrum. The project implemented, for example, an open marketplace for micro-breweries to optimize their production and distribution processes, an insole design validation service to be used by podiatrists and shoe manufacturers, a generic stock management solution for manufacturing SMEs, and also several "classical" high-performance computing case studies, such as fluid dynamics simulations for model helicopter design and dual-fuel internal combustion engine simulation. As the project generated significant impact and interest in the manufacturing sector, 10 CloudSME stakeholders established a follow-up company, CloudSME UG, for the future commercialization of the results. Besides the success stories, this talk will also highlight the difficulties of transferring the outcomes of an academic research project to real commercial applications. The different mindsets and approaches of academic and industry partners presented a real challenge for the CloudSME project, with some interesting and valuable lessons learnt. The academic way of supporting SMEs did not always work well with the rather different working practices and culture of many participants. Also, the quality of support for operational solutions required by the SMEs is well beyond the typical support services academic institutions are prepared for. Finally, a clear lack of trust in academic solutions compared with commercial solutions was also evident. The talk will highlight some of these challenges, underpinned by the implementation of the CloudSME use-cases.

Relevance:

30.00%

Publisher:

Abstract:

The blast furnace is the main ironmaking production unit in the world which converts iron ore with coke and hot blast into liquid iron, hot metal, which is used for steelmaking. The furnace acts as a counter-current reactor charged with layers of raw material of very different gas permeability. The arrangement of these layers, or burden distribution, is the most important factor influencing the gas flow conditions inside the furnace, which dictate the efficiency of the heat transfer and reduction processes. For proper control the furnace operators should know the overall conditions in the furnace and be able to predict how control actions affect the state of the furnace. However, due to high temperatures and pressure, hostile atmosphere and mechanical wear it is very difficult to measure internal variables. Instead, the operators have to rely extensively on measurements obtained at the boundaries of the furnace and make their decisions on the basis of heuristic rules and results from mathematical models. It is particularly difficult to understand the distribution of the burden materials because of the complex behavior of the particulate materials during charging. The aim of this doctoral thesis is to clarify some aspects of burden distribution and to develop tools that can aid the decision-making process in the control of the burden and gas distribution in the blast furnace. A relatively simple mathematical model was created for simulation of the distribution of the burden material with a bell-less top charging system. The model developed is fast and it can therefore be used by the operators to gain understanding of the formation of layers for different charging programs. The results were verified by findings from charging experiments using a small-scale charging rig at the laboratory. A basic gas flow model was developed which utilized the results of the burden distribution model to estimate the gas permeability of the upper part of the blast furnace. This combined formulation for gas and burden distribution made it possible to implement a search for the best combination of charging parameters to achieve a target gas temperature distribution. As this mathematical task is discontinuous and non-differentiable, a genetic algorithm was applied to solve the optimization problem. It was demonstrated that the method was able to evolve optimal charging programs that fulfilled the target conditions. Even though the burden distribution model provides information about the layer structure, it neglects some effects which influence the results, such as mixed layer formation and coke collapse. A more accurate numerical method for studying particle mechanics, the Discrete Element Method (DEM), was used to study some aspects of the charging process more closely. Model charging programs were simulated using DEM and compared with the results from small-scale experiments. The mixed layer was defined and the voidage of mixed layers was estimated. The mixed layer was found to have about 12% less voidage than layers of the individual burden components. Finally, a model for predicting the extent of coke collapse when heavier pellets are charged over a layer of lighter coke particles was formulated based on slope stability theory, and was used to update the coke layer distribution after charging in the mathematical model. In designing this revision, results from DEM simulations and charging experiments for some charging programs were used. 
The findings from the coke collapse analysis can be used to design charging programs with more stable coke layers.
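As an illustration of the charging-program optimisation described above, the sketch below shows a bare-bones genetic algorithm searching over charging parameters to match a target gas-temperature profile. The surrogate "furnace model", parameter ranges and GA settings are all hypothetical stand-ins for the thesis' burden-distribution and gas-flow models.

```python
# A minimal sketch of a genetic algorithm optimising charging parameters so
# that a (toy) furnace model reproduces a target radial gas-temperature
# profile. All numbers are hypothetical; the real objective would come from
# the combined burden-distribution / gas-flow model.
import random

TARGET = [350.0, 300.0, 250.0, 220.0, 200.0]   # target temperatures, centre -> wall

def furnace_model(params):
    """Toy surrogate: map 5 charging parameters (e.g. chute angles) to temperatures."""
    return [100.0 + 60.0 * p for p in params]

def fitness(params):
    simulated = furnace_model(params)
    return -sum((s - t) ** 2 for s, t in zip(simulated, TARGET))   # higher is better

def evolve(pop_size=40, generations=200, mut_rate=0.2):
    pop = [[random.uniform(0.0, 5.0) for _ in range(5)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 5)
            child = a[:cut] + b[cut:]                  # one-point crossover
            if random.random() < mut_rate:             # Gaussian mutation, clipped
                i = random.randrange(5)
                child[i] = min(5.0, max(0.0, child[i] + random.gauss(0, 0.3)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best charging parameters:", [round(p, 2) for p in best])
```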

Relevance:

30.00%

Publisher:

Abstract:

Secure transmission of bulk data is of interest to many content providers. A commercially viable distribution of content requires technology to prevent unauthorised access. Encryption tools are powerful, but have a performance cost. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Two technical solutions make it possible to perform bulk transmissions while retaining security without too high a performance overhead:

1) Hierarchical encryption. The stronger the encryption, the harder it is to break, but also the more computationally expensive it is. A hierarchical approach to key exchange means that simple and relatively weak encryption and keys are used to encrypt small chunks of data, for example 10 seconds of video. Each chunk has its own key. New keys for this bottom-level encryption are exchanged using a slightly stronger encryption; for example, a whole-video key could govern the exchange of the 10-second chunk keys. At a higher level again, there could be daily or weekly keys securing the exchange of whole-video keys, and at a yet higher level, a subscriber key could govern the exchange of weekly keys. At higher levels the encryption becomes stronger but is used less frequently, so the overall computational cost is minimal. The main observation is that the value of each encrypted item determines the strength of the key used to secure it.

2) Non-symbolic fragmentation with signal diversity. Communications are usually assumed to be sent over a single communications medium, and the data to have been encrypted and/or partitioned in whole-symbol packets. Network and path diversity break up a file or data stream into fragments which are then sent over many different channels, either in the same network or in different networks. For example, a message could be transmitted partly over the phone network and partly via satellite. While TCP/IP does a similar thing in sending different packets over different paths, this is done for load-balancing purposes and is invisible to the end application. Network and path diversity deliberately introduce the same principle as a secure communications mechanism: an eavesdropper would need to intercept not just one transmission path but all paths used. Non-symbolic fragmentation of data is also introduced to further confuse any intercepted stream of data. This involves breaking up data into bit strings which are subsequently disordered prior to transmission. Even if all transmissions were intercepted, the cryptanalyst would still need to determine fragment boundaries and order them correctly.

These two solutions depart from the usual idea of data encryption. Hierarchical encryption is an extension of the combined encryption of systems such as PGP, but with the distinction that the strength of encryption at each level is determined by the "value" of the data being transmitted. Non-symbolic fragmentation suppresses or destroys bit patterns in the transmitted data in what is essentially a bit-level transposition cipher, but with unpredictable, irregularly sized fragments. Both technologies have applications outside the commercial domain and can be used in conjunction with other forms of encryption, being functionally orthogonal.
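A minimal sketch of the hierarchical key-exchange idea (item 1 above), using the Python cryptography package's Fernet primitive at both levels for simplicity; the original scheme envisages progressively stronger ciphers at higher levels, and the two-level video/chunk structure shown here is illustrative only.

```python
# A minimal sketch of hierarchical key exchange: each ~10-second chunk is
# encrypted under its own fresh key, and that key is wrapped under a
# higher-level (per-video) key. Fernet is used at every level here for
# simplicity; names and the two-level structure are illustrative.
from cryptography.fernet import Fernet

# Higher-level key: e.g. a per-video key, itself assumed to have been delivered
# under a weekly or subscriber key further up the hierarchy.
video_key = Fernet.generate_key()
video_cipher = Fernet(video_key)

def send_chunk(chunk: bytes):
    """Encrypt one chunk under its own key and wrap that key under the video key."""
    chunk_key = Fernet.generate_key()
    encrypted_chunk = Fernet(chunk_key).encrypt(chunk)
    wrapped_key = video_cipher.encrypt(chunk_key)      # key exchange at the next level up
    return wrapped_key, encrypted_chunk

def receive_chunk(wrapped_key: bytes, encrypted_chunk: bytes) -> bytes:
    chunk_key = video_cipher.decrypt(wrapped_key)
    return Fernet(chunk_key).decrypt(encrypted_chunk)

wrapped, ct = send_chunk(b"10 seconds of video payload")
assert receive_chunk(wrapped, ct) == b"10 seconds of video payload"
```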

Relevance:

30.00%

Publisher:

Abstract:

Aim: To investigate the effect of implant-abutment angulation and crown material on the stress distribution of central incisors. The finite element method was used to simulate the clinical situation of a maxillary right central incisor restored with two different implant-abutment angulations, 15° and 25°, and two different crown materials (IPS E-Max CAD and zirconia). Methods: Two 3D finite element models were prepared for this research, simulating the two abutment angulations. A commercial engineering CAD/CAM package was used to model the crown, the implant-abutment complex and the bone (cortical and spongy) in 3D. Linear static analysis was performed by applying a 178 N oblique load. The results obtained were compared with earlier experimental results. Results: The implant von Mises stress level changed negligibly with increasing abutment angulation. The abutment with the higher angulation is mechanically weaker and is expected to fail at lower loading than the less angulated one. Similarly, the screw used with the 25° abutment is expected to fail at about one-third of the failure load of a similar screw used with the 15° abutment. Conclusions: Bone (cortical and spongy) is insensitive to the crown material. Increasing the abutment angulation from 15° to 25° increases the stress on cortical bone by about 20% and reduces it by about 12% on spongy bone. Crown fracture resistance is dramatically reduced by increasing the abutment angulation. The zirconia crown performed better than the E-Max one.

Relevance:

30.00%

Publisher:

Abstract:

The market for plant-based dairy-type products is growing as consumers replace bovine milk in their diet, for medical reasons or as a lifestyle choice. A screening of 17 different commercial plant-based milk substitutes based on different cereals, nuts and legumes was performed, including the evaluation of physicochemical and glycaemic properties. Half of the analysed samples had low or no protein content (<0.5%). Only the soya-based samples showed considerably high protein contents, matching the value of cow's milk (3.7%). An in-vitro method was used to predict the glycaemic index; the values ranged from 47 for bovine milk to 64 for an almond-based sample and up to 100 for rice-based samples. Most of the plant-based milk substitutes were highly unstable, with separation rates of up to 54.39%/h. This study demonstrated that the nutritional and physicochemical properties of plant-based milk substitutes depend strongly on the plant source, processing and fortification. Most products showed low nutritional quality. Therefore, consumer awareness is important when plant-based milk substitutes are used as an alternative to cow's milk in the diet.