919 results for Many-to-many-assignment problem
Abstract:
Conventionally, document classification research focuses on improving the learning capabilities of classifiers. Nevertheless, according to our observation, the effectiveness of classification is limited by the suitability of the document representation. Intuitively, the more features that are used in a representation, the more comprehensively documents are represented. However, if a representation contains too many irrelevant features, the classifier suffers not only from the curse of high dimensionality but also from overfitting. To address this problem of the suitability of document representations, we present a classifier-independent approach to measure the effectiveness of document representations. Our approach utilises a labelled document corpus to estimate the distribution of documents in the feature space. By looking at documents in this way, we can clearly identify the contributions made by different features toward document classification. Experiments have been performed to show how this effectiveness is evaluated. Our approach can be used as a tool to assist feature selection, dimensionality reduction and document classification.
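The abstract does not name the estimator, so as an illustration only: a classifier-independent feature-contribution measure of this kind is often computed as per-feature information gain over the labelled corpus. A minimal Python sketch (the corpus and features are invented, and this is not necessarily the authors' measure):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a multiset of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(docs, labels, feature):
    """Drop in label entropy from splitting the corpus on the presence or
    absence of one feature; higher gain = larger contribution to classification."""
    h = entropy(labels)
    for part in ([y for d, y in zip(docs, labels) if feature in d],
                 [y for d, y in zip(docs, labels) if feature not in d]):
        if part:
            h -= len(part) / len(labels) * entropy(part)
    return h

# Toy labelled corpus: each document is represented as its set of terms.
docs = [{"ball", "goal"}, {"ball", "team"}, {"stock", "price"}, {"price", "market"}]
labels = ["sport", "sport", "finance", "finance"]
for f in ("ball", "price", "team"):
    print(f, round(information_gain(docs, labels, f), 3))
```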
Abstract:
Spatial data mining has recently emerged from a number of real applications, such as real-estate marketing, urban planning, weather forecasting, medical image analysis, and road traffic accident analysis. It demands efficient solutions for many new, expensive, and complicated problems. In this paper, we investigate the problem of evaluating the top k distinguished “features” for a “cluster” based on weighted proximity relationships between the cluster and the features. We measure proximity in an average fashion to address possible non-uniform data distribution in a cluster. Combining a standard multi-step paradigm with new lower and upper proximity bounds, we present an efficient algorithm to solve the problem. The algorithm is implemented in several different modes. Our experimental results not only compare these modes but also illustrate the efficiency of the algorithm.
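The abstract leaves the bounds unspecified, so the sketch below only illustrates the generic multi-step (filter-and-refine) paradigm it refers to: candidates are scanned in ascending order of a cheap lower bound, and the expensive exact proximity is computed only while the bound can still beat the current k-th best. Here the "features" are points, average Euclidean distance to the cluster is the exact measure, and distance to the cluster centroid serves as a valid lower bound (by the triangle inequality); all data are invented.

```python
import heapq

def top_k_multistep(features, k, lower_bound, exact_distance):
    """Filter-and-refine top-k (smallest exact distance wins)."""
    best = []  # max-heap of (-distance, feature), size <= k
    for f in sorted(features, key=lower_bound):
        if len(best) == k and lower_bound(f) >= -best[0][0]:
            break  # no remaining candidate can improve the answer
        d = exact_distance(f)  # the expensive step
        if len(best) < k:
            heapq.heappush(best, (-d, f))
        elif d < -best[0][0]:
            heapq.heapreplace(best, (-d, f))
    return sorted((-nd, f) for nd, f in best)

cluster = [(0, 0), (1, 1), (0, 1)]
feats = {"a": (0.5, 0.5), "b": (5, 5), "c": (1, 0)}
cx = sum(p[0] for p in cluster) / len(cluster)
cy = sum(p[1] for p in cluster) / len(cluster)

def lb(name):  # cheap: distance to the centroid, never overestimates
    x, y = feats[name]
    return ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5

def exact(name):  # average distance to every cluster point
    x, y = feats[name]
    return sum(((x - px) ** 2 + (y - py) ** 2) ** 0.5 for px, py in cluster) / len(cluster)

print(top_k_multistep(feats, 2, lb, exact))
```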
Abstract:
Physical distribution plays an important role in contemporary logistics management. Both customer satisfaction and company competitiveness can be enhanced if the distribution problem is solved optimally. The multi-depot vehicle routing problem (MDVRP) is a practical logistics distribution problem consisting of three critical issues: customer assignment, customer routing, and vehicle sequencing. According to the literature, existing solution approaches for the MDVRP are unsatisfactory because unrealistic assumptions were made about the first sub-problem of the MDVRP, the customer assignment problem. To refine the approaches, the focus of this paper is confined to this problem only. This paper formulates the customer assignment problem as a minimax-type integer linear programming model with the objective of minimizing the cycle time of the depots, where setup times are explicitly considered. Since the model is proven to be NP-complete, a genetic algorithm is developed for solving the problem. The efficiency and effectiveness of the genetic algorithm are illustrated by a numerical example.
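The paper's encoding, operators and cycle-time model are not given in the abstract, so the following is only a hedged sketch of a genetic algorithm for a minimax customer-to-depot assignment: each gene assigns one customer to a depot, and fitness is the busiest depot's cycle time (customer processing plus a per-depot setup; all numbers invented).

```python
import random

def ga_assign(cust_time, setup, n_depots, pop=40, gens=200, pmut=0.1):
    """Minimise the maximum depot cycle time over customer assignments."""
    n = len(cust_time)

    def cycle_time(chrom):
        loads = [0.0] * n_depots
        for i, d in enumerate(chrom):
            loads[d] += cust_time[i]
        # a depot pays its setup time only if it serves someone
        return max(load + (setup[d] if load > 0 else 0.0)
                   for d, load in enumerate(loads))

    popn = [[random.randrange(n_depots) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=cycle_time)
        elite = popn[: pop // 2]               # truncation selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)       # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n):                 # per-gene mutation
                if random.random() < pmut:
                    child[i] = random.randrange(n_depots)
            children.append(child)
        popn = elite + children
    best = min(popn, key=cycle_time)
    return best, cycle_time(best)

random.seed(1)
print(ga_assign([4, 2, 7, 3, 5, 6, 1, 4], setup=[1.0, 2.0, 0.5], n_depots=3))
```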
Abstract:
Data Envelopment Analysis (DEA) is one of the most widely used methods for measuring the efficiency and productivity of Decision Making Units (DMUs). DEA for a large dataset with many inputs/outputs would require huge computer resources in terms of memory and CPU time. This paper proposes a back-propagation neural network approach to DEA to address this problem for the very large datasets now emerging in practice. A neural network's requirements for computer memory and CPU time are far less than those of conventional DEA methods, so it can be a useful tool for measuring the efficiency of large datasets. Finally, the back-propagation DEA algorithm is applied to five large datasets and compared with the results obtained by conventional DEA.
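The network architecture is not described in the abstract; the sketch below only illustrates the general scheme under stated assumptions: efficiency scores are computed conventionally for a small sample of DMUs (here faked with a simple weighted output/input ratio standing in for real DEA scores), and a one-hidden-layer back-propagation network then predicts the scores of the remaining DMUs cheaply.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy DMU data: columns are [input1, input2, output].
X = rng.uniform(1, 10, size=(200, 3))
eff = X[:, 2] / (0.6 * X[:, 0] + 0.4 * X[:, 1])  # stand-in for DEA scores
eff = eff / eff.max()                            # efficiency in (0, 1]

def train(X, y, hidden=8, lr=0.05, epochs=3000):
    """One-hidden-layer MLP trained by plain back-propagation (MSE loss)."""
    n, d = X.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, hidden); b2 = 0.0
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)                 # forward pass
        err = h @ W2 + b2 - y                    # prediction error
        gW2 = h.T @ err / n; gb2 = err.mean()    # gradients, output layer
        dh = np.outer(err, W2) * (1 - h ** 2)    # back-propagated signal
        gW1 = X.T @ dh / n; gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2
    return lambda Z: np.tanh(Z @ W1 + b1) @ W2 + b2

Xn = (X - X.mean(axis=0)) / X.std(axis=0)        # normalise features
model = train(Xn[:50], eff[:50])                 # "DEA" solved on 50 DMUs only
print("mean abs error on the 150 unseen DMUs:",
      round(float(np.abs(model(Xn[50:]) - eff[50:]).mean()), 3))
```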
Abstract:
Purpose – This paper sets out to study a production-planning problem for printed circuit board (PCB) assembly. A PCB assembly company may have a number of assembly lines for producing several product types in large volume. Design/methodology/approach – Pure integer linear programming models are formulated for assigning the product types to assembly lines – the line assignment problem – with the objective of minimizing the total production cost. In this approach, the unrealistic assignments suffered by previous researchers are avoided by incorporating several constraints into the model. In this paper, a genetic algorithm is developed to solve the line assignment problem. Findings – The procedure for applying the genetic algorithm to the problem and a numerical example illustrating the models are provided. It is also demonstrated that the algorithm is effective and efficient in dealing with the problem. Originality/value – This paper studies the line assignment problem arising in a PCB manufacturing company in which the production volume is high.
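The paper's models are not reproduced in the abstract; for flavour, a minimal line assignment ILP of the kind described might look as follows (PuLP is used purely for illustration, and the costs, processing hours and the capacity constraint that keeps assignments realistic are all invented):

```python
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary

products, lines = range(4), range(2)
cost = [[3, 5], [4, 2], [6, 4], [5, 5]]  # production cost of product p on line l
hours = [10, 12, 8, 9]                   # processing hours each product needs
capacity = [20, 22]                      # hours available on each line

prob = LpProblem("line_assignment", LpMinimize)
x = LpVariable.dicts("x", (products, lines), cat=LpBinary)

# Objective: total production cost.
prob += lpSum(cost[p][l] * x[p][l] for p in products for l in lines)
# Each product type goes to exactly one assembly line.
for p in products:
    prob += lpSum(x[p][l] for l in lines) == 1
# Capacity constraints rule out unrealistic assignments.
for l in lines:
    prob += lpSum(hours[p] * x[p][l] for p in products) <= capacity[l]

prob.solve()
print([(p, l) for p in products for l in lines if x[p][l].value() == 1])
```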
Abstract:
Purpose – Increasing turnover of frontline staff in call centres is detrimental to the delivery of quality service to customers. This paper aims to present the context for the rapid growth of the business process outsourcing (BPO) sector in India, and to address a critical issue faced by call centre organisations in this sector – high employee turnover. Design/methodology/approach – Following a triangulation approach, two separate empirical investigations were conducted to examine various aspects of high labour turnover rates in the call centre sector in India. Study one examines the research issue via 51 in-depth interviews in as many units. Study two reports results from a questionnaire survey of 204 frontline agents across 11 call centres regarding employee turnover. Findings – This research reveals a range of reasons – from monotonous work, a stressful work environment, adverse working conditions and a lack of career development opportunities to better job opportunities elsewhere – that emerge as the key causes of increasing attrition rates in the Indian call centre industry. Research limitations/implications – The research suggests that several issues need to be handled carefully by the management of call centres in India to overcome the problem of increasing employee turnover, and that this also demands support from the Indian government. Originality/value – The contributions of this study untangle the issues underlying a key problem in the call centre industry, i.e. employee turnover, in the Indian context. Adopting an internal marketing approach, it provides useful information for both academics and practitioners, suggests internal marketing interventions, and indicates avenues for future research to combat the problem of employee turnover.
Abstract:
The aim of the investigation was to study the problem of colonization of shipboard fuel systems and to examine the effect of a number of environmental factors on microbial growth and survival in order to find potential preservative treatments. A variety of microbial species were isolated from samples taken from fuel storage tanks. Bacteria were more numerous than yeasts or fungi, and most microorganisms were found at the fuel/water interface. The salinity, pH and phosphate concentration of some water bottoms were characteristic of sea water; others were brackish, acidic and varied in phosphate content. Microorganisms were cultured under a number of environmental conditions. After prolonged incubation, the inoculum size had no effect on the final biomass of Cladosporium resinae, but the time required to achieve the final mass decreased with increasing spore number. Undecane supported better growth of the fungus than diesel fuel, and of four types of diesel fuel, two allowed more profuse growth. With sea water as the aqueous phase, a number of isolates were inhibited, but the addition of nutrients allowed the development of many of the organisms. Agitation increased the growth of C. resinae on glucose but inhibited it on hydrocarbons. The optimum temperature for growth of C. resinae on surface culture lay between 25°C and 30°C, and growth was evident at 5°C but not at 45°C. In aqueous suspension, 90% of spores were inactivated in around 60 hours at 45°C, and the same proportion of spores of C. resinae and Penicillium corylophilum were destroyed after about 30 seconds at 65°C. The majority of bacteria and all yeasts in a water bottom sample were killed within 10 seconds at this temperature. An increase in the concentration of an organo-boron compound caused more rapid inactivation of C. resinae spores, and raising the temperature from 25°C to 45°C significantly enhanced the potency of the biocide.
Abstract:
Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. C.P.M./PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of C.P.M. to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work presents an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines parts of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, along with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model. This package is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, a set of conclusions is drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.
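The thesis's node types are not detailed here; as an illustration of the simulation-based network evaluation it builds on (in the VERT vein), the sketch below Monte-Carlo-simulates an activity-on-node network with stochastic durations and a daily overhead cost, with every number invented:

```python
import random

# Activity-on-node network, listed in topological order:
# name -> (predecessors, (min, most likely, max) duration in days).
activities = {
    "excavate":   ([], (3, 5, 9)),
    "foundation": (["excavate"], (4, 6, 10)),
    "frame":      (["foundation"], (8, 10, 15)),
    "services":   (["foundation"], (5, 7, 12)),
    "finish":     (["frame", "services"], (4, 5, 8)),
}
OVERHEAD = 120.0  # overhead cost per day of total project duration

def simulate_once():
    """One run: earliest finish times propagated through the network."""
    finish = {}
    for name, (preds, (lo, mode, hi)) in activities.items():
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(lo, hi, mode)
    return max(finish.values())

random.seed(42)
runs = [simulate_once() for _ in range(10_000)]
mean = sum(runs) / len(runs)
print(f"mean duration {mean:.1f} days, mean overhead cost {OVERHEAD * mean:.0f}")
print(f"P(duration > 25 days) = {sum(d > 25 for d in runs) / len(runs):.2%}")
```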
Abstract:
The optical domain has traditionally been reserved for node-to-node transmission, with processing and switching achieved entirely in the electrical domain. However, with the constantly increasing demand for bandwidth and the resultant increase in transmission speeds, there is a very real fear that current electronic technology used for processing will not be able to cope with future demands. Fuelled by this requirement for faster processing speeds, considerable research is currently being carried out into the potential of all-optical processing. One of the fundamental obstacles to realising all-optical processing is the requirement for all-optical buffering; without all-optical buffers it is extremely difficult to resolve situations such as contention and congestion. Many devices have been proposed to solve this problem; however, none of them provides a perfect solution. The subject of this research is to experimentally demonstrate a novel all-optical memory device. Unlike many previously demonstrated optical storage devices, the device under consideration utilises only a single loop mirror and a single SOA as its switch, whilst providing the full regenerative capabilities required for long-term storage. I will explain some of the principles and characteristics of the device, which will then be experimentally demonstrated. The device configuration will then be studied and its suitability for Hybrid Integrated Technology investigated.
Abstract:
The activities of many mammalian membrane proteins, including G-protein coupled receptors, are cholesterol-dependent. Unlike higher eukaryotes, yeast do not make cholesterol; rather, they make a related molecule called ergosterol. As cholesterol and ergosterol are biologically non-equivalent, the potential of yeast as hosts for overproducing mammalian membrane proteins has never been fully realised. To address this problem, we are trying to engineer a novel strain of Saccharomyces cerevisiae in which the cholesterol biosynthetic pathway of mammalian cells has been fully reconstituted. Thus far, we have created a modified strain that makes cholesterol-like sterols and has an increased capacity to make G-protein coupled receptors compared to control yeast.
Abstract:
A nature-inspired decentralised multi-agent algorithm is proposed to solve a problem of distributed task selection in which cities produce and store batches of different mail types. Agents must collect and process the mail batches without a priori knowledge of the available mail at the cities or inter-agent communication. In order to process a different mail type than the previous one, an agent must undergo a change-over during which it remains inactive. We propose a threshold-based algorithm in order to maximise the overall efficiency (the average amount of mail collected). We show that memory, i.e. the possibility for agents to develop preferences for certain cities, not only leads to emergent cooperation between agents, but also to a significant increase in efficiency (above the theoretical upper limit for any memoryless algorithm), and we systematically investigate the influence of the various model parameters. Finally, we demonstrate the flexibility of the algorithm to changes in circumstances, and its excellent scalability.
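The model's equations are not given in the abstract; the sketch below only illustrates a generic response-threshold decision rule of the sort such decentralised algorithms use, with a simple reinforcement term acting as the city-preference memory (every parameter is invented):

```python
import random

class Agent:
    """Threshold-based task selection with a city-preference memory."""

    def __init__(self, mail_types, cities, theta=5.0, memory_gain=0.2):
        self.current_type = random.choice(mail_types)
        self.theta = theta                    # change-over penalty threshold
        self.pref = {c: 1.0 for c in cities}  # memory: learned city preferences
        self.memory_gain = memory_gain

    def choose_city(self, stock):
        """Pick the best city, or stay idle. Collecting a different mail
        type only pays off if the preference-weighted stimulus exceeds
        the threshold, because a change-over forces inactivity."""
        def score(city):
            mail_type, amount = stock[city]
            s = amount * self.pref[city]
            if mail_type != self.current_type:
                s -= self.theta               # penalise switching mail type
            return s

        city = max(stock, key=score)
        if score(city) <= 0:
            return None                       # idle beats a bad switch
        self.pref[city] += self.memory_gain   # reinforce the visited city
        self.current_type = stock[city][0]
        return city

random.seed(3)
cities = ["A", "B", "C"]
agent = Agent(mail_types=["letters", "parcels"], cities=cities)
stock = {"A": ("letters", 4.0), "B": ("parcels", 9.0), "C": ("letters", 2.0)}
print(agent.choose_city(stock))
```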
Abstract:
In many real applications of Data Envelopment Analysis (DEA), the decision makers have to accept a deterioration in some inputs and some outputs, for example because of a limitation on the funds available. This paper proposes a new DEA-based approach to determine the highest possible reduction in the input variables of concern and the lowest possible deterioration in the output variables of concern without reducing the efficiency of any DMU. A numerical example is used to illustrate the problem, and an application in the banking sector with a limitation on IT investment shows the usefulness of the proposed method.
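The proposed model itself is not given in the abstract; for context, the standard input-oriented CCR efficiency program that such DEA approaches build on can be solved per DMU as in this sketch (scipy's linprog for illustration; toy data invented):

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR model for DMU o:
       min theta  s.t.  X @ lam <= theta * x_o,  Y @ lam >= y_o,  lam >= 0.
    Decision vector: [theta, lam_1, ..., lam_n]."""
    m, n = X.shape  # m inputs, n DMUs
    s = Y.shape[0]  # s outputs
    c = np.r_[1.0, np.zeros(n)]                # minimise theta
    A_in = np.hstack([-X[:, [o]], X])          # X lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # -Y lam <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, o]],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.fun

# Toy data: 2 inputs and 1 output per DMU, DMUs as columns.
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
print([round(ccr_efficiency(X, Y, o), 3) for o in range(4)])
```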
Abstract:
Completing projects faster than the normal duration is always a challenge to the management of any project, as it often demands many paradigm shifts. Opportunities from globalization and competition from private-sector and multinational companies force the management of public-sector organizations in the Indian petroleum sector to adopt aggressive strategies to maintain profitability; constructing infrastructure for handling petroleum products is one of them. Moreover, these projects are required to be completed faster than normal schedules in order to remain competitive, achieve a faster return on investment, and give a longer project life. However, using conventional project management tools and techniques, it is impossible to handle the problem of reducing the project duration below the normal period. This study proposes the use of concurrent engineering in managing projects to radically reduce project duration: the phases of the project are accomplished concurrently/simultaneously instead of in series. The complexities that arise in managing such projects are tackled by restructuring the project organization, improving management commitment, strengthening project-planning activities, ensuring project quality, managing project risk objectively and integrating project activities through management information systems. These measures not only ensure completion of projects on a fast track, but also improve project effectiveness in terms of quality, cost effectiveness, team building, etc., and in turn the overall productivity of the project organization.
An improved conflicting evidence combination approach based on a new supporting probability distance
Abstract:
To avoid the counter-intuitive results of the classical Dempster's combination rule when dealing with highly conflicting information, many improved combination methods have been developed by modifying the basic probability assignments (BPAs) of bodies of evidence (BOEs) using some measure of the degree of conflict or uncertainty, such as Jousselme's distance, the pignistic probability distance or the ambiguity measure. However, if BOEs contain non-singleton elements and the differences among their BPAs are larger than 0.5, the current conflict measures have limitations in describing the interrelationship among the conflicting BOEs and may even lead to wrong combination results. In order to solve this problem, a new distance function, called the supporting probability distance, is proposed to characterize the differences among BOEs. The new distance captures how much each focal element is supported by the other focal elements in the BOEs. A new combination rule based on the supporting probability distance is also proposed for combining conflicting evidence: the credibility and the discounting factor of each BOE are generated from the supporting probability distance, and the weighted BOEs are combined directly using Dempster's rule. Analytical results on numerical examples show that the new distance describes the interrelationships among BOEs better, especially for highly conflicting BOEs containing non-singleton elements, and that the proposed combination method has better applicability and effectiveness than existing methods.
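The supporting probability distance is defined in the paper, not in the abstract, so the sketch below shows only the surrounding machinery it plugs into: each BPA is discounted by a credibility weight (Shafer discounting), and the weighted BOEs are then combined with the classical Dempster's rule. The credibility values here are placeholders for the distance-derived ones.

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule for BPAs keyed by frozenset focal elements."""
    combined, k = {}, 0.0
    for (a, va), (b, vb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + va * vb
        else:
            k += va * vb                       # mass assigned to conflict
    if k >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {s: v / (1.0 - k) for s, v in combined.items()}

def discount(m, alpha, frame):
    """Shafer discounting: keep a fraction alpha of each mass and move
    the rest to the whole frame (total ignorance)."""
    out = {s: alpha * v for s, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

frame = frozenset("ABC")
m1 = {frozenset("A"): 0.9, frozenset("B"): 0.1}
m2 = {frozenset("B"): 0.1, frozenset("C"): 0.9}   # highly conflicting with m1
alphas = [0.8, 0.6]  # placeholder credibilities (distance-derived in the paper)
combined = dempster(discount(m1, alphas[0], frame),
                    discount(m2, alphas[1], frame))
for s, v in sorted(combined.items(), key=lambda kv: -kv[1]):
    print(set(s), round(v, 3))
```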
Abstract:
This article reports on an investigation with first-year undergraduate Product Design and Management students within a School of Engineering and Applied Science. At the time of this investigation, the students had studied fundamental engineering science and mathematics for one semester. The students were given an open-ended, ill-formed problem which involved designing a simple bridge to cross a river. They were given a talk on problem solving and a rubric to follow, if they chose to do so. They were not given any formulae or procedures needed in order to resolve the problem. In theory, they possessed the knowledge to ask the right questions in order to make assumptions but, in practice, it turned out they were unable to link their a priori knowledge to resolve this problem. They were able to solve simple beam problems when given closed questions. The results show they were unable to visualize a simple bridge as an augmented beam problem, ask pertinent questions, and hence formulate appropriate assumptions in order to offer resolutions.