889 results for mining equipment technology services


Relevance:

30.00%

Publisher:

Abstract:

Healthcare organisations are increasingly being challenged to look at their operations and find opportunities to improve the quality, efficiency and effectiveness of their supply chain services. In light of this situation, there is an apparent need for healthcare organisations to invest in integration technologies and to achieve the integration of supply chain processes, in order to break up the historical structure characterised by numerous interfaces and the segregation of responsibilities. The aim of this paper is to take an independent look at the healthcare supply chain and identify, at different levels, the core entities, processes, information flows, and system integration challenges which prevent supply chain quality improvements from being realised. Moreover, this paper proposes, from an information systems perspective, a framework for the evaluation of different integration technology approaches, which can be used as a potential guideline tool for assessing integration technology alternatives, in order to add value to a healthcare supply chain management system. Copyright © 2007 Inderscience Enterprises Ltd.
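
The paper's evaluation framework is not specified in the abstract; as a rough sketch of how such a guideline tool might be operationalised, the following weighted-criteria decision matrix scores hypothetical integration technology alternatives. All criteria names, weights and scores below are invented for illustration and are not taken from the paper.

    # Hypothetical weighted-criteria evaluation of integration technology
    # alternatives; criteria, weights and scores are illustrative only.

    CRITERIA = {                       # weight of each criterion (sums to 1.0)
        "interoperability": 0.30,
        "scalability": 0.20,
        "implementation_cost": 0.25,   # scored so that higher = cheaper
        "process_integration": 0.25,
    }

    ALTERNATIVES = {                   # 1-5 scores per criterion
        "point_to_point_interfaces": {"interoperability": 2, "scalability": 1,
                                      "implementation_cost": 4, "process_integration": 2},
        "enterprise_application_integration": {"interoperability": 4, "scalability": 4,
                                               "implementation_cost": 2, "process_integration": 4},
        "web_services_soa": {"interoperability": 5, "scalability": 4,
                             "implementation_cost": 3, "process_integration": 4},
    }

    def weighted_score(scores: dict) -> float:
        """Aggregate per-criterion scores into a single weighted value."""
        return sum(CRITERIA[c] * scores[c] for c in CRITERIA)

    if __name__ == "__main__":
        ranked = sorted(ALTERNATIVES,
                        key=lambda a: weighted_score(ALTERNATIVES[a]), reverse=True)
        for name in ranked:
            print(f"{name}: {weighted_score(ALTERNATIVES[name]):.2f}")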

Relevance:

30.00%

Publisher:

Abstract:

Returnable transport equipment (RTE) such as pallets forms an integral part of the supply chain, and poor management leads to costly losses. Companies often address this matter by outsourcing the management of RTE to logistics service providers (LSPs). LSPs are faced with the task of providing logistical expertise to reduce RTE-related waste, whilst differentiating their own services to remain competitive. In the current challenging economic climate, the role of the LSP in delivering innovative ways to achieve competitive advantage has never been so important. It is reported that applying radio frequency identification (RFID) to RTE enables LSPs such as DHL to gain competitive advantage and offer clients improvements such as loss reduction, process efficiency improvement and effective security. However, the increased visibility and functionality of RFID-enabled RTE requires further investigation with regard to decision-making. The distributed nature of the RTE network favours a decentralised decision-making format. Agents are an effective way to represent objects from the bottom up, capturing their behaviour and enabling localised decision-making. Therefore, an agent-based system is proposed to represent the RTE network and utilise the visibility and data gathered from RFID tags. Two types of agents are developed to represent the trucks and the RTE, with bespoke rules and algorithms to facilitate negotiations. The aim is to create schedules that integrate RTE pick-ups as the trucks return to the depot. The findings assert that:
- agent-based modelling provides an autonomous tool, which is effective in modelling RFID-enabled RTE in a decentralised manner, utilising the real-time data facility;
- the RFID-enabled RTE model developed enables autonomous agent interaction, which leads to a feasible schedule integrating both forward and reverse flows for each RTE batch;
- the RTE agent scheduling algorithm developed promotes the utilisation of RTE by including an automatic return flow for each batch of RTE, whilst considering the fleet costs and utilisation rates;
- the research conducted contributes an agent-based platform, which LSPs can use to assess the most appropriate strategies to implement for RTE network improvement for each of their clients.
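
The abstract does not disclose the agents' actual rules and algorithms, so the sketch below is only a minimal illustration of the negotiation idea: truck agents bid for RTE batches based on the extra distance a pick-up adds to their return trip to the depot. The one-dimensional route, the lowest-detour bidding rule and all names are assumptions.

    # Minimal sketch of decentralised truck/RTE agent negotiation under an
    # assumed lowest-detour bidding rule; purely illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class RTEBatch:
        batch_id: str
        location: float            # position along a 1-D route, for simplicity

    @dataclass
    class TruckAgent:
        truck_id: str
        position: float
        depot: float
        schedule: list = field(default_factory=list)

        def bid(self, batch: RTEBatch) -> float:
            """Bid = extra distance needed to collect the batch en route to the depot."""
            direct = abs(self.depot - self.position)
            via = abs(batch.location - self.position) + abs(self.depot - batch.location)
            return via - direct

    def negotiate(trucks, batches):
        """Award each RTE batch to the truck agent with the lowest-detour bid."""
        for batch in batches:
            winner = min(trucks, key=lambda t: t.bid(batch))
            winner.schedule.append(batch.batch_id)
        return {t.truck_id: t.schedule for t in trucks}

    if __name__ == "__main__":
        trucks = [TruckAgent("T1", position=10.0, depot=0.0),
                  TruckAgent("T2", position=25.0, depot=0.0)]
        batches = [RTEBatch("B1", location=8.0), RTEBatch("B2", location=22.0)]
        print(negotiate(trucks, batches))   # pick-ups folded into return trips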

Relevance:

30.00%

Publisher:

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance:

30.00%

Publisher:

Abstract:

Back in 2003, we published ‘MAX’ randomisation, a process of non-degenerate saturation mutagenesis using exactly 20 codons (one for each amino acid) or any required subset of those 20 codons. ‘MAX’ randomisation saturates codons located in isolated positions within a protein, as might be required in enzyme engineering, or else on one face of an alpha-helix, as in zinc finger engineering. Since that time, we have been asked for an equivalent process that can saturate multiple, contiguous codons in a non-degenerate manner. We have now developed ‘ProxiMAX’ randomisation, which does just that: generating DNA cassettes for saturation mutagenesis without degeneracy or bias. Offering an alternative to trinucleotide phosphoramidite chemistry, ProxiMAX randomisation uses nothing more sophisticated than unmodified oligonucleotides and standard molecular biology reagents. Thus it requires no specialised chemistry, reagents or equipment, and relies simply on a process of saturation cycling comprising ligation, amplification and digestion in each cycle. The process can either encode an unbiased representation of selected amino acids or encode them in pre-defined ratios, and each saturated position can be defined independently of the others. We demonstrate accurate saturation of up to 11 contiguous codons. As such, ProxiMAX randomisation is particularly relevant to antibody engineering.
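
As a purely computational illustration of what non-degenerate saturation achieves, the sketch below models only the codon bookkeeping of saturation cycling (one defined codon appended per cycle, with optional pre-defined ratios); it does not model the ligation, amplification or digestion chemistry, and the codon subset shown is an arbitrary example.

    # Simulated codon bookkeeping for non-degenerate saturation of
    # contiguous codons; wet-lab steps are not modelled.
    import random

    # One defined codon per selected amino acid (no degeneracy, no stop codons)
    MAX_CODONS = {"A": "GCT", "D": "GAT", "F": "TTT", "K": "AAA",
                  "L": "CTG", "S": "AGC", "W": "TGG", "Y": "TAT"}

    def saturate(positions, rng=random):
        """Append one codon per cycle; each position's allowed amino acids
        (and optional ratios) are defined independently of the others."""
        cassette = []
        for allowed, weights in positions:
            aa = rng.choices(allowed, weights=weights, k=1)[0]
            cassette.append(MAX_CODONS[aa])
        return "".join(cassette)

    if __name__ == "__main__":
        # Three contiguous codons: unbiased, biased 2:1:1, and fixed
        positions = [(["A", "D", "K"], None),
                     (["F", "L", "S"], [2, 1, 1]),
                     (["W"], None)]
        library = {saturate(positions) for _ in range(1000)}
        print(len(library), "distinct cassettes, none containing stop codons")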

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we study the management and control of service differentiation and guarantee based on the enhanced distributed coordination function (EDCF) in IEEE 802.11e wireless LANs. Backoff-based priority schemes are the major mechanism for Quality of Service (QoS) provisioning in EDCF. However, control and management of the backoff-based priority scheme are still challenging problems. We have analysed the impact of the backoff and inter-frame space (IFS) parameters of EDCF on saturation throughput and service differentiation. A centralised QoS management and control scheme is proposed, in which the configuration of backoff parameters and admission control are studied. The special role of the access point (AP) and the impact of traffic load are also considered. The backoff parameters are adaptively re-configured to increase the levels of bandwidth guarantee and fairness in sharing bandwidth. The proposed management scheme is evaluated with OPNET. Simulation results show the effectiveness of the analytical-model-based admission control scheme. ©2005 IEEE.
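
The paper's analytical model is not reproduced in the abstract, so the following is only a schematic of the kind of centralised re-configuration it describes: per-access-category backoff parameters (CWmin, CWmax, AIFSN) that are widened for lower-priority traffic when a higher-priority class misses its bandwidth-guarantee target. The adaptation rule, thresholds and values are assumptions.

    # Schematic centralised re-configuration of EDCF backoff parameters;
    # the doubling rule and all values are illustrative stand-ins for the
    # paper's analytical-model-based scheme.
    from dataclasses import dataclass

    @dataclass
    class AccessCategory:
        name: str
        aifsn: int            # arbitration inter-frame space number
        cw_min: int           # minimum contention window
        cw_max: int           # maximum contention window
        target_share: float   # desired fraction of saturation throughput
        measured_share: float = 0.0

    def reconfigure(categories, tolerance=0.05):
        """Double CWmin of lower-priority classes while any higher-priority
        class falls short of its bandwidth-guarantee target."""
        for i, ac in enumerate(categories):        # sorted high -> low priority
            if ac.measured_share < ac.target_share - tolerance:
                for lower in categories[i + 1:]:
                    lower.cw_min = min(lower.cw_min * 2, lower.cw_max)

    voice = AccessCategory("voice", aifsn=2, cw_min=7, cw_max=15,
                           target_share=0.4, measured_share=0.31)
    data = AccessCategory("data", aifsn=7, cw_min=31, cw_max=1023,
                          target_share=0.2)
    reconfigure([voice, data])
    print(data.cw_min)   # widened to 62, deferring more to voice traffic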

Relevance:

30.00%

Publisher:

Abstract:

Fierce competition within the third-party logistics (3PL) market has developed as providers compete to win customers and enhance their competitive advantage through cost reduction plans and service differentiation. 3PL providers are expected to develop advanced technological and logistical service applications that can support cost reduction while increasing service innovation. To enhance competitiveness, this paper proposes the implementation of radio-frequency identification (RFID) enabled returnable transport equipment (RTE) in combination with the consolidation of network assets and cross-docking. RFID-enabled RTE can significantly improve network visibility of all assets through continuous real-time data updates. A four-level cyclic model to aid 3PL providers in achieving competitive advantage has been developed. The focus is to reduce assets, increase asset utilisation, reduce RTE cycle time and introduce real-time data into the 3PL network. Furthermore, this paper highlights the need for further research from the 3PL perspective. Copyright © 2013 Inderscience Enterprises Ltd.
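
The four-level cyclic model itself is not detailed in the abstract; the sketch below merely illustrates the kind of real-time KPI that RFID-enabled RTE makes available to a 3PL network, deriving per-asset cycle times from tag-read events. The event format and example values are assumptions.

    # Deriving RTE cycle times from assumed RFID tag-read events
    # (tag_id, location, timestamp); purely illustrative.
    from datetime import datetime

    reads = [
        ("RTE-001", "depot",    datetime(2013, 5, 1, 8, 0)),
        ("RTE-001", "customer", datetime(2013, 5, 2, 9, 30)),
        ("RTE-001", "depot",    datetime(2013, 5, 4, 17, 0)),
    ]

    def cycle_times(events):
        """Time between each asset leaving the depot and returning to it."""
        times, out_since = {}, {}
        for tag, loc, ts in sorted(events, key=lambda e: e[2]):
            if loc == "depot" and tag in out_since:
                times.setdefault(tag, []).append(ts - out_since.pop(tag))
            elif loc != "depot":
                out_since.setdefault(tag, ts)
        return times

    print(cycle_times(reads))
    # -> {'RTE-001': [datetime.timedelta(days=2, seconds=27000)]}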

Relevance:

30.00%

Publisher:

Abstract:

The management and sharing of complex data, information and knowledge is a fundamental and growing concern in the water and other industries, for a variety of reasons. For example, the risks and uncertainties associated with climate and other changes require knowledge to prepare for a range of future scenarios and potential extreme events. Formal ways in which knowledge can be established and managed can help deliver efficiencies in acquisition, structuring and filtering, so as to provide only the essential aspects of the knowledge really needed. Ontologies are a key technology for this knowledge management. The construction of ontologies is a considerable overhead on any knowledge management programme, and hence current computer science research is investigating the automatic generation of ontologies from documents using text mining and natural language techniques. As an example of this, results from the application of the Text2Onto tool to stakeholder documents for a project on sustainable water cycle management in new developments are presented. It is concluded that by adopting ontological representations sooner, rather than later, in an analytical process, decision makers will be able to make better use of highly knowledgeable systems containing automated services to ensure that sustainability considerations are included. © 2010 The authors.
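
Text2Onto itself is a Java framework, so the sketch below does not use its API; it only approximates one of its simplest concept-relevance measures, relative term frequency, to rank candidate ontology concepts extracted from stakeholder documents. The tokenisation and stopword list are deliberately crude.

    # Crude approximation of concept extraction by relative term frequency;
    # not the Text2Onto implementation.
    import re
    from collections import Counter

    STOPWORDS = {"the", "of", "and", "a", "in", "to", "for", "is", "on", "with"}

    def candidate_concepts(documents, top_n=10):
        """Return the most frequent non-stopword terms as concept candidates,
        weighted by their relative frequency in the corpus."""
        counts = Counter()
        for doc in documents:
            for token in re.findall(r"[a-z]+", doc.lower()):
                if token not in STOPWORDS and len(token) > 2:
                    counts[token] += 1
        total = sum(counts.values())
        return [(term, n / total) for term, n in counts.most_common(top_n)]

    stakeholder_docs = [
        "Sustainable water cycle management in new developments requires "
        "knowledge of rainwater harvesting and greywater reuse.",
    ]
    print(candidate_concepts(stakeholder_docs, top_n=5))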

Relevance:

30.00%

Publisher:

Abstract:

The rapid growth of emerging markets’ multinational companies (MNCs) is a recent phenomenon and, as such, the nature and structure of their key management processes, functions, and roles need further examination. While an abundance of low-cost labor is often the starting point of competitive advantage for many emerging markets’ MNCs, it is the optimum configuration of people, processes, and technology that defines how they leverage their intangible resources. Based on case studies of four Indian IT services MNCs, involving 51 in-depth interviews with business and human resource (HR) leaders at the corporate and subsidiary levels, we identify five key HR roles: strategic business partner, guardian of culture, builder of global workforce and capabilities, champion of processes, and facilitator of employee development. The analysis also highlights that the HR function in Indian IT services MNCs faces several challenges in consolidating the early gains of internationalization, such as a lack of decentralized decision making, the need to develop a global mind-set, localization of the workforce, and the development of a global leadership pipeline. Based on our exploratory findings, we propose a framework for future research outlining the global HR roles pursued by emerging IT services MNCs, the factors influencing them, and the challenges facing their HR function.

Relevance:

30.00%

Publisher:

Abstract:

As we settle into a new year, this second issue of Contact Lens and Anterior Eye allows us to reflect on how new research in this field impacts our understanding and, more importantly, how we use this evidence base to enhance our day-to-day practice, to educate the next generation of students and to construct the research studies that deepen our knowledge still further.

The end of 2014 saw the publication of the UK government's Research Excellence Framework (REF), which ranks universities in terms of their outputs (which include their papers, publications and research income), environment (infrastructure and staff support) and, for the first time, impact (defined as “any effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia” [8]). The REF is a process of expert review, carried out in 36 subject-based units of assessment, of which our field is typically submitted to the Allied Health, Dentistry, Nursing and Pharmacy panel. Universities that offer optometry did very well, with Cardiff, Manchester and Aston in the top 10% of the 94 universities that submitted to this panel (grade point average ranked order). While the format of the next exercise (probably in 2020) to allocate the more than £2 billion of UK government research funds is yet to be determined, it is already rumoured that impact will contribute an even larger proportion of the weighting. Hence it is even more important to reflect on the impact of our research.

In this issue, Elisseef and colleagues [5] examine the intriguing potential of modifying a lens surface to allow it to bind to known wetting agents (in this case hyaluronic acid) to enhance water retention. Such a technique has the capacity to reduce friction between the lens surface and the eyelids/ocular surface, presumably leading to higher comfort and less reason for patients to discontinue lens wear. Several papers in this issue report on the validity of new high-precision, fast-scanning imaging and quantification equipment, utilising techniques such as Scheimpflug imaging, partial coherence interferometry, aberrometry and video, allowing detailed assessment of anterior chamber biometry, corneal topography, corneal biomechanics, peripheral refraction, ocular aberrations and lens fit. The challenge is how to use this advanced instrumentation, which is becoming increasingly available, to create real impact. Many challenges in contact lenses and the anterior eye still prevail in 2015, such as:

- While contact lens and refractive surgery complications are relatively rare, they are still too often devastating to the individual and their quality of life (such as the impact and prognosis of patients with Acanthamoeba keratitis reported by Jhanji and colleagues in this issue [7]). How can we detect those patients who are going to be affected, and what modifications do we need to make to contact lenses and patient management to prevent this occurring?

- Drop-out from contact lenses still occurs at a rapid rate, and symptoms of dry eye seem to be the leading cause driving this discontinuation of wear [1] and [2]. What design, coating, material and lubricant release mechanism will make a step change in end-of-day comfort in particular?

- Presbyopia is a major challenge to hassle-free quality vision and is one of the first signs of ageing noticed by many people. As an emmetrope approaching presbyopia, I have a vested interest in new medical devices that will give me high-quality vision at all distances when my arms won’t stretch any further. Perhaps a new definition of presbyopia could be when you start to orientate your smartphone in the landscape direction to gain the small increase in print size needed to read! Effective accommodating intraocular lenses that truly mimic the pre-presbyopic crystalline lens are still a way off [3], and hence simultaneous images achieved through contact lenses, intraocular lenses or refractive surgery still have a secure future. However, splitting light reaching the retina and requiring the brain to suppress blurred images will always be a compromise on contrast sensitivity and is liable to cause dysphotopsia; so how will new designs account for differences in a patient's task demands and own optical aberrations to allow focused patient selection, optimising satisfaction?

- Drug delivery from contact lenses offers much in terms of compliance and quality of life for patients with chronic ocular conditions such as glaucoma, dry eye and, perhaps in the future, dry age-related macular degeneration; but scientific proof-of-concept publications (see ElShaer et al. [6]) have not yet led to commercial products. Part of this is presumably the regulatory complexity of combining a medical device (the contact lens) and a pharmaceutical agent. Will 2015 be the year when this innovation finally becomes a reality for patients, bringing them an enhanced quality of life through their eye care practitioners and allowing researchers to further validate the use of pharmaceutical contact lenses and propose enhancements as the technology matures?

- Last, but by no means least, is the field of myopia control, the topic of the first day of the BCLA's Conference in Liverpool, June 6–9th 2015. The epidemic of myopia is a blight, particularly in Asia, with significant concerns over sight-threatening pathology resulting from the elongated eye. This is a field where real impact is already being realised through new soft contact lens optics, orthokeratology and low-dose pharmaceuticals [4], but we still need to be able to better predict which technique will work best for an individual and to develop new techniques to retard myopia progression in those who don’t respond to current treatments, without increasing their risk of complications or the treatment impacting their quality of life.

So what will your New Year's resolution be to make 2015 a year of real impact, whether by advancing science or applying the findings published in journals such as Contact Lens and Anterior Eye to make a real difference to your patients’ lives?

Relevance:

30.00%

Publisher:

Abstract:

A technology for the classification of electronic documents, based on the theory of perturbation of pseudoinverse matrices, is proposed.
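
The abstract gives no algorithmic detail, so the following is only a generic sketch of the family of methods it names: a least-squares document classifier whose weights come from the Moore-Penrose pseudoinverse, recomputed when the document-term matrix is perturbed by new documents (where perturbation theory would support cheaper incremental updates). The toy data are invented.

    # Generic pseudoinverse-based least-squares document classifier;
    # not the paper's specific algorithm.
    import numpy as np

    # Toy document-term matrix X (documents x terms) and one-hot labels Y
    X = np.array([[2, 0, 1],     # class 0
                  [1, 0, 2],     # class 0
                  [0, 3, 0],     # class 1
                  [0, 2, 1]],    # class 1
                 dtype=float)
    Y = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)

    W = np.linalg.pinv(X) @ Y        # least-squares weights via pseudoinverse

    def classify(doc_vector):
        """Assign the class with the largest linear score."""
        return int(np.argmax(doc_vector @ W))

    print(classify(np.array([0.0, 2.0, 0.5])))   # -> 1 (resembles class-1 docs)

    # Perturbation: appending a document changes X; here the pseudoinverse
    # is simply recomputed, whereas perturbation theory supports cheaper
    # incremental updates.
    X2 = np.vstack([X, [3.0, 0.0, 0.0]])
    Y2 = np.vstack([Y, [1.0, 0.0]])
    W = np.linalg.pinv(X2) @ Y2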

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the role of entrepreneurs' general and specific human capital in the performance of UK new technology-based firms (NTBFs), using a resource-based approach to entrepreneurship theory. The effect of entrepreneurial human capital on the performance of NTBFs is investigated using data derived from a survey of 412 firms operating in both the high-tech manufacturing and services sectors. Consistent with resource-based theory, it is found that specific human capital matters more for the performance of NTBFs than general human capital. More specifically, individual entrepreneurs or entrepreneurial teams with high levels of formal business education, or with commercial, managerial or same-sector experience, are found to have created better-performing NTBFs. Finally, it is found that the performance of an NTBF can improve through the combination of heterogeneous but complementary skills, including, for example, technical education combined with commercial experience, or technical and commercial managerial experience. © 2010 Springer Science+Business Media, LLC.

Relevance:

30.00%

Publisher:

Abstract:

The paper introduces a method for discovering dependencies during human-machine interaction. It is based on the analysis of numerical data sets in knowledge-poor environments. The data-driven procedures are independent and interact on a competitive principle; the research focuses on seven of them. The application domain is number theory.

Relevance:

30.00%

Publisher:

Abstract:

Content creation and presentation are key activities in a multimedia digital library (MDL). The proper design and intelligent implementation of these services provide a stable base for overall MDL functionality. This paper presents the framework and the implementation of these services in the latest version of the “Virtual Encyclopaedia of Bulgarian Iconography” multimedia digital library. For the semantic description of the iconographical objects, a tree-based annotation template is implemented, providing options for autocompletion, reuse of values, bilingual data entry, and automated media watermarking, resizing and conversion. The paper describes in detail the algorithm for the automated appearance of dependent values for different characteristics of an iconographical object; an algorithm for avoiding duplicate image objects is also included. A service through which new objects appear automatically in a collection once they have been entered is included as an important part of content presentation. The paper also presents the overall service-based architecture of the library, covering its main service panels, repositories and their relationships. The presented vision is based on long-term observation of users’ preferences, cognitive goals, and needs, aiming to find an optimal functionality solution for end users.
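
The paper's duplicate-avoidance algorithm is not described in the abstract; one common approach to the same problem is perceptual hashing, sketched below with an 8x8 average hash built on the Pillow library. The function names and the exact-match rejection policy are assumptions.

    # Illustrative duplicate-image rejection via an 8x8 average perceptual
    # hash; not the paper's algorithm. Requires Pillow (pip install Pillow).
    from PIL import Image

    def average_hash(path, size=8):
        """Downscale to size x size greyscale; bit i is 1 if pixel i > mean."""
        pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
        mean = sum(pixels) / len(pixels)
        return sum(1 << i for i, p in enumerate(pixels) if p > mean)

    existing_hashes = set()

    def add_image(path):
        """Add an image to the collection unless a visually identical one exists."""
        h = average_hash(path)
        if h in existing_hashes:
            return False           # duplicate: do not add to the collection
        existing_hashes.add(h)
        return True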

Relevance:

30.00%

Publisher:

Abstract:

Thermal effects in uncontrolled factory environments are often the largest source of uncertainty in large-volume dimensional metrology. As the standard temperature for metrology of 20°C cannot be achieved practically or economically in many manufacturing facilities, the characterisation and modelling of temperature offer a solution for improving the uncertainty of dimensional measurement and quantifying thermal variability in large assemblies. Technologies that currently exist for temperature measurement in the range 0-50°C are presented, alongside a discussion of their usefulness for monitoring temperatures in a manufacturing context. Particular aspects of production where the technology could play a role are highlighted, as well as practical considerations for deployment. Contact sensors such as platinum resistance thermometers come closest to the required accuracy, calculated to be ∼0.02°C under the most challenging measurement conditions. Non-contact solutions would be most practical in the light-controlled factory (LCF), and semi-invasive techniques appear least useful, but all of the technologies can play some role during the initial development of thermal variability models.
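
As a worked illustration of why thermometry at the ~0.02°C level matters, the sketch below applies the standard thermal expansion correction to the 20°C reference temperature and propagates the thermometer uncertainty into length uncertainty. The part material, size and temperatures are example values, not data from the paper.

    # Thermal expansion correction to the 20 degC reference temperature and
    # first-order uncertainty propagation; example values only.

    ALPHA_ALU = 23e-6        # linear expansion coefficient of aluminium, 1/degC

    def correct_to_20c(measured_len_m, part_temp_c, alpha=ALPHA_ALU):
        """Remove thermal expansion: L20 = L / (1 + alpha * (T - 20))."""
        return measured_len_m / (1.0 + alpha * (part_temp_c - 20.0))

    def length_uncertainty_from_temp(length_m, u_temp_c, alpha=ALPHA_ALU):
        """First-order propagation: u_L = alpha * L * u_T."""
        return alpha * length_m * u_temp_c

    L = correct_to_20c(5.000460, part_temp_c=24.0)        # 5 m part, 4 degC warm
    print(f"corrected length: {L:.6f} m")                 # ~5.000000 m
    u = length_uncertainty_from_temp(5.0, 0.02) * 1e6
    print(f"u_L at u_T = 0.02 degC: {u:.1f} micrometres")
    # -> ~2.3 micrometres on a 5 m part, from temperature alone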

Relevance:

30.00%

Publisher:

Abstract:

Big data comes in various ways, types, shapes, forms and sizes. Indeed, almost all areas of science, technology, medicine, public health, economics, business, linguistics and social science are bombarded by ever-increasing flows of data begging to be analyzed efficiently and effectively. In this paper, we propose a rough idea of a possible taxonomy of big data, along with some of the most commonly used tools for handling each particular category of bigness. The dimensionality p of the input space and the sample size n are usually the main ingredients in the characterization of data bigness. The specific statistical machine learning technique used to handle a particular big data set will depend on which category it falls into within the bigness taxonomy: large-p, small-n data sets, for instance, require a different set of tools from the large-n, small-p variety. Among other tools, we discuss Preprocessing, Standardization, Imputation, Projection, Regularization, Penalization, Compression, Reduction, Selection, Kernelization, Hybridization, Parallelization, Aggregation, Randomization, Replication and Sequentialization. It is important to emphasize right away that the so-called no-free-lunch theorem applies here, in the sense that there is no universally superior method that outperforms all other methods on all categories of bigness. It is also important to stress that simplicity, in the sense of Ockham's razor and its non-plurality principle of parsimony, tends to reign supreme when it comes to massive data. We conclude with a comparison of the predictive performance of some of the most commonly used methods on a few data sets.
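
As an illustration of the taxonomy's intent, the sketch below dispatches a data set to a family of tools according to its n-versus-p category. Both the thresholds and the suggested tool lists are assumptions chosen for illustration, not prescriptions from the paper.

    # Illustrative dispatcher over the n-versus-p bigness taxonomy;
    # thresholds and tool suggestions are assumed.

    def bigness_category(n, p, big=10_000):
        """Classify a data set by sample size n and dimensionality p."""
        if p > n:
            return "large p, small n"
        if n >= big and p < big:
            return "large n, small p"
        return "large n, large p"

    SUGGESTED_TOOLS = {
        "large p, small n": ["regularization/penalization (e.g. lasso)",
                             "selection", "projection"],
        "large n, small p": ["subsampling/aggregation", "parallelization",
                             "sequentialization (online learning)"],
        "large n, large p": ["compression", "kernelization + randomization",
                             "hybridization"],
    }

    for n, p in [(120, 20_000), (1_000_000, 50), (1_000_000, 100_000)]:
        cat = bigness_category(n, p)
        print(f"n={n}, p={p} -> {cat}: {SUGGESTED_TOOLS[cat]}")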