946 results for Application techniques
Abstract:
With the growth of the multinational corporation (MNC) has come the need to understand how parent companies transfer knowledge to, and manage the operations of, their subsidiaries. This is of particular interest to manufacturing companies transferring their operations overseas. Japanese companies in particular have been pioneering in this regard, with techniques such as the Toyota Production System (TPS) for transferring the ethos of Japanese manufacturing and maintaining quality and control in overseas subsidiaries. A great deal has been written about the process of transferring Japanese manufacturing techniques, but much less is understood about how the subsidiaries themselves, which are required to make use of such techniques, actually acquire and incorporate them into their operations. The research on which this paper is based therefore examines how, from the perspective of the subsidiary, knowledge of manufacturing techniques is transferred from the parent company. There is clearly a need to take a practice-based view to understand how local managers and operatives incorporate knowledge about manufacturing techniques into their working practices. In-depth qualitative research was therefore conducted in the subsidiary of a Japanese multinational, Denso Corporation, covering three main manufacturing initiatives (or philosophies), namely ‘TPS’, ‘TPM’ and ‘TS’. The case data were derived from 52 in-depth interviews with project members, moderate participant observation, and documentation. The aim of this paper is to present the preliminary findings from the case analyses. The research contributes to our understanding of knowledge transfer, in particular the circumstances under which a subsidiary chooses between adapting and replicating knowledge from its parent. This understanding relates to transfer across different flows and levels of the organisational hierarchy, how the whole process is managed, and how modification takes place.
Abstract:
This thesis presents an experimental investigation into two novel techniques which can be incorporated into current optical systems. These techniques have the capability to improve transmission performance and the recovery of the transmitted signal at the receiver. The experimental objectives are described and the results for each technique are presented in two sections. The first experimental section concerns work on ultra-long Raman fibre lasers (ULRFLs). These fibre lasers have become an important research topic in recent years due to the significant improvement they give over lumped Raman amplification and their potential use in the development of systems with large bandwidths and very low losses. The experiments involved the use of ASK and DPSK modulation formats over a distance of 240km, and DPSK over a distance of 320km. These results are compared to the current state-of-the-art and against other types of ultra-long transmission amplification techniques. The second technique investigated involves asymmetrical, or offset, filtering. This technique is important because it deals with the strong filtering regimes that are part of optical systems and networks in modern high-speed communications. It improves the received signal by offsetting the central frequency of a filter after the output of a Delay Line Interferometer (DLI), which induces a significant improvement in BER and/or Q-values at the receiver and therefore an increase in signal quality. The experimental results are then evaluated against the objectives of the experimental work, and potential future work is discussed.
Abstract:
Genomics, proteomics and metabolomics are three areas that are routinely applied throughout the drug-development process as well as after a product enters the market. This review discusses all three 'omics, reporting on the key applications, techniques, recent advances and expectations of each. Genomics, mainly through the use of novel and next-generation sequencing techniques, has advanced areas of drug discovery and development through the comparative assessment of normal and diseased-state tissues, transcription and/or expression profiling, side-effect profiling, pharmacogenomics and the identification of biomarkers. Proteomics, through techniques including isotope-coded affinity tags, stable isotopic labeling by amino acids in cell culture, isobaric tags for relative and absolute quantification, multidimensional protein identification technology, activity-based probes, protein/peptide arrays, phage displays and two-hybrid systems, is utilized in multiple areas throughout the drug-development pipeline, including target and lead identification, compound optimization, the clinical trials process and after-market analysis. Metabolomics, although the most recent and least developed of the three 'omics considered in this review, provides a significant contribution to drug development through systems biology approaches. Already implemented to some degree in the drug-discovery industry, and used in applications spanning target identification through to toxicological analysis, metabolic network understanding is essential in generating future discoveries.
Abstract:
Desktop user interface design originates from the fact that users are stationary and can devote all of their visual resource to the application with which they are interacting. In contrast, users of mobile and wearable devices are typically in motion whilst using their device, which means that they cannot devote all, or any, of their visual resource to interaction with the mobile application -- it must remain with the primary task, often for safety reasons. Additionally, such devices have limited screen real estate, and traditional input and output capabilities are generally restricted. Consequently, if we are to develop effective applications for use on mobile or wearable technology, we must embrace a paradigm shift with respect to the interaction techniques we employ for communication with such devices. This paper discusses why it is necessary to embrace such a paradigm shift and presents two novel multimodal interaction techniques which are effective alternatives to traditional, visual-centric interface designs on mobile devices, as empirical examples of the potential to achieve this shift.
Abstract:
Computational performance increasingly depends on parallelism, and many systems rely on heterogeneous resources such as GPUs and FPGAs to accelerate computationally intensive applications. However, implementations for such heterogeneous systems are often hand-crafted and optimised for one computation scenario, and it can be challenging to maintain high performance when application parameters change. In this paper, we demonstrate that machine learning can help to dynamically choose parameters for task scheduling and load-balancing based on changing characteristics of the incoming workload. We use a financial option pricing application as a case study. We propose a simulation of processing financial tasks on a heterogeneous system with GPUs and FPGAs, and show how dynamic, on-line optimisations could improve such a system. We compare on-line and batch processing algorithms, and also consider cases with no dynamic optimisations.
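The kind of on-line optimisation described above can be sketched as a greedy scheduler that refines its per-device speed estimates from observed runtimes. The two-device model, the exponential moving average, and all numbers below are illustrative assumptions for a minimal sketch, not the paper's implementation.

```python
def simulate(tasks, speeds, learn=True):
    """Greedy scheduler for a two-device (e.g. GPU/FPGA) system.

    Each task is a work amount; `speeds` maps device -> true throughput in
    work units per second (hidden from the scheduler). The scheduler keeps
    running speed estimates and, when `learn` is True, refines them from
    observed runtimes -- a minimal stand-in for on-line optimisation.
    Returns the makespan (time when the last device finishes).
    """
    est = {d: 1.0 for d in speeds}          # initial speed estimates (naive)
    busy_until = {d: 0.0 for d in speeds}   # per-device accumulated finish time
    for work in tasks:
        # pick the device with the earliest *estimated* finish time
        dev = min(speeds, key=lambda d: busy_until[d] + work / est[d])
        runtime = work / speeds[dev]        # true runtime on that device
        busy_until[dev] += runtime
        if learn:
            # exponential moving average of the observed speed (work/runtime)
            est[dev] = 0.7 * est[dev] + 0.3 * (work / runtime)
    return max(busy_until.values())
```

With a workload of uniform tasks and devices of unequal speed, the learning scheduler steers more work to the faster device and achieves a shorter makespan than the same heuristic with fixed (wrong) speed estimates.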
Abstract:
Dynamic asset rating (DAR) is one of a number of techniques that could be used to facilitate low-carbon electricity network operation. Previous work has looked at this technique from an asset perspective. This paper instead takes a network perspective, proposing a dynamic network rating (DNR) approach. The models available for use with DAR are discussed and compared using measured load and weather data from a trial network area within Milton Keynes in the central area of the U.K. This paper then uses the most appropriate model to investigate, through a network case study, the potential gains in dynamic rating compared to static rating for the different network assets - transformers, overhead lines, and cables. This will inform the network operator of the potential DNR gains on an 11-kV network with all assets present and highlight the limiting assets within each season.
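As a rough illustration of the asset-level calculation behind DAR, the sketch below computes a steady-state dynamic rating factor for a transformer from ambient temperature, loosely following an IEC 60076-7 style hot-spot relationship. The limit, rated rise, and exponent values are illustrative defaults, not taken from the paper or the trial data.

```python
def dynamic_rating_factor(theta_amb, theta_hs_limit=98.0,
                          dtheta_hs_rated=78.0, exponent=1.6):
    """Permissible per-unit load K for a transformer at ambient theta_amb (degC).

    Solves theta_amb + dtheta_hs_rated * K**exponent <= theta_hs_limit for K,
    a steady-state simplification of the hot-spot model: hot-spot rise is
    assumed proportional to K**exponent. Defaults (98 degC limit, 78 K rated
    rise, so K = 1.0 at a 20 degC rated ambient) are illustrative only.
    """
    headroom = theta_hs_limit - theta_amb
    if headroom <= 0:
        return 0.0  # no thermal headroom: the asset cannot carry load
    return (headroom / dtheta_hs_rated) ** (1.0 / exponent)
```

At the assumed 20 degC rated ambient the factor is 1.0 (the static rating); colder weather yields a dynamic gain, e.g. roughly a 12% uplift at 5 degC under these assumptions, which is the kind of seasonal headroom a DNR study aggregates across transformers, lines, and cables.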
Abstract:
This paper examines the application of commercial, non-invasive electroencephalography (EEG)-based brain-computer interfaces (BCIs) with serious games. Two different EEG-based BCI devices were used to fully control the same serious game. The first device (NeuroSky MindSet) uses only a single dry electrode and requires no calibration. The second device (Emotiv EPOC) uses 14 wet sensors and requires additional training of a classifier. User testing was performed on both devices with sixty-two participants, measuring the player experience as well as key aspects of serious games, primarily learnability, satisfaction, performance and effort. Recorded feedback indicates that the current state of BCIs can be used in the future as alternative game interfaces after familiarisation and, in some cases, calibration. Comparative analysis showed significant differences between the two devices. The first device provides more satisfaction to the players, whereas the second device is more effective in terms of adaptation and interaction with the serious game.
Abstract:
Many of the current optical transmission techniques were developed for linear communication channels and are constrained by the fibre nonlinearity. This paper discusses the potential for radically different approaches to signal transmission and processing based on using inherently nonlinear techniques.
Abstract:
ACM Computing Classification System (1998): I.7, I.7.5.
Abstract:
The presentation of cultural heritage is a difficult, comprehensive and constantly updated topic. Researchers often focus on the different techniques for digitizing cultural heritage artifacts. This work focuses instead on the overall shape and structure of a future multimedia application whose specificity is determined by its topic: the Odrysian kingdom. Below, a concept is presented for the structure and content of the information available for individual kings of the Odryssae dynasty. Special attention is paid to the presentation of preserved artifacts associated with the reigns of specific rulers. The main concept of the multimedia application dedicated to the Odrysian kingdom is that it be used in university teaching programmes related to cultural heritage and the history of antiquity. The designers' aim is that it can also be easily modified for use in museums.
Abstract:
One of the greatest concerns related to the popularity of GPS-enabled devices and applications is the increasing availability of the personal location information generated by them and shared with application and service providers. Moreover, people tend to have regular routines and to be characterized by a set of “significant places”, making it possible to identify a user from his/her mobility data. In this paper we present a series of techniques for identifying individuals from their GPS movements. More specifically, we study the uniqueness of GPS information for three popular datasets, and we provide a detailed analysis of the discriminatory power of speed, direction and distance of travel. Most importantly, we present a simple yet effective technique for the identification of users from location information not included in the original dataset used for training, raising important privacy concerns for the management of location datasets.
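A minimal sketch of this style of identification, assuming per-user profiles built from mean speed, heading, and step distance and nearest-profile matching (the paper's actual features and matching method may differ):

```python
import math

def features(trace):
    """Derive (mean speed, mean heading, mean step distance) from a GPS trace.

    `trace` is a list of (t_seconds, lat, lon) tuples with at least two fixes;
    distances use a flat-earth approximation, adequate for short urban
    movements (an assumption of this sketch).
    """
    speeds, headings, dists = [], [], []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(trace, trace[1:]):
        dx = (lo1 - lo0) * 111_320 * math.cos(math.radians(la0))  # metres east
        dy = (la1 - la0) * 110_540                                # metres north
        d = math.hypot(dx, dy)
        dists.append(d)
        speeds.append(d / max(t1 - t0, 1e-9))
        headings.append(math.atan2(dy, dx))
    n = len(dists)
    return (sum(speeds) / n, sum(headings) / n, sum(dists) / n)

def identify(trace, profiles):
    """Return the user whose stored feature profile is closest to `trace`."""
    f = features(trace)
    return min(profiles,
               key=lambda u: sum((a - b) ** 2 for a, b in zip(f, profiles[u])))
```

Even this crude three-number profile separates, say, a fast commuter from a slow pedestrian, which illustrates why speed, direction, and distance of travel carry discriminatory power.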
Abstract:
In this paper, we discuss recent advances in digital signal processing techniques for compensation of the laser phase noise and fiber nonlinearity impairments in coherent optical orthogonal frequency division multiplexing (CO-OFDM) transmission. For laser phase noise compensation, we focus on quasi-pilot-aided (QPA) and decision-directed-free blind (DDF-blind) phase noise compensation techniques. For fiber nonlinearity compensation, we discuss in detail the principle and performance of the phase-conjugated pilots (PCP) scheme.
Abstract:
Premium intraocular lenses (IOLs), such as toric IOLs, multifocal IOLs (MIOLs) and accommodating IOLs (AIOLs), can provide better refractive and visual outcomes than standard monofocal designs, leading to greater levels of post-operative spectacle independence. The principal theme of this thesis is the development of new assessment techniques that can help to improve future premium IOL design. IOLs designed to correct astigmatism form the focus of the first part of the thesis. A novel toric IOL design was devised to decrease the effect of toric rotation on patient visual acuity, but was found to have neither a beneficial nor a detrimental impact on visual acuity retention. IOL tilt, like rotation, may curtail visual performance; however, current IOL tilt measurement techniques require specialist equipment not readily available in most ophthalmological clinics. Thus a new method that applied Pythagoras's theorem to digital images of IOL optic symmetricality in order to calculate tilt was proposed, and shown to be both accurate and highly repeatable. A literature review revealed little information on the relationship between IOL tilt, decentration and rotation, so this was examined. A poor correlation between these factors was found, indicating that they occur independently of each other. Next, presbyopia-correcting IOLs were investigated. The light distribution of different MIOLs and an AIOL was assessed using perimetry, to establish whether this could be used to inform optimal IOL design. Anticipated differences in threshold sensitivity between IOLs were not, however, found; thus perimetry was concluded to be ineffective in mapping the retinal projection of blur. The observed difference between subjective and objective measures of accommodation, arising from the influence of pseudoaccommodative factors, was explored next to establish how much additional objective power would be required to restore the eye's focus with AIOLs.
Blur tolerance was found to be the key contributor to the ocular depth of focus, with an approximate dioptric influence of 0.60D. Our understanding of MIOLs may be limited by the need for subjective defocus curves, which are lengthy and do not permit important additional measures to be undertaken. The use of aberrometry to provide faster objective defocus curves was examined. Although subjective and objective measures related well, the peaks of the MIOL defocus curve profile were not evident with objective prediction of acuity, indicating a need for further refinement of visual quality metrics based on ocular aberrations. The experiments detailed in the thesis evaluate methods to improve visual performance with toric IOLs. They also investigate new techniques to allow more rapid post-operative assessment of premium IOLs, which could allow greater insights to be obtained into several aspects of visual quality, in order to optimise future IOL design and ultimately enhance patient satisfaction.
Abstract:
Potent, selective peptidomimetic inhibitors of tissue transglutaminase (TG2) were developed through a combination of protein-ligand docking and molecular dynamics techniques. Derivatives of these inhibitors were made with the aim of specific TG2 targeting to the intra- and extracellular space. A cell-permeable fluorescently labeled derivative enabled detection of in situ cellular TG2 activity in human umbilical cord endothelial cells and TG2-transduced NIH3T3 cells, which could be enhanced by treatment of cells with ionomycin. Reaction of TG2 with this fluorescent inhibitor in NIH3T3 cells resulted in loss of binding of TG2 to cell surface syndecan-4 and inhibition of translocation of the enzyme into the extracellular matrix, with a parallel reduction in fibronectin deposition. In human umbilical cord endothelial cells, this same fluorescent inhibitor also demonstrated a reduction in fibronectin deposition, cell motility, and cord formation in Matrigel. Use of the same inhibitor in a mouse model of hypertensive nephrosclerosis showed over a 40% reduction in collagen deposition.
Abstract:
Technology changes rapidly over the years, continuously providing more computing options and making economic and other transactions easier. However, the introduction of new technology “pushes” old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use, and is a bivariate function of the quantities sold and the probability that a specific quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions so as to compute obsolete computer quantities. In order to provide robust results for the forecasted quantities, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend model, (v) Level model, (vi) AutoRegressive Moving Average (ARMA), and (vii) Exponential Smoothing, were applied, selecting for each country the model that provided the best results in terms of minimum error indices (Mean Absolute Error and Mean Square Error) for the in-sample estimation. As new technology does not diffuse in all regions of the world at the same speed, due to different socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers is considered obsolete, is not adequately modeled in the literature. The time horizon for the forecasted quantities is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, due to decreasing computer lifespans and increasing sales.
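As an illustration of the in-sample model-selection step, the sketch below fits one of the seven candidates (the Logistic curve) to a cumulative sales series by a coarse grid search minimising Mean Absolute Error. The grids and parameterisation are illustrative assumptions; a real study would fit all seven models per country and keep the best.

```python
import math

def logistic(t, m, p, q):
    """Cumulative adoption of a logistic diffusion curve with saturation m."""
    return m / (1.0 + math.exp(-p * (t - q)))

def mae(actual, fitted):
    """Mean Absolute Error between two equal-length series."""
    return sum(abs(a - f) for a, f in zip(actual, fitted)) / len(actual)

def fit_logistic(series):
    """Coarse grid search for (m, p, q) minimising in-sample MAE.

    Grid choices (saturation as multiples of the observed maximum, a few
    growth rates, inflection points within the sample) are illustrative.
    Returns (best_mae, (m, p, q)).
    """
    best = None
    for m in (max(series) * s for s in (1.0, 1.2, 1.5, 2.0)):
        for p in (0.2, 0.4, 0.6, 0.8, 1.0):
            for q in range(len(series)):
                fitted = [logistic(t, m, p, q) for t in range(len(series))]
                err = mae(series, fitted)
                if best is None or err < best[0]:
                    best = (err, (m, p, q))
    return best
```

The same in-sample MAE (or MSE) computed for each fitted model is what decides, per country, which of the seven candidates is carried forward to forecast obsolete quantities over 2014-2030.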