Abstract:
Data envelopment analysis (DEA) is defined on observed units by finding the distance of each unit to the frontier of the estimated production possibility set (PPS). Convexity is one of the underlying assumptions of the PPS. This paper shows some of the difficulties of using standard DEA models in the presence of input ratios and/or output ratios. It defines a new convexity assumption for data that include a ratio variable, and then proposes a series of modified DEA models capable of rectifying this problem.
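As a hedged illustration of the frontier idea behind DEA (not the paper's modified ratio-aware models): with a single input and a single output, the CCR-DEA efficiency of a unit reduces to its output/input ratio divided by the best observed ratio. The unit names and numbers below are hypothetical.

```python
def dea_efficiency(units):
    """units: dict of name -> (input, output); returns name -> efficiency in (0, 1]."""
    # Output/input productivity ratio of each unit.
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    # Units on the frontier attain the best observed ratio.
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

units = {"A": (2.0, 4.0), "B": (4.0, 6.0), "C": (5.0, 5.0)}
eff = dea_efficiency(units)
# Unit A attains the best ratio (2.0), so its efficiency is 1.0;
# B and C are scored relative to that frontier unit.
```

With multiple inputs and outputs the efficiency score instead comes from a linear program per unit; this single-ratio case is only meant to make the "distance to the frontier" notion concrete.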
Abstract:
Practising engineers frequently seek to understand what effect various manufacturing strategies will have on the performance of their production facilities. In this situation a computer model can help to provide insight and form predictions about future manufacturing system performance. Various types of modelling methods exist, and each yields models with distinct characteristics. This paper presents a review of popular modelling techniques and, based on the results of a structured experimental study, summarises their capabilities to support the evaluation of manufacturing strategies.
Abstract:
The aim of this research was to exploit non-ionic surfactant technology for the delivery and administration of vaccine antigens via the oral route and to gain a better understanding of vaccine trafficking. Using a newly developed method for the manufacture of non-ionic surfactant vesicles (niosomes and bilosomes), lower process temperatures were adopted, reducing antigen exposure to potentially damaging conditions. Vesicles prepared by this method offered high protection against enzymatic degradation, with only ~10% antigen loss measured when antigen-loaded vesicles were exposed to enzyme digestion. Interestingly, when formulated using this new production method, the addition of bile salt to the vesicles offered no advantage in terms of stability under simulated gastro-intestinal conditions. Regarding their ability to deliver antigen to its target site, the results demonstrated that incorporation of antigen within vesicles enhanced delivery and targeting of the antigen to the Peyer's patches, with niosomes and bilosomes again offering similar efficiency. Delivery to both the Peyer's patches and the mesenteric lymphatics was shown to be dose-dependent at lower concentrations, with saturation kinetics applying at higher concentrations. This demonstrates that, in the formulation of vaccine delivery systems, the lipid/antigen dose ratio is not only a key factor in production cost but equally a key factor in the kinetics of delivery and targeting of a vaccine system. © 2013 Controlled Release Society.
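The dose dependence described above (near-linear uptake at low doses, saturation at high doses) is the shape of a Michaelis-Menten-type curve. The sketch below is purely illustrative; the parameters vmax and km are hypothetical, not values from the study.

```python
def uptake(dose, vmax=100.0, km=50.0):
    """Delivered antigen as a saturating function of administered dose.

    For dose << km the response is roughly linear (slope ~ vmax/km);
    for dose >> km it plateaus near vmax.
    """
    return vmax * dose / (km + dose)

low = uptake(1.0)      # near-linear regime: ~ (vmax/km) * dose
high = uptake(5000.0)  # saturation regime: approaches vmax
```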
Abstract:
Legislation: Regulation 6/2002 on Community designs art.3(3)(e); Directive 98/71 on the legal protection of designs art.7(1). Cases: Dyson Ltd v Vax Ltd [2010] EWHC 1923 (Pat); [2011] Bus. L.R. 232 (Ch D (Patents Ct)); Lego Juris A/S v Office for Harmonisation in the Internal Market (Trade Marks and Designs) (OHIM) (C-48/09 P) Unreported September 14, 2010 (ECJ). In Lego, the Court of Justice of the European Union denied registration of an exclusively functional shape mark despite the availability of other shapes capable of fulfilling the same function, and in Dyson v Vax Mr Justice Arnold established that a design cannot be registered for a purely functional shape even though another shape could fulfil the same required function.
Abstract:
An integrated production–recycling system is investigated, in which a constant demand can be satisfied by production and recycling. Used items may be bought back and then recycled; products that are not recycled are disposed of. Two types of models are analyzed. The first model examines and minimizes the EOQ-related cost. The second generalizes the first by additionally introducing linear waste-disposal, recycling, production and buyback costs. This basic model was examined by the authors in a previous paper; the main result is that a pure strategy (either production or recycling) is optimal. This paper extends the model to the case of quality consideration: the quality of the bought-back products is taken into account. In the former model we assumed that all returned items are serviceable. This raises the question: who should control the quality of the returned items? If the suppliers examine the quality of the reusable products, then the buyback rate is strictly smaller than one, α < 1. If the user does so, then not all returned items are recyclable, i.e. the use rate is smaller than one, δ < 1. Which of the two control systems is more cost-advantageous in this case?
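For the "EOQ-related cost" of the first model, the classical economic order quantity formulas can be sketched as follows. The demand rate D, setup cost K and holding cost h below are illustrative numbers, not the paper's parameters, and the sketch ignores the recycling/buyback extensions.

```python
import math

def eoq(D, K, h):
    """Classical economic order quantity Q* = sqrt(2DK/h)."""
    return math.sqrt(2.0 * D * K / h)

def eoq_cost(D, K, h):
    """Minimum setup-plus-holding cost per unit time, sqrt(2DKh)."""
    return math.sqrt(2.0 * D * K * h)

# Hypothetical parameters: 1000 units/year demand, 50 per setup, 2 per unit-year held.
q = eoq(D=1000.0, K=50.0, h=2.0)
c = eoq_cost(D=1000.0, K=50.0, h=2.0)
```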
Abstract:
Historically, abdominal complaints have generally been dealt with palliatively, and underlying causes were seldom given consideration. However, the identification in 1982 (Warren & Marshall, 1983) of the bacterial agent Helicobacter pylori (hereafter H. pylori) as a potential link to gastrointestinal complaints such as gastric and duodenal ulcers, Crohn's disease, and some forms of gastric cancer has given rise to concern. In 1994, the National Institutes of Health recommended that patients with complaints of dyspepsia be studied for the occurrence of H. pylori. This study examines the occurrence of H. pylori in patients who complain of dyspepsia, using a relatively non-invasive screening technique that can be performed in an office setting. Findings were considered significant if p ≤ .05. The study indicated that 49% of patients with complaints of dyspepsia were positive for H. pylori infection, with p = .000.
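A significance result for an observed proportion of this kind is typically obtained from a one-sample test. The sketch below shows a standard two-sided z-test for a proportion; the sample size n = 100 and null proportion p0 = 0.25 are illustrative assumptions, not values reported by the study.

```python
import math

def proportion_z_test(successes, n, p0):
    """Two-sided z-test of H0: true proportion == p0 (normal approximation)."""
    p_hat = successes / n
    se = math.sqrt(p0 * (1.0 - p0) / n)          # standard error under H0
    z = (p_hat - p0) / se
    # Two-sided p-value from the standard normal CDF (erf-based, stdlib only).
    p_value = 2.0 * (1.0 - 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0))))
    return z, p_value

# Hypothetical: 49 of 100 patients positive, tested against a 25% baseline.
z, p = proportion_z_test(successes=49, n=100, p0=0.25)
```

A p-value this small would be reported as p = .000 by software that rounds to three decimals, which is how such output is conventionally read as p < .001.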
Abstract:
In recent decades, work on infrared sensor applications has advanced considerably worldwide. A difficulty remains, however: objects are not always clear enough, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing, non-destructive testing, and related technologies. This thesis addresses infrared image enhancement techniques in two respects: the processing of a single infrared image in the hybrid space-frequency domain, and the fusion of infrared and visible images using the nonsubsampled Contourlet transform (NSCT). Image fusion can be regarded as a continuation of single infrared image enhancement, in that it combines infrared and visible images into a single image that represents and enhances all the useful information and features of the source images, since no single image can contain all the relevant or available information, owing to the restrictions of any single imaging sensor. We review the development of infrared image enhancement techniques, then focus on single infrared image enhancement, proposing a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method, which achieves higher image quality and improves human visual perception. Infrared and visible image fusion techniques are built on accurate registration of the source images acquired by the different sensors.
The SURF-RANSAC algorithm is applied for registration throughout this research, yielding very accurately registered images and greater benefits for the fusion processing. For infrared and visible image fusion, a series of advanced and effective approaches is proposed. A standard multi-channel NSCT-based fusion method is presented as a reference for the fusion approaches proposed subsequently. A joint fusion approach involving the Adaptive-Gaussian NSCT and the wavelet transform (WT) is proposed, leading to fusion results that are better than those obtained with general non-adaptive methods. An NSCT-based fusion approach employing compressed sensing (CS) and total variation (TV) on sparsely sampled coefficients, and accurately reconstructing the fused coefficients, is proposed; it achieves much better fusion results through pre-enhancement of the infrared image and by reducing the redundant information in the fusion coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients during the fusion process, leading to better results more quickly and efficiently.
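To make the coefficient-level fusion idea concrete, here is a toy sketch of the generic rule often used in transform-domain (NSCT/wavelet) fusion: detail coefficients from the two sources are merged by keeping the larger-magnitude coefficient at each position. This is a generic illustration only, not the thesis's Adaptive-Gaussian or CS-based schemes, and the coefficient values are made up.

```python
def fuse_max_abs(coeffs_a, coeffs_b):
    """Pointwise max-absolute-value selection of two coefficient lists."""
    return [a if abs(a) >= abs(b) else b for a, b in zip(coeffs_a, coeffs_b)]

ir_detail  = [0.9, -0.1,  0.3, -0.7]  # hypothetical infrared detail band
vis_detail = [0.2,  0.6, -0.4,  0.1]  # hypothetical visible detail band
fused = fuse_max_abs(ir_detail, vis_detail)  # -> [0.9, 0.6, -0.4, -0.7]
```

In a full pipeline, each source image is decomposed, the per-band coefficients are fused by a rule such as this one, and the fused image is obtained by the inverse transform.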
Abstract:
Master final project submitted to the faculty of the Historic Preservation Program of the School of Architecture, Planning, and Preservation of the University of Maryland, College Park, in partial fulfillment of the requirements for the degree of Master of Historic Preservation, 2013.
Abstract:
Simultaneous Localization and Mapping (SLAM) is a procedure used to determine the location of a mobile vehicle in an unknown environment while constructing a map of that environment at the same time. Mobile platforms that make use of SLAM algorithms have industrial applications in autonomous maintenance, such as the inspection of flaws and defects in oil pipelines and storage tanks. A typical SLAM implementation consists of four main components, namely, experimental setup (data gathering), vehicle pose estimation, feature extraction, and filtering. Feature extraction is the process of recognizing significant features of the unknown environment such as corners, edges, walls, and interior features. In this work, an original feature extraction algorithm specific to distance measurements obtained from SONAR sensor data is presented. This algorithm has been constructed by combining the SONAR Salient Feature Extraction Algorithm and the Triangulation Hough Based Fusion with point-in-polygon detection. The reconstructed maps obtained through simulations and experimental data with the fusion algorithm are compared to the maps obtained with existing feature extraction algorithms. Based on the results obtained, it is suggested that the proposed algorithm can be employed as an option for data obtained from SONAR sensors in environments where other forms of sensing are not viable. The algorithm fusion for feature extraction requires the vehicle pose estimation as an input, which is obtained from a vehicle pose estimation model. For the vehicle pose estimation, the author uses sensor integration to estimate the pose of the mobile vehicle. Different combinations of these sensors are studied (e.g., encoder, gyroscope, or encoder and gyroscope). The different sensor fusion techniques for the pose estimation are experimentally studied and compared.
The vehicle pose estimation model, which produces the least amount of error, is used to generate inputs for the feature extraction algorithm fusion. In the experimental studies, two different environmental configurations are used, one without interior features and another one with two interior features. Numerical and experimental findings are discussed. Finally, the SLAM algorithm is implemented along with the algorithms for feature extraction and vehicle pose estimation. Three different cases are experimentally studied, with the floor of the environment intentionally altered to induce slipping. Results obtained for implementations with and without SLAM are compared and discussed. The present work represents a step towards the realization of autonomous inspection platforms for performing concurrent localization and mapping in harsh environments.
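The point-in-polygon detection mentioned above can be sketched with the standard ray-casting rule: cast a horizontal ray from the query point and count how many polygon edges it crosses. The square polygon below is an illustrative example, not data from the experiments.

```python
def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the polygon (list of (x, y) vertices)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge straddles the horizontal line through the point?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that horizontal line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            # Each crossing to the right of the point toggles inside/outside.
            if x < x_cross:
                inside = not inside
    return inside

square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
point_in_polygon(2.0, 2.0, square)  # True: centre of the square
point_in_polygon(5.0, 2.0, square)  # False: outside to the right
```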
Abstract:
The business philosophy of Mass Customisation (MC) implies rapid response to customer requests, high efficiency, and limited cost overheads of customisation. Furthermore, it implies that the quality benefits of the mass production paradigm are guaranteed. However, traditional quality science in manufacturing is premised on volume production of uniform products rather than on the differentiated products associated with MC. This creates quality challenges and raises questions over the suitability of standard quality engineering techniques. From an analysis of the relevant MC and quality literature it is argued that the aims of MC are aligned with contemporary thinking on quality and that quality concepts provide insights into MC. Quality issues are considered along three dimensions - product development, order fulfilment and customer interaction. The applicability and effectiveness of conventional quality engineering techniques are discussed and a framework is presented which identifies key issues with respect to quality for a spectrum of MC strategies.