898 results for PRODUCT DESIGN
Abstract:
This thesis comprises two parts.

Part One: the design and synthesis of pyrrolidine analogues of FTY720 as anticancer agents. FTY720 is currently marketed as a drug (Gilenya™) for the treatment of relapsing-remitting multiple sclerosis. It acts as an immunosuppressant through its effect on sphingosine-1-phosphate receptors. At high doses FTY720 also shows antineoplastic activity; at such doses, however, one of the observed side effects is bradycardia, caused by activation of the S1P1 and S1P3 receptors, which limits its potential use in chemotherapy. Our previous studies showed that pyrrolidine analogues derived from FTY720 retained anticancer activity but had no activity at the S1P1 and S1P3 receptors. We proposed that a structure-activity relationship (SAR) study could lead to the discovery of new antitumour agents. Two series of pyrrolidine compounds (O-arylmethyl-substituted and C-arylmethyl-substituted) were accordingly designed and synthesised (Chapter 1). These analogues showed excellent cytotoxic activity against a range of human cancer cell lines (prostate, colon, breast, pancreas and leukaemia); in particular, the active analogues that cannot be phosphorylated by SphK offer greater potential for cancer treatment without side effects such as bradycardia. Mechanistic studies suggest that these analogues trigger down-regulation of nutrient transporters, inducing a bioenergetic crisis that starves the cancer cells. To deepen our understanding of the target receptors, we designed and synthesised diazirine probes for photo-affinity labelling (PAL) (Chapter 2). Using the PAL method, information about the target receptors can be gathered through LC/MS/MS analysis of the protein. These experiments are in progress and the results are promising.

Part Two: metal coordination and bifunctional catalysis in the synthesis of tertiary β-hydroxy ketones. The Barbier and Grignard reactions are classical methods for forming carbon-carbon bonds and are commonly used to prepare secondary and tertiary alcohols. While attempting to improve the Grignard reaction of 1-iodobutane under one-pot Barbier conditions, we obtained as the major product the β-hydroxy ketone arising from self-aldolisation of 5-hexen-2-one rather than the expected alcohol addition product (Chapter 3). The unexpected formation of the β-hydroxy ketone was also observed with other methyl ketones. Surprisingly, in the intramolecular reaction of a triketone known to form the Hajos-Parrish ketone, the major product was the rarely observed β-hydroxy ketone bearing the hydroxyl group in the axial position. Intrigued by these results, and after a systematic study of the reaction conditions, we developed two new methods for the selective, catalytic synthesis of specific β-hydroxy ketones by intramolecular cyclisation in high yields (Chapter 4). The reaction can be catalysed either by a suitable base with lithium bromide as an additive, via a lithium-coordinated transition state, or by a bifunctional TBD catalyst, via a transition state mediated by bidentate coordination to TBD. The proposed mechanisms were supported by DFT calculations. These catalytic reactions were also applied to other substrates such as triketones and diketones. Although preliminary efforts to achieve enantioselectivity were unsuccessful, the design and synthesis of new chiral catalysts are in progress.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Most adverse environmental impacts result from design decisions made long before manufacturing or usage. To prevent this, several authors have proposed applying life cycle assessment (LCA) in the very first phases of the design of a process, a product or a service. The study in this paper presents an innovative thermal drying process for sewage sludge called fry-drying, in which dewatered sludge is contacted directly in the dryer with hot recycled cooking oils (RCO) as the heat medium. Given the practical difficulties in disposing of these two wastes, fry-drying presents a potentially convenient method for their combined elimination by incineration of the final fry-dried sludge. An analytical comparison between a conventional drying process and the newly proposed fry-drying process is reported, with reference to several environmental impact categories. The results of this study, applied at the earliest stages of process design, assist in evaluating the feasibility of such a system compared with a current disposal process for the drying and incineration of sewage sludge.
Abstract:
Simplicity in design and minimal floor space requirements render the hydrocyclone the preferred classifier in mineral processing plants. Empirical models have been developed for design and process optimisation, but due to the complexity of the flow behaviour in the hydrocyclone these do not provide information on the internal separation mechanisms. To study the interaction of design variables, the flow behaviour needs to be considered, especially when modelling the new three-product cyclone. Computational fluid dynamics (CFD) was used to model the three-product cyclone, in particular the influence of the dual vortex finder arrangement on flow behaviour. From experimental work performed on the UG2 platinum ore, significant differences in the classification performance of the three-product cyclone were noticed with variations in the inner vortex finder length. Because of this, simulations were performed for a range of inner vortex finder lengths. Simulations were also conducted on a conventional hydrocyclone of the same size to enable a direct comparison of the flow behaviour between the two cyclone designs. Significantly higher velocities were observed for the three-product cyclone with an inner vortex finder extended deep into the conical section of the cyclone. CFD studies revealed that in the three-product cyclone a cylindrical air-core is observed, similar to conventional hydrocyclones. A constant-diameter air-core was observed throughout the inner vortex finder length, while no air-core was present in the annulus. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
Objective: Five double-blind, randomized, saline-controlled trials (RCTs) were included in the United States marketing application for an intra-articular hyaluronan (IA-HA) product for the treatment of osteoarthritis (OA) of the knee. We report an integrated analysis of the primary Case Report Form (CRF) data from these trials. Method: Trials were similar in design, patient population and outcome measures - all included the Lequesne Algofunctional Index (LI), a validated composite index of pain and function, evaluating treatment over 3 months. Individual patient data were pooled; a repeated measures analysis of covariance was performed in the intent-to-treat (ITT) population. Analyses utilized both fixed and random effects models. Safety data from the five RCTs were summarized. Results: A total of 1155 patients with radiologically confirmed knee OA were enrolled: 619 received three or five IA-HA injections; 536 received placebo saline injections. In the active and control groups, mean ages were 61.8 and 61.4 years; 62.4% and 58.8% were women; baseline total Lequesne scores were 11.03 and 11.30, respectively. Integrated analysis of the pooled data set found a statistically significant reduction (P < 0.001) in total Lequesne score with hyaluronan (HA) (-2.68) vs placebo (-2.00); estimated difference -0.68 (95% CI: -0.79 to -0.56), effect size 0.20. Additional modeling approaches confirmed the robustness of the analyses. Conclusions: This integrated analysis demonstrates that multiple design factors influence the results of RCTs assessing efficacy of intra-articular (IA) therapies, and that integrated analyses based on primary data differ from meta-analyses using transformed data. (C) 2006 OsteoArthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Operation of polymer electrolyte membrane fuel cells with dry feeds: Design and operating strategies
Abstract:
The operation of polymer electrolyte membrane fuel cells (PEMFCs) with dry feeds has been examined with different fuel cell flow channel designs as functions of pressure, temperature and flow rate. Auto-humidified (or self-humidifying) PEMFC operation is improved at higher pressures and low gas velocities, where axial dispersion enhances back-mixing of the product water with the dry feed. We demonstrate auto-humidified operation of the channel-less, self-draining fuel cell, based on a stirred tank reactor; data are presented showing auto-humidified operation from 25 to 115 degrees C at 1 and 3 atm. Design and operating requirements are derived for the auto-humidified operation of the channel-less, self-draining fuel cell. The auto-humidified self-draining fuel cell outperforms a fully humidified serpentine flow channel fuel cell at high current densities. The new design offers substantial benefits for simplicity of operation and control, including: the ability to self-drain, reducing flooding; the ability to uniformly disperse water, removing current gradients; and the ability to operate on dry feeds, eliminating the need for humidifiers. Additionally, the design lends itself well to a modular design concept. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Wildlife feeding is a widespread and controversial practice that can pose serious threats to the safety of both wildlife and visitors. The design and effectiveness of warning signs in recreational areas varies considerably and is rarely the product of theoretical models or scientific research. This study uses front-end and formative evaluation to design and test the perceived effectiveness of warning signs relating to bird feeding. Stage One examined visitors' beliefs, attitudes and bird feeding behaviour and found significant differences between feeders and non-feeders. Stage Two involved designing and evaluating three signs that built on the beliefs, knowledge and mis/conceptions identified in Stage One. Respondents thought the sign that focused on the birds' health and safety would be the most persuasive; however, elements of the other two signs were also positively evaluated. The article concludes with recommendations for the wording of future bird feeding warning signs. (c) 2004 Elsevier Ltd. All rights reserved.
Abstract:
Retrieving large amounts of information over wide area networks, including the Internet, is problematic due to issues arising from latency of response, lack of direct memory access to data serving resources, and fault tolerance. This paper describes a design pattern for solving the issues of handling results from queries that return large amounts of data. Typically these queries would be made by a client process across a wide area network (or Internet), with one or more middle-tiers, to a relational database residing on a remote server. The solution involves implementing a combination of data retrieval strategies, including the use of iterators for traversing data sets and providing an appropriate level of abstraction to the client, double-buffering of data subsets, multi-threaded data retrieval, and query slicing. This design has recently been implemented and incorporated into the framework of a commercial software product developed at Oracle Corporation.
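The retrieval strategies named in this abstract (iterator traversal, double-buffering of data subsets, background retrieval and query slicing) can be sketched together in a few lines. The sketch below is an illustration only, not the Oracle implementation; `fetch_slice` is a hypothetical stand-in for one sliced query against the remote database.

```python
import threading
import queue
from typing import Iterator, List

def fetch_slice(offset: int, limit: int) -> List[dict]:
    # Hypothetical stand-in for a remote sliced query (e.g. OFFSET/LIMIT).
    # Here it fakes a 10-row result set so the sketch is self-contained.
    return [{"id": i} for i in range(offset, min(offset + limit, 10))]

def buffered_results(slice_size: int = 4) -> Iterator[dict]:
    """Iterate over a large remote result set, prefetching the next
    slice on a background thread while the caller consumes the current one."""
    buf: queue.Queue = queue.Queue(maxsize=2)  # double-buffer: current + next

    def producer() -> None:
        offset = 0
        while True:
            rows = fetch_slice(offset, slice_size)  # query slicing
            buf.put(rows)                           # blocks when buffer is full
            if len(rows) < slice_size:              # short slice => end of data
                break
            offset += slice_size

    threading.Thread(target=producer, daemon=True).start()
    while True:
        rows = buf.get()
        yield from rows          # iterator abstraction seen by the client
        if len(rows) < slice_size:
            return

rows = list(buffered_results())
```

The bounded queue is what makes this a double-buffer: the producer can run at most one slice ahead of the consumer, so memory use stays proportional to the slice size rather than to the full result set.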
Abstract:
A discrete event simulation model was developed and used to estimate the storage area required for a proposed overseas textile manufacturing facility. It was found that the simulation was able to achieve this because of its ability to both store attribute values and to show queuing levels at an individual product level. It was also found that the process of undertaking the simulation project initiated useful discussions regarding the operation of the facility. Discrete event simulation is shown to be much more than an exercise in quantitative analysis of results and an important task of the simulation project manager is to initiate a debate among decision makers regarding the assumptions of how the system operates.
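The mechanism this abstract relies on, tracking queue levels at an individual product level, is easy to see in a toy discrete event simulation. The sketch below is illustrative only: the study's actual model, products and inputs are not given, so the arrival times, product names and dwell time here are invented for the example.

```python
import heapq
from collections import defaultdict

def simulate_storage(arrivals, dwell_time):
    """Tiny discrete event simulation: each (time, product) arrival occupies
    one storage slot for `dwell_time`, and we track the peak queue level per
    product type -- the quantity used to size a storage area."""
    events = []  # event = (time, level_change, product)
    for t, product in arrivals:
        heapq.heappush(events, (t, +1, product))               # arrival
        heapq.heappush(events, (t + dwell_time, -1, product))  # departure
    level = defaultdict(int)  # current queue level per product
    peak = defaultdict(int)   # peak queue level per product
    while events:             # process events in time order
        _, delta, product = heapq.heappop(events)
        level[product] += delta
        peak[product] = max(peak[product], level[product])
    return dict(peak)

# Invented example data: four arrivals of two product types.
arrivals = [(0, "yarn"), (1, "yarn"), (2, "fabric"), (3, "yarn")]
peaks = simulate_storage(arrivals, dwell_time=2.5)
```

Sizing the store to the per-product peaks, rather than to a single aggregate queue, is exactly the attribute-level capability the abstract credits the simulation with.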
Abstract:
In designing new products, the ability to retrieve drawings of existing components is important if costs are to be controlled by preventing unnecessary duplication of parts. Component coding and classification systems have been used successfully for these purposes but suffer from high operational costs and poor usability, arising directly from the manual nature of the coding process itself. A new version of an existing coding system (CAMAC) has been developed to reduce costs by automatically coding engineering drawings. Usability is improved by supporting searches based on a drawing or sketch of the desired component. Test results from a database of several thousand drawings are presented.
Abstract:
Purpose – This paper aims to present a framework that will help manufacturing firms to configure their internal production and support operations to enable effective and efficient delivery of products and their closely associated services. Design/methodology/approach – First, the key definitions and literature sources directly associated with servitization of manufacturing are established. Then, a theoretical framework that categorises the key characteristics of a manufacturer's operations strategy is developed and populated using both evidence from the extant literature and empirical data. Findings – The framework captures a set of operations principles, structures and processes that can guide a manufacturer in the delivery of a product-centric servitized offering. These are illustrated and contrasted against operations that deliver purely products (production operations) and those that deliver purely services (services operations). Research limitations/implications – The work is based on a review of the literature supported by data collected from an exploratory case study. Whilst it provides an essential platform, further research will be needed to validate the framework. Originality/value – The principal contribution of this paper is a framework that captures the key characteristics of operations for product-centric servitized manufacture.
Abstract:
The objective of this study has been to enable a greater understanding of the biomass gasification process through the development and use of process and economic models. A new theoretical equilibrium model of gasification is described using the operating condition called the adiabatic carbon boundary. This represents an ideal gasifier working at the point where the carbon in the feedstock is completely gasified. The model can be used as a 'target' against which the results of real gasifiers can be compared, but it does not simulate the results of real gasifiers. A second model has been developed which uses a stagewise approach in order to model fluid bed gasification, and its results have indicated that pyrolysis and the reactions of pyrolysis products play an important part in fluid bed gasifiers. Both models have been used in sensitivity analyses: the biomass moisture content and gasifying agent composition were found to have the largest effects on performance, whilst pressure and heat loss had lesser effects. Correlations have been produced to estimate the total installed capital cost of gasification systems and have been used in an economic model of gasification. This has been used in a sensitivity analysis to determine the factors which most affect the profitability of gasification. The most important influences on gasifier profitability have been found to be feedstock cost, product selling price and throughput. Given the economic conditions of late 1985, refuse gasification for the production of producer gas was found to be viable at throughputs of about 2.5 tonnes/h dry basis and above, in the metropolitan counties of the United Kingdom. At this throughput and above, the largest element of product gas cost is the feedstock cost, the cost element which is most variable.
Abstract:
Changes in modern structural design have created a demand for products which are light but possess high strength. The objective is a reduction in fuel consumption and in the weight of materials, to satisfy both economic and environmental criteria. Cold roll forming has the potential to fulfil this requirement. The bending process is controlled by the shape of the profile machined on the periphery of the rolls. A CNC lathe can machine complicated profiles to a high standard of precision, but the expertise of a numerical control programmer is required. A computer program was developed during this project, using the expert system concept, to calculate tool paths and consequently to expedite the procurement of the machine control tapes while removing the need for a skilled programmer. Codifying the expertise of a human and encapsulating that knowledge within a computer memory removes the dependency on highly trained people, whose services can be costly, inconsistent and unreliable. A successful cold roll forming operation, where the product is geometrically correct and free from visual defects, is not easy to attain. The geometry of the sheet after travelling through the rolling mill depends on the residual strains generated by the elastic-plastic deformation. Accurate evaluation of the residual strains can provide the basis for predicting the geometry of the section. A study of geometric and material non-linearity, yield criteria, material hardening and stress-strain relationships was undertaken in this research project. The finite element method was chosen to provide a mathematical model of the bending process and, to ensure efficient manipulation of the large stiffness matrices, the frontal solution was applied. A series of experimental investigations provided data to compare with corresponding values obtained from the theoretical modelling.
A computer simulation capable of predicting that a design will be satisfactory prior to the manufacture of the rolls would allow effort to be concentrated on devising an optimum design in which costs are minimised.
Abstract:
Product reliability and environmental performance have become critical elements of a product's specification and design. To obtain a high level of confidence in the reliability of a design, it is customary to test the design under realistic conditions in a laboratory. The objective of this work is to examine the feasibility of designing mechanical test rigs which exhibit prescribed dynamical characteristics. The design is then attached to the rig and excitation is applied to the rig, which transmits representative vibration levels into the product. The philosophical considerations made at the outset of the project are discussed, as they form the basis for the resulting design methodologies. An attempt is made to identify the parameters of a test rig directly from the spatial model derived during the system identification process; it is shown to be impossible to identify a feasible test rig design using this technique. A finite dimensional optimal design methodology is therefore developed which identifies the parameters of a discrete spring/mass system that is dynamically similar to a point coordinate on a continuous structure. This design methodology is incorporated within a further procedure which derives a structure comprising a continuous element and a discrete system. This methodology is used to obtain point coordinate similarity for two planes of motion, which is validated by experimental tests. A limitation of this approach is that multi-coordinate similarity cannot be achieved, owing to an interaction of the discrete system and the continuous element at points away from the coordinate of interest. The work highlights the importance of the continuous element, and a design methodology is developed for continuous structures. This methodology is based upon distributed parameter optimal design techniques and allows an initial poor design estimate to be moved in a feasible direction towards an acceptable design solution.
Cumulative damage theory is used to provide a quantitative method of assessing the quality of dynamic similarity. It is shown that the combination of modal analysis techniques and cumulative damage theory provides a feasible design synthesis methodology for representative test rigs.