884 results for Design of activities


Relevance: 100.00%

Abstract:

This paper addresses the question of whether R&D should be carried out by an independent research unit or be produced in-house by the firm marketing the innovation. We define two organizational structures. In an integrated structure, the firm that markets the innovation also carries out and finances research leading to the innovation. In an independent structure, the firm that markets the innovation buys it from an independent research unit which is financed externally. We compare the two structures under the assumption that the research unit has some private information about the real cost of developing the new product. When development costs are negatively correlated with revenues from the innovation, the integrated structure dominates. The independent structure dominates in the opposite case.

Relevance: 100.00%

Abstract:

This article discusses the design of a comprehensive evaluation of a community development programme for young people 'at-risk' of self-harming behaviour. It outlines considerations in the design of the evaluation and focuses on the complexities and difficulties associated with the evaluation of a community development programme. The challenge was to fulfil the needs of the funding body for a broad, outcome-focused evaluation while remaining close enough to the programme to accurately represent its activities and potential effects at a community level. Specifically, the strengths and limitations of a mixed-method evaluation plan are discussed with recommendations for future evaluation practice.

Relevance: 100.00%

Abstract:

The work agenda includes the production of a report on different doctoral programmes on “Technology Assessment” in Europe, the US and Japan, in order to analyse collaborative post-graduate activities. Finally, proposals for a collaborative post-graduate programme between FCTUNL and ITAS-FZK will be developed through an ongoing discussion process with colleagues from ITAS.

Relevance: 100.00%

Abstract:

Dissertation presented to obtain the degree of Master in Industrial Engineering and Management

Relevance: 100.00%

Abstract:

Residual lignocellulosic materials from agro-industrial activities can be exploited as a source of lignin, hemicellulose and cellulose. Chemical treatment of lignocellulosic material must contend with the fact that this material is quite recalcitrant to such attack, mainly because of the presence of the polymer lignin. Delignification can also be achieved using white-rot wood fungi, which produce extracellular ligninolytic enzymes, chiefly laccase, an enzyme that oxidizes lignin to CO2. Laccase also oxidizes a wide range of substrates (phenols, polyphenols, anilines, aryl diamines, methoxy-substituted phenols, and others), which makes it attractive for biotechnological applications. The enzyme has potential applications in processes such as the delignification of lignocellulosic materials and the biobleaching of paper pulp, the treatment of industrial wastewater, fibre modification and dye decolourization in the textile and dye industries, the improvement of animal feed, the detoxification of pollutants, and the bioremediation of contaminated soils. It has also been used in organic chemistry for the oxidation of functional groups, the formation of carbon-nitrogen bonds, and the synthesis of complex natural products.

HYPOTHESIS: White-rot fungi, under optimal culture conditions, produce different types of oxidase enzymes, among which laccases are the most suitable to explore as catalysts in the following processes: (i) delignification of forest-industry residues so that such waste can be used in animal feed; (ii) decontamination/remediation of soils and/or industrial effluents. Studies will be carried out on the design of bioreactors able to address the two questions raised in the hypothesis.

For the delignification of lignocellulosic material, two strategies are proposed: (1) treating the material with the fungal mycelium, adjusting the supply of nutrients to sustain growth and favour release of the enzyme; (2) using partially purified laccase coupled to a mediator system to oxidize the polyphenolic compounds. For the decontamination/remediation of soils and/or industrial effluents, work will likewise proceed on two fronts: (3) a positive correlation has been described between the activity of certain soil enzymes and soil fertility; in this respect, an enzymatic system tentatively identified as a laccase of microbial origin is known to be responsible for the transformation of organic compounds in soil, protecting it from the accumulation of hazardous organic compounds by catalysing reactions that involve degradation, polymerization and incorporation into humic acid complexes, and soils spiked with different pollutants (e.g. polychlorophenols or chloroanilines) will be used; (4) work will be carried out with polluting industrial effluents (olive-mill wastewater and/or the liquid effluent from the olive debittering process).

Relevance: 100.00%

Abstract:

A new strategy for the rapid identification of new malaria antigens based on protein structural motifs was previously described. We identified and evaluated the malaria vaccine potential of fragments of several malaria antigens containing α-helical coiled-coil protein motifs. Taking advantage of the relatively short size of these structural fragments, we constructed different poly-epitopes in which 3 or 4 of these segments were joined together via a non-immunogenic linker. Only peptides that are targets of human antibodies with anti-parasite in vitro biological activities were incorporated. One of the constructs, P181, was well recognized by sera and peripheral blood mononuclear cells (PBMC) of adults living in malaria-endemic areas. Affinity-purified antigen-specific human antibodies and sera from P181-immunized mice recognized native proteins on malaria-infected erythrocytes in both immunofluorescence and western blot assays. In addition, specific antibodies inhibited parasite development in an antibody-dependent cellular inhibition (ADCI) assay. Naturally induced antigen-specific human antibodies were present at high titers and were associated with clinical protection from malaria in longitudinal follow-up studies in Senegal.

Relevance: 100.00%

Abstract:

Indoleamine 2,3-dioxygenase 1 (IDO1) is an important therapeutic target for the treatment of diseases such as cancer that involve pathological immune escape. Starting from the scaffold of our previously discovered IDO1 inhibitor 4-phenyl-1,2,3-triazole, we used computational structure-based methods to design more potent ligands. This approach yielded highly efficient low molecular weight inhibitors, the most active being of nanomolar potency both in an enzymatic and in a cellular assay, while showing no cellular toxicity and a high selectivity for IDO1 over tryptophan 2,3-dioxygenase (TDO). A quantitative structure-activity relationship based on the electrostatic ligand-protein interactions in the docked binding modes and on the quantum chemically derived charges of the triazole ring demonstrated a good explanatory power for the observed activities.
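The quantitative structure-activity relationship mentioned above correlates computed descriptors (electrostatic ligand-protein interactions, quantum-chemical ring charges) with observed activities. The core of such an analysis is an ordinary least-squares fit and its coefficient of determination; a minimal sketch follows, with descriptor and activity values that are purely hypothetical placeholders, not data from the study:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = slope*x + intercept for a single descriptor."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def r_squared(xs, ys, slope, intercept):
    """Coefficient of determination: fraction of activity variance the descriptor explains."""
    mean_y = sum(ys) / len(ys)
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Hypothetical data: a computed descriptor (e.g. a triazole ring charge) vs. measured pIC50.
descriptors = [-0.30, -0.22, -0.15, -0.08, -0.02]
activities  = [7.9, 7.4, 6.8, 6.1, 5.6]
slope, intercept = fit_line(descriptors, activities)
```

A high r_squared on held-out compounds, rather than on the training set alone, is what would justify the "good explanatory power" claim in practice.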

Relevance: 100.00%

Abstract:

3 Summary

3.1 English

The pharmaceutical industry has faced several challenges in recent years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches that aim to rationalize the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to the ligand's affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root-mean-square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, which led to the successful design of new peptidic ligands for the α5β1 integrin and for the human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
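The "correct binding mode" criterion used in the validation above is simply a pose-to-crystal RMSD below 2 Å. A minimal sketch of that computation follows; the function names and flat coordinate lists are illustrative assumptions, not EADock's actual implementation:

```python
import math

def rmsd(pose, reference):
    """Root-mean-square deviation (in Å) between two matched lists of (x, y, z) atom coordinates."""
    if len(pose) != len(reference):
        raise ValueError("coordinate lists must have the same length")
    total = sum((px - rx) ** 2 + (py - ry) ** 2 + (pz - rz) ** 2
                for (px, py, pz), (rx, ry, rz) in zip(pose, reference))
    return math.sqrt(total / len(pose))

def is_correct_binding_mode(pose, crystal, threshold=2.0):
    """A docked pose counts as correct when its RMSD to the crystal ligand is below the threshold."""
    return rmsd(pose, crystal) < threshold
```

Note that this assumes a fixed atom-to-atom correspondence between pose and crystal; real benchmarks must also handle symmetry-equivalent atom orderings.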

Relevance: 100.00%

Abstract:

Adoptive cell transfer using engineered T cells is emerging as a promising treatment for metastatic melanoma. Such an approach allows one to introduce T cell receptor (TCR) modifications that, while maintaining the specificity for the targeted antigen, can enhance the binding and kinetic parameters for the interaction with peptides (p) bound to major histocompatibility complexes (MHC). Using the well-characterized 2C TCR/SIYR/H-2K(b) structure as a model system, we demonstrated that a binding free energy decomposition based on the MM-GBSA approach provides a detailed and reliable description of the TCR/pMHC interactions at the structural and thermodynamic levels. Starting from this result, we developed a new structure-based approach to rationally design new TCR sequences, and applied it to the BC1 TCR targeting the HLA-A2 restricted NY-ESO-1(157-165) cancer-testis epitope. Fifty-four percent of the designed sequence replacements exhibited improved pMHC binding as compared to the native TCR, with up to 150-fold increase in affinity, while preserving specificity. Genetically engineered CD8(+) T cells expressing these modified TCRs showed an improved functional activity compared to those expressing BC1 TCR. We measured maximum levels of activities for TCRs within the upper limit of natural affinity, KD ≈ 1-5 μM. Beyond the affinity threshold at KD < 1 μM we observed an attenuation in cellular function, in line with the "half-life" model of T cell activation. Our computer-aided protein-engineering approach requires the 3D structure of the TCR-pMHC complex of interest, which can be obtained from X-ray crystallography. We have also developed a homology modeling-based approach, TCRep 3D, to obtain accurate structural models of any TCR-pMHC complexes when experimental data are not available.
Since the accuracy of the models depends on the prediction of the TCR orientation over pMHC, we have complemented the approach with a simplified rigid method to predict this orientation and successfully assessed it using all non-redundant TCR-pMHC crystal structures available. These methods potentially extend the use of our TCR engineering method to entire TCR repertoires for which no X-ray structure is available. We have also performed a steered molecular dynamics study of the unbinding of the TCR-pMHC complex to get a better understanding of how TCRs interact with pMHCs. This entire rational TCR design pipeline is now being used to produce rationally optimized TCRs for adoptive cell therapies of stage IV melanoma.
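The natural-affinity ceiling quoted above (KD roughly 1-5 μM) can be translated into a binding free energy through the standard relation ΔG° = RT ln KD (1 M reference state). A small sketch, with the constants and function name chosen here for illustration:

```python
import math

R = 1.987e-3   # gas constant in kcal/(mol*K)
T = 298.15     # temperature in K (25 °C)

def binding_free_energy(kd_molar):
    """Standard binding free energy (kcal/mol) from a dissociation constant KD given in M."""
    return R * T * math.log(kd_molar)

# The 1-5 μM window corresponds to roughly -8 to -7 kcal/mol;
# tighter binding (smaller KD) gives a more negative free energy.
```

On this scale, the attenuation observed beyond KD < 1 μM means that making ΔG° more negative than about -8 kcal/mol no longer improves cellular function.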

Relevance: 100.00%

Abstract:

The vast majority of our contemporary society owns a mobile phone, which has resulted in a dramatic rise in the number of networked computers in recent years. Security issues in these computers have followed the same trend, and nearly everyone is now affected by them. How could the situation be improved? For software engineers, an obvious answer is to build software with security in mind. A problem with doing so is how to define secure software, or how to measure security. This thesis divides the problem into three research questions. First, how can we measure the security of software? Second, what types of tools are available for measuring security? And finally, what do these tools reveal about the security of software? Measuring tools of this kind are commonly called metrics. This thesis focuses on the perspective of software engineers in the software design phase. The focus on the design phase means that code-level semantics and programming-language specifics are not discussed in this work. Organizational policy, management issues and the software development process are also out of scope. The first two research problems were studied using a literature review, while the third was studied using case study research. The target of the case study was a Java-based email server called Apache James, for which details from its changelog and security issues were available and the source code was accessible. The research revealed that there is a consensus in the terminology on software security. Security verification activities are commonly divided into evaluation and assurance. The focus of this work was on assurance, which means verifying one's own work. There are 34 metrics available for security measurement, of which five are evaluation metrics and 29 are assurance metrics. We found, however, that the general quality of these metrics was not good.
Only three metrics in the design category passed the inspection criteria and could be used in the case study. The metrics claim to give quantitative information on the security of the software, but in practice they were limited to comparing different versions of the same software. Apart from being relative, the metrics were unable to detect security issues or point out problems in the design. Furthermore, interpreting the metrics' results was difficult. In conclusion, the general state of software security metrics leaves a lot to be desired. The metrics studied had both theoretical and practical issues and are not suitable for daily engineering workflows. They nevertheless provide a basis for further research, since they point out areas where security metrics need to improve if verification of security from the design phase is desired.

Relevance: 100.00%

Abstract:

Procurement is one of the major business operations in the public service sector. The advance of information and communication technology (ICT) pushes this business operation to increase its efficiency and to foster collaboration between the organization and its suppliers. This leads to a shift from traditional procurement transactions to an e-procurement paradigm. Such a change impacts business processes, information management and decision making. E-procurement involves various stakeholders who engage in activities based on different social and cultural practices. The design of an e-procurement system may therefore involve the analysis of complex situations. This paper describes an approach that uses the problem articulation method to support such analysis, and applies it to a case study from the UAE.

Relevance: 100.00%

Abstract:

Traditional venom immunotherapy uses injections of whole bee venom in buffer or adsorbed on Al(OH)3, in an expensive, time-consuming way. New strategies to improve the safety and efficacy of this treatment while reducing the number of injections would therefore be of general interest: they would improve patient compliance and provide socio-economic benefits. Liposomes have a long tradition in drug delivery because they increase the therapeutic index and avoid drug degradation and secondary effects. However, the bee venom components melittin (Mel) and phospholipase A2 (PLA2) destroy phospholipid membranes. Our central idea was to inhibit the PLA2 and Mel activities through histidine alkylation and/or tryptophan oxidation (with pbb, para-bromophenacyl bromide, and NBS, N-bromosuccinimide, respectively) to make their encapsulation possible within stabilized liposomes. We strongly believe that this formulation will be nontoxic yet immunogenic. In this paper, we characterize the conformation of whole bee venom during and after chemical modification, and after interaction with liposomes, by ultraviolet, circular dichroism and fluorescence spectroscopies. The PLA2 and Mel activities were measured indirectly by changes in turbidity at 400 nm, rhodamine leak-out, and hemolysis. The native whole bee venom (BV) presented 78.06% α-helical content. Alkylation (A-BV) and succinylation (S-BV) increased its α-helical content by 0.44% and 0.20%, respectively, while the doubly modified venom (S-A-BV) showed a 0.74% increase. The chemical modification also induced a further conformational change, with Trp residues becoming buried relative to the native whole BV. It was demonstrated that the liposomal membranes must contain pbb (SPC:Cho:pbb, 26:7:1) as a component to protect them from aggregation and/or fusion.
Membranes containing pbb maintained the same turbidity (100%) after incubation with modified venom, in contrast with pbb-free membranes, which showed a 15% size decrease. This size decrease was interpreted as membrane degradation and was corroborated by a 50% rhodamine leak-out. Another fact that confirmed our interpretation was the observed 100% inhibition of hemolytic activity after venom modification with pbb and NBS (S-A-BV). When S-A-BV interacted with liposomes, further conformational changes were observed, characterized by a 1.93% increase in S-A-BV α-helical content and the presence of tryptophan residues in a more hydrophobic environment. In other words, S-A-BV interacted with the liposomal membranes, but this interaction was not sufficient to cause aggregation, leak-out, or fusion. A stable formulation was devised, composed of S-A-BV encapsulated within liposomes of SPC:Cho:pbb at a ratio of 26:7:1. Large unilamellar vesicles of 202.5 nm with a negative surface charge (-24.29 mV) encapsulated 95% of the S-A-BV. This formulation can now be assayed in venom immunotherapy (VIT).