997 results for Modified areas

Relevance: 20.00%

Publisher:

Abstract:

This dissertation explores how stakeholder dialogue influences corporate processes, and speculates about the potential of this phenomenon - particularly with actors, like non-governmental organizations (NGOs) and other representatives of civil society, which have received growing attention against a backdrop of increasing globalisation and which have often been cast in an adversarial light by firms - as a source of learning and a spark for innovation in the firm. The study is set within the context of the introduction of genetically modified organisms (GMOs) in Europe. Its significance lies in the fact that scientific developments and new technologies are being generated at an unprecedented rate in an era where civil society is becoming more informed, more reflexive, and more active in facilitating or blocking such new developments, which could have the potential to trigger widespread changes in economies, attitudes, and lifestyles, and address global problems like poverty, hunger, climate change, and environmental degradation. In the 1990s, companies using biotechnology to develop and offer novel products began to experience increasing pressure from civil society to disclose information about the risks associated with the use of biotechnology and GMOs in particular. Although no harmful effects for humans or the environment have been factually demonstrated even to date (2008), this technology remains highly contested, and its introduction in Europe catalysed major companies to invest significant financial and human resources in stakeholder dialogue. A relatively new phenomenon at the time, with little theoretical backing, dialogue was seen to reflect a move towards greater engagement with stakeholders, commonly defined as those "individuals or groups with which business interacts who have a 'stake', or vested interest, in the firm" (Carroll, 1993:22), with whom firms are seen to be inextricably embedded (Andriof & Waddock, 2002).
Regarding the organisation of this dissertation, Chapter 1 (Introduction) describes the context of the study and elaborates its significance for academics and business practitioners as an empirical work embedded in a sector at the heart of the debate on corporate social responsibility (CSR). Chapter 2 (Literature Review) traces the roots and evolution of CSR, drawing on Stakeholder Theory, Institutional Theory, Resource Dependence Theory, and Organisational Learning to establish what has already been developed in the literature regarding the stakeholder concept, motivations for engagement with stakeholders, the corporate response to external constituencies, and outcomes for the firm in terms of organisational learning and change. I used this review of the literature to guide my inquiry and to develop the key constructs through which I viewed the empirical data that was gathered. In this respect, concepts related to how the firm views itself (as a victim, follower, leader), how stakeholders are viewed (as a source of pressure and/or threat; as an asset: current and future), corporate responses (in the form of buffering, bridging, boundary redefinition), and types of organisational learning (single-loop, double-loop, triple-loop) and change (first order, second order, third order) were particularly important in building the key constructs of the conceptual model that emerged from the analysis of the data. Chapter 3 (Methodology) describes the methodology that was used to conduct the study, affirms the appropriateness of the case study method in addressing the research question, and describes the procedures for collecting and analysing the data. Data collection took place in two phases - extending from August 1999 to October 2000, and from May to December 2001 - which functioned as 'snapshots' in time of the three companies under study.
The data was systematically analysed and coded using ATLAS/ti, a qualitative data analysis tool, which enabled me to sort, organise, and reduce the data into a manageable form. Chapter 4 (Data Analysis) contains the three cases that were developed (anonymised as Pioneer, Helvetica, and Viking). Each case is presented in its entirety (constituting a 'within-case' analysis), followed by a 'cross-case' analysis, backed up by extensive verbatim evidence. Chapter 5 presents the research findings, outlines the study's limitations, describes managerial implications, and offers suggestions for where further research could elaborate the conceptual model developed through this study, as well as suggestions for additional research in areas where managerial implications were outlined. References and Appendices are included at the end. This dissertation results in the construction and description of a conceptual model, grounded in the empirical data and tied to existing literature, which portrays a set of elements and relationships deemed important for understanding the impact of stakeholder engagement for firms in terms of organisational learning and change. This model suggests that corporate perceptions about the nature of stakeholders influence the perceived value of stakeholder contributions. When stakeholders are primarily viewed as a source of pressure or threat, firms tend to adopt a reactive/defensive posture in an effort to manage stakeholders and protect the firm from sources of outside pressure - behaviour consistent with Resource Dependence Theory, which suggests that firms try to gain control over external threats by focussing on the relevant stakeholders on whom they depend for critical resources, and try to reverse the control potentially exerted by external constituencies by trying to influence and manipulate these valuable stakeholders.
In situations where stakeholders are viewed as a current strategic asset, firms tend to adopt a proactive/offensive posture in an effort to tap stakeholder contributions and connect the organisation to its environment - behaviour consistent with Institutional Theory, which suggests that firms try to ensure the continuing license to operate by internalising external expectations. In instances where stakeholders are viewed as a source of future value, firms tend to adopt an interactive/innovative posture in an effort to reduce or widen the embedded system and bring stakeholders into systems of innovation and feedback - behaviour consistent with the literature on Organisational Learning, which suggests that firms can learn how to optimize their performance as they develop systems and structures that are more adaptable and responsive to change. The conceptual model moreover suggests that the perceived value of stakeholder contribution drives corporate aims for engagement, which can be usefully categorised as dialogue intentions spanning a continuum running from low-level to high-level to very-high-level. This study suggests that activities aimed at disarming critical stakeholders ('manipulation'), providing guidance and correcting misinformation ('education'), being transparent about corporate activities and policies ('information'), alleviating stakeholder concerns ('placation'), and accessing stakeholder opinion ('consultation') represent low-level dialogue intentions and are experienced by stakeholders as asymmetrical, persuasive, compliance-gaining activities that are not in line with 'true' dialogue. This study also finds evidence that activities aimed at redistributing power ('partnership'), involving stakeholders in internal corporate processes ('participation'), and demonstrating corporate responsibility ('stewardship') reflect high-level dialogue intentions.
This study additionally finds evidence that building and sustaining high-quality, trusted relationships which can meaningfully influence organisational policies inclines a firm towards the type of interactive, proactive processes that underpin the development of sustainable corporate strategies. Dialogue intentions are related to the type of corporate response: low-level intentions can lead to buffering strategies; high-level intentions can underpin bridging strategies; very-high-level intentions can incline a firm towards boundary redefinition. The nature of the corporate response (which encapsulates a firm's posture towards stakeholders, demonstrated by the level of dialogue intention and the firm's strategy for dealing with stakeholders) favours the type of learning and change experienced by the organisation. This study indicates that buffering strategies, where the firm attempts to protect itself against external influences and carry out its existing strategy, typically lead to single-loop learning, whereby the firm learns how to perform better within its existing paradigm and, at most, improves the performance of the established system - an outcome associated with first-order change. Bridging responses, where the firm adapts organisational activities to meet external expectations, typically lead a firm to acquire new behavioural capacities characteristic of double-loop learning, whereby insights and understanding are uncovered that are fundamentally different from existing knowledge and where stakeholders are brought into problem-solving conversations that enable them to influence corporate decision-making to address shortcomings in the system - an outcome associated with second-order change.
Boundary redefinition suggests that the firm engages in triple-loop learning, where the firm changes relations with stakeholders in profound ways, considers problems from a whole-system perspective, and examines the deep structures that sustain the system, producing innovation to address chronic problems and develop new opportunities - an outcome associated with third-order change. This study supports earlier theoretical and empirical studies (e.g. Weick's (1979, 1985) work on self-enactment; Maitlis & Lawrence's (2007), Maitlis' (2005), and Weick et al.'s (2005) work on sensegiving and sensemaking in organisations; Brickson's (2005, 2007) and Scott & Lane's (2000) work on organisational identity orientation), which indicate that corporate self-perception is a key underlying factor driving the dynamics of organisational learning and change. Such theorizing has important implications for managerial practice; namely, that a company which perceives itself as a 'victim' may be highly inclined to view stakeholders as a source of negative influence, and would therefore be potentially unable to benefit from the positive influence of engagement. Such a self-perception can prevent the firm from seeing stakeholders in a more positive, contributing light, which suggests that such firms may not be inclined to embrace external sources of innovation and learning, as they are focussed on protecting the firm against disturbing environmental influences (through buffering), and remain more likely to perform better within an existing paradigm (single-loop learning). By contrast, a company that perceives itself as a 'leader' may be highly inclined to view stakeholders as a source of positive influence.
On the downside, such a firm might have difficulty distinguishing when stakeholder contributions are less pertinent, as it is deliberately more open to elements in its operating environment (including stakeholders) as potential sources of learning and change, and is oriented towards creating space for fundamental change (through boundary redefinition), opening issues to entirely new ways of thinking and addressing them from a whole-system perspective. A significant implication of this study is that potentially only those companies that see themselves as leaders are ultimately able to tap the innovation potential of stakeholder dialogue.

Relevance: 20.00%

Publisher:

Abstract:

Metallic foreign bodies are rarely found in the maxillary sinus, and usually they have a dental origin. Potential complications related to foreign bodies include recurrent sinusitis, rhinolith formation, cutaneous fistula, chemical poisoning, facial neuralgic pain and even malignancies. Two main surgical approaches are currently used for the removal of foreign bodies in the maxillary sinus: the bone flap and the endoscopic sinus techniques. We report two unusual cases of large high-velocity foreign bodies removed by a modified maxillary lateral antrotomy, with free bone flap repositioning and fixation with a titanium miniplate.

Relevance: 20.00%

Publisher:

Abstract:

The oxidation of copper surfaces has in recent years been a popular research topic in materials science owing to the wide industrial use of copper. Developing industrial applications such as protective surface oxides, however, requires a deep understanding of the oxidation process, and also of how the lattice defects present in the material under normal conditions affect it. This work therefore focuses on the mechanisms by which various surface defects and a stepped surface structure affect the oxygen adsorption process on a copper surface. The study was carried out using computational methods with the VASP and SIESTA software packages. Chemical and structural defects were studied on the Cu(100) surface, which is the most reactive low-Miller-index surface, while the Cu(211) surface was used for the stepped-surface study, as it is a simple, stable surface structure often used in earlier work. The lattice defects studied, adatoms, reduce molecular dissociation on the copper surface, whereas vacancies act as dissociation centres. A silver layer used as a chemical impurity does not prevent the oxidation of copper, since oxygen causes an interesting segregation phenomenon in which silver is pushed deeper into the surface, leaving the copper surface unprotected. On the stepped surface, the (100) hollow site is the most probable site for molecular dissociation, whereas the step bridge site is the most favourable for molecular adsorption. In addition, the stepped copper surface was found to be more reactive than flat copper surfaces.
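In studies of this kind, whether adsorption at a given site is favourable is conventionally judged from an adsorption energy derived from DFT total energies (e.g. VASP or SIESTA output), referenced to half an O2 molecule. A minimal sketch; the energies below are invented for illustration and are not values from this study:

```python
# Hypothetical illustration of deriving an adsorption energy from DFT
# total energies; all numbers are made up, not taken from the study.
#   E_ads = E(slab + O) - E(slab) - 1/2 * E(O2)

def adsorption_energy(e_slab_o, e_slab, e_o2):
    """Adsorption energy per O atom, referenced to half an O2 molecule.

    Negative values indicate energetically favourable adsorption.
    """
    return e_slab_o - e_slab - 0.5 * e_o2

e_clean = -350.00    # bare Cu slab total energy (eV), invented
e_with_o = -355.80   # slab with one adsorbed O atom (eV), invented
e_o2 = -9.86         # isolated O2 molecule (eV), invented

print(round(adsorption_energy(e_with_o, e_clean, e_o2), 2))  # -0.87
```

Comparing such energies between candidate sites (hollow versus bridge, terrace versus step) is how site preferences like those reported above are ranked.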

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: The central function of dendritic cells (DC) in inducing and preventing immune responses makes them ideal therapeutic targets for the induction of immunologic tolerance. In a rat in vivo model, we showed that dexamethasone-treated DC (Dex-DC) induced indirect pathway-mediated regulation and that CD4+CD25+ T cells were involved in the observed effects. The aim of the present study was to investigate the mechanisms underlying the acquired immunoregulatory properties of Dex-DC in the rat and human experimental systems. METHODS: After treatment with dexamethasone (Dex), the immunogenicity of Dex-DC was analyzed in T-cell proliferation and two-step hyporesponsiveness induction assays. After carboxyfluorescein diacetate succinimidyl ester labeling, CD4+CD25+ regulatory T-cell expansion was analyzed by flow cytometry, and cytokine secretion was measured by ELISA. RESULTS: In this study, we demonstrate in vitro that rat Dex-DC induced selective expansion of CD4+CD25+ regulatory T cells, which were responsible for alloantigen-specific hyporesponsiveness. The induction of regulatory T-cell division by rat Dex-DC was due to secretion of interleukin (IL)-2 by DC. Similarly, in human studies, monocyte-derived Dex-DC were also poorly immunogenic, were able to induce T-cell anergy in vitro, and were able to expand a population of T cells with regulatory functions. This was accompanied by a change in the cytokine profile of DC and T cells in favor of IL-10. CONCLUSION: These data suggest that Dex-DC induced tolerance by different mechanisms in the two systems studied. Both rat and human Dex-DC were able to induce and expand regulatory T cells, which occurred in an IL-2-dependent manner in the rat system.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this Master's thesis was to determine whether phased antenna arrays are in use at GSM base stations in Finland, and whether there is interest in, or any obstacles to, the use of such antennas. The starting point for this work was the idea of a GSM base-station antenna that could be turned in a desired direction as needed. Phased antenna arrays enable exactly this kind of beam steering and shaping electronically, with no wearing parts. Other beam properties can also be adjusted, such as the beam shape and the number of beams. These properties would make it possible, for example, to add beams in congested areas, increasing the call-handling capacity of the area. The beam shape could also be changed when needed. In an emergency, for example, a beam can be pointed precisely at a specific area or at another base station, thus ensuring coverage. Maintenance operations would also become easier in some cases. Especially for base stations located in hard-to-reach places, emergency repairs - for example when an antenna has physically turned - could be handled remotely by steering only the beam, and the antenna itself could then be turned during the base station's normal maintenance. One of the biggest obstacles to adopting this technology has been its price, but thanks to new manufacturing techniques, among other things, the prices of phased antenna arrays have come down.
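The electronic beam steering described here rests on a simple relation: applying a linear phase gradient across the array elements shifts the direction in which their fields add coherently. A minimal sketch for a uniform linear array; the element count, spacing and steering angle are illustrative assumptions, not GSM-specific values:

```python
import math

def steering_phases(n_elements, spacing_wl, theta_deg):
    """Per-element phase shifts (radians) that steer a uniform linear
    array's main beam to angle theta_deg from broadside.

    spacing_wl: element spacing in wavelengths (e.g. 0.5).
    """
    theta = math.radians(theta_deg)
    return [-2 * math.pi * n * spacing_wl * math.sin(theta)
            for n in range(n_elements)]

def array_factor(phases, spacing_wl, theta_deg):
    """Magnitude of the array factor in direction theta_deg."""
    theta = math.radians(theta_deg)
    re = sum(math.cos(2 * math.pi * n * spacing_wl * math.sin(theta) + p)
             for n, p in enumerate(phases))
    im = sum(math.sin(2 * math.pi * n * spacing_wl * math.sin(theta) + p)
             for n, p in enumerate(phases))
    return math.hypot(re, im)

phases = steering_phases(8, 0.5, 30.0)  # steer an 8-element array to 30 deg
print(array_factor(phases, 0.5, 30.0))  # 8.0: all elements add in phase
```

Evaluating the same phases in another direction (say -30 degrees) gives a much smaller array factor, which is why the beam can be repointed purely electronically, with no moving parts.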

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND AND PURPOSE: The ASTRAL score was recently introduced as a prognostic tool for acute ischemic stroke. It predicts 3-month outcome reliably in both the derivation and the validation European cohorts. We aimed to validate the ASTRAL score in a Chinese stroke population and, moreover, to explore its prognostic value in predicting 12-month outcome. METHODS: We applied the ASTRAL score to acute ischemic stroke patients admitted to 132 study sites of the China National Stroke Registry. Unfavorable outcome was assessed as a modified Rankin Scale score >2 at 3 and 12 months. Areas under the curve were calculated to quantify the prognostic value. Calibration was assessed by comparing predicted and observed probabilities of unfavorable outcome using the Pearson correlation coefficient. RESULTS: Among 3755 patients, 1473 (39.7%) had a 3-month unfavorable outcome. Areas under the curve for 3 and 12 months were 0.82 and 0.81, respectively. There was high correlation between observed and expected probabilities of unfavorable 3- and 12-month outcome (Pearson correlation coefficients: 0.964 and 0.963, respectively). CONCLUSIONS: The ASTRAL score is a reliable tool to predict unfavorable outcome at 3 and 12 months after acute ischemic stroke in the Chinese population. It is a useful tool that can be readily applied in clinical practice to risk-stratify acute stroke patients.
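The two statistics used in this validation can be computed directly: discrimination as the area under the ROC curve (equivalently, the probability that a randomly chosen patient with unfavorable outcome scores higher than one with favorable outcome), and calibration as the Pearson correlation between predicted and observed probabilities. A minimal sketch with invented data, not registry data:

```python
# Toy illustration of AUC and Pearson correlation; the scores and
# labels below are invented, not data from the registry.

def auc(scores, labels):
    """AUC via the Mann-Whitney interpretation: the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative case (ties count 0.5)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

scores = [31, 22, 27, 15, 35, 18, 29, 12]  # hypothetical ASTRAL scores
labels = [1, 0, 1, 0, 1, 0, 1, 0]          # 1 = unfavorable outcome
print(auc(scores, labels))                 # 1.0: perfect separation here
```

With real cohort data the same `auc` call would yield values such as the 0.82 reported above, and `pearson` applied to binned predicted-versus-observed probabilities would give the calibration coefficient.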

Relevance: 20.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict the dynamics of large populations, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex) - which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century - was used as the paradigm species.
This task was achieved in three steps. A local population dynamics model was first developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package - named SIM-Ibex - allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors, but also by dispersal and the colonisation of new areas. Modelling of habitat suitability and dispersal obstacles therefore had to be addressed, and a software package - named Biomapper - was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and results validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability assessment method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each one characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, population density.
The latter varies according to local reproduction/survival and dispersal dynamics, modified by density-dependence and stochasticity. A software package - named HexaSpace - was developed to achieve two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper), and (2) running simulations. It allows the study of the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. In the same way, whilst HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software tools were designed to proceed from low-level data to build a complex, realistic model, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
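The local dynamics component described here - a Leslie-matrix-style age-structured projection with added density dependence - can be sketched in a few lines. The age classes, vital rates and carrying capacity below are invented for illustration; they are not SIM-Ibex parameters:

```python
# Toy density-dependent, age-structured projection of the kind that
# underlies a Leslie-matrix population model. All rates are invented.

def project(pop, fecundity, survival, carrying_capacity):
    """Advance the population one time step.

    Births enter class 0; survivors advance one age class. Fecundity is
    scaled by a logistic density-dependence factor so that growth slows
    as the total population approaches the carrying capacity.
    """
    total = sum(pop)
    dd = max(0.0, 1.0 - total / carrying_capacity)  # density dependence
    births = sum(f * n for f, n in zip(fecundity, pop)) * dd
    survivors = [s * n for s, n in zip(survival, pop[:-1])]
    return [births] + survivors

pop = [40.0, 30.0, 20.0]     # age classes: juvenile, subadult, adult
fecundity = [0.0, 0.4, 0.8]  # offspring per individual, per class
survival = [0.7, 0.85]       # survival rate into the next age class
for _ in range(5):           # project five time steps
    pop = project(pop, fecundity, survival, 500.0)
print(pop)
```

A spatial model like HexaSpace then runs a projection of this kind in every cell, with dispersal moving individuals between adjacent cells according to the impermeability rates.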

Relevance: 20.00%

Publisher:

Abstract:

Virtually all colon cancers have mutations in the Wnt signalling pathway which result in the constitutive activation of the pathway. This activation leads to the stabilization of β-catenin, which enters the nucleus and activates its target genes through interaction with the TCF transcription factor. Selectively replicating adenoviruses are promising novel agents that can destroy the tumour but not the surrounding normal tissue. In vitro, adenoviruses with TCF binding sites in the early viral promoters show selectivity and activity in a broad panel of colon cancer cell lines, but in vivo they are less effective because of hepatic uptake and the lack of expression of the Coxsackie-Adenovirus receptor (CAR). The aim of my thesis was to modify the major capsid protein of the adenovirus, fibre, to increase the infection of colon tumours.
The adenovirus fibre is responsible for binding to cells and for viral uptake. I inserted an RGD-binding peptide into the HI loop of fibre that selectively targets the virus to integrins, which are overexpressed on tumour cells and on tumour endothelium. The retargeted virus, vKH6, showed increased activity in all colon cancer cell lines while selectivity was maintained. In vivo, vKH6 is superior to a matched virus with a wild-type capsid in delaying tumour growth. vKH6 replicates and gradually spreads within the tumour, as shown by in situ hybridization and Q-PCR. The virus alone can delay the growth of SW620 xenografts by 2 weeks, but owing to uninfected tumour regions the tumour cannot be cured. Although combination with conventional chemotherapeutics is of potential interest, almost all of them interfere with viral replication. Growing evidence indicates that anti-angiogenic drugs are effective and promising anti-tumour agents, and these drugs interfere less with the viral life cycle. RAD001 is a rapamycin derivative that blocks mTOR, a protein kinase in the PI3K pathway. RAD001 inhibits cell growth and has strong anti-angiogenic and immunosuppressive effects. In vitro, RAD001 does not affect viral gene expression or viral burst size. In vivo, vKH6 and RAD001 have an additive effect in delaying tumour growth, but tumour growth is still not completely inhibited. To further increase tumour infection, new tumour-specific targeting peptides are needed. I created an adenovirus display library that will allow the selection of targeting peptides. This system may also facilitate the production of fibre-modified viruses.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

The patent system was created to promote innovation by granting inventors a legally defined right to exclude others in return for public disclosure. Today, patents are being applied for and granted in greater numbers than ever, particularly in new areas such as biotechnology and information and communications technology (ICT), in which research and development (R&D) investments are also high. At the same time, the patent system has been heavily criticized. It has been claimed that it discourages rather than encourages the introduction of new products and processes, particularly in areas that develop quickly, lack a one-product-one-patent correlation, and in which the emergence of patent thickets is characteristic. A further concern, which is particularly acute in the U.S., is the granting of so-called 'bad patents', i.e. patents that do not factually fulfil the patentability criteria. From the perspective of technology-intensive companies, patents could, irrespective of the above, be described as the most significant intellectual property right (IPR), having the potential to be used to protect products and processes from imitation, to limit competitors' freedom-to-operate, to provide such freedom to the company in question, and to exchange ideas with others. In fact, patents define the boundaries of ownership in relation to certain technologies. They may be sold or licensed on their own, or they may be components of all sorts of technology acquisition and licensing arrangements. Moreover, with the possibility of patenting business-method inventions in the U.S., patents are becoming increasingly important for companies basing their businesses on services. The value of a patent depends on the value of the invention it claims and on how that invention is commercialized. Thus, most patents are worth very little, and most inventions are not worth patenting: it may be possible to protect them in other ways, and the costs of protection may exceed the benefits.
Moreover, instead of making all inventions proprietary and seeking to appropriate as high returns on investments as possible through patent enforcement, it is sometimes better to allow some of them to be disseminated freely in order to maximize market penetration. In fact, the ideology of openness is well established in the software sector, which has been the breeding ground for the open-source movement, for instance. Furthermore, industries, such as ICT, that benefit from network effects do not shun the idea of setting open standards or opening up their proprietary interfaces to allow everyone to design products and services that are interoperable with theirs. The problem is that even though patents do not, strictly speaking, prevent access to protected technologies, they have the potential of doing so, and conflicts of interest are not rare. The primary aim of this dissertation is to increase understanding of the dynamics and controversies of the U.S. and European patent systems, with the focus on the ICT sector. The study consists of three parts. The first part introduces the research topic and the overall results of the dissertation. The second part comprises a publication in which academic, political, legal and business developments that concern software and business-method patents are investigated, and contentious areas are identified. The third part examines the problems with patents and open standards, both of which carry significant economic weight in the ICT sector. Here, the focus is on so-called submarine patents, i.e. patents that remain unnoticed during the standardization process and then emerge after the standard has been set. The factors that contribute to the problems are documented, and the practical and juridical options for alleviating them are assessed. In total, the dissertation provides a good overview of the challenges and pressures for change the patent system is facing, and of how these challenges are reflected in standard setting.

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Fatigue life assessment of welded structures is commonly based on the nominal stress method, but more flexible and accurate methods have been introduced. In general, the assessment accuracy is improved as more localized information about the weld is incorporated. The structural hot spot stress method includes the influence of macro-geometric effects and structural discontinuities on the design stress but excludes the local features of the weld. In this thesis, the limitations of the structural hot spot stress method are discussed, and a modified structural stress method with improved accuracy is developed and verified for selected welded details. The fatigue life of structures in the as-welded state consists mainly of crack growth from pre-existing cracks or defects. Crack growth rate depends on crack geometry and the stress state on the crack face plane. This means that the stress level and the shape of the stress distribution along the assumed crack path govern the total fatigue life. In many structural details the stress distribution is similar, and adequate fatigue life estimates can be obtained just by adjusting the stress level based on a single stress value, i.e., the structural hot spot stress. There are, however, cases for which the structural stress approach is less appropriate because the stress distribution differs significantly from the more common cases. Plate edge attachments and plates on elastic foundations are some examples of structures with this type of stress distribution. The importance of fillet weld size and weld load variation on the stress distribution is another central topic in this thesis. Structural hot spot stress determination is generally based on a procedure that involves extrapolation of plate surface stresses.
Other possibilities for determining the structural hot spot stress are to extrapolate stresses through the thickness at the weld toe or to use Dong's method, which includes through-thickness extrapolation at some distance from the weld toe. Both of these latter methods are less sensitive to the FE mesh used. Structural stress based on surface extrapolation is sensitive to the extrapolation points selected and to the FE mesh used near these points. Rules for proper meshing, however, are well defined and not difficult to apply. To improve the accuracy of the traditional structural hot spot stress, a multi-linear stress distribution is introduced. The magnitude of the weld toe stress after linearization depends on the weld size, weld load and plate thickness. Simple equations have been derived by comparing assessment results based on the local linear stress distribution with LEFM-based calculations. The proposed method is called the modified structural stress method (MSHS), since the structural hot spot stress (SHS) value is corrected using information on weld size and weld load. The correction procedure is verified using fatigue test results found in the literature. A test case was also conducted comparing the proposed method with other local fatigue assessment methods.
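The surface-extrapolation procedure referred to in this abstract is commonly specified as a linear extrapolation of plate surface stresses to the weld toe. A minimal sketch is given below; note that the 0.4t/1.0t read-out points and the 1.67/−0.67 coefficients are the standard IIW fine-mesh rule, not values taken from this thesis, and the example stresses are purely illustrative.

```python
def hot_spot_stress(sigma_04t: float, sigma_10t: float) -> float:
    """Structural hot spot stress by linear surface extrapolation.

    Surface stresses are read at 0.4t and 1.0t from the weld toe
    (t = plate thickness) and extrapolated linearly to the toe:
        sigma_hs = 1.67 * sigma(0.4t) - 0.67 * sigma(1.0t)
    """
    return 1.67 * sigma_04t - 0.67 * sigma_10t


# Illustrative read-out: 120 MPa at 0.4t, 100 MPa at 1.0t
print(hot_spot_stress(120.0, 100.0))  # ~133.4 MPa
```

The extrapolation amplifies the stress gradient towards the toe, which is why the result is sensitive to where the two read-out points sit relative to the FE mesh.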

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Membrane filtration has become increasingly attractive in the processing of both food and biotechnological products. However, the poor selectivity of the membranes and fouling are the critical factors limiting the development of UF systems for the specific fractionation of protein mixtures. This thesis gives an overview of the fractionation of proteins from model protein solutions and from biological solutions. An attempt was made to improve the selectivity of the available membranes by modifying the membranes and by exploiting the different electrostatic interactions between the proteins and the membrane pore surfaces. The fractionation and UF behavior of proteins in the model solutions and in the corresponding biological solutions were compared. Characterization of the membranes and protein adsorption to the membrane were investigated with combined flux and streaming potential studies. It has been shown that fouling of the membranes can be reduced using "self-rejecting" membranes at pH values where electrostatic repulsion is achieved between the membrane and the proteins in solution. This effect is best shown in UF of dilute single-protein solutions at low ionic strengths and low pressures. Fractionation of model proteins in single, binary, and ternary solutions has been carried out. The results have been compared to the results obtained from fractionation of biological solutions. It was generally observed that fractionation of proteins from biological solutions is more difficult to carry out owing to the presence of unstudied protein components with different properties. It can generally be concluded that it is easier to enrich the smaller protein in the permeate, but it is also possible to enrich the larger protein in the permeate at pH values close to the isoelectric point of the protein. It should be possible to find an optimal flux and modification to effectively improve the fractionation of proteins even with very similar molar masses.
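Fractionation results of the kind described above are conventionally quantified with the observed rejection of each protein and a transmission-based separation factor. A minimal sketch, with purely illustrative numbers that do not come from the thesis:

```python
def observed_rejection(c_feed: float, c_permeate: float) -> float:
    """Observed rejection R = 1 - Cp/Cf.

    R = 0 means the protein passes the membrane freely;
    R = 1 means it is fully retained.
    """
    return 1.0 - c_permeate / c_feed


def separation_factor(transmission_a: float, transmission_b: float) -> float:
    """Ratio of observed transmissions (1 - R) of two proteins.

    Larger values indicate a sharper fractionation of protein A
    into the permeate relative to protein B.
    """
    return transmission_a / transmission_b


# Illustrative: a smaller protein transmitted at 0.8, a larger one at 0.1
t_small = 1.0 - observed_rejection(1.0, 0.8)
t_large = 1.0 - observed_rejection(1.0, 0.1)
print(separation_factor(t_small, t_large))  # ~8
```

Enriching the larger protein in the permeate, as observed near its isoelectric point, corresponds to driving this factor below 1.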

Relevância:

20.00% 20.00%

Publicador:

Resumo:

Reference is made to the assessment work of the forensic psychologist based on psychometric instruments in clinical use and on specific techniques for measuring cognitive abilities and personality. In contrast to the large number of assessment instruments found in the English-language literature, in our setting there are hardly any valid and reliable psychological assessment techniques that can be applied by the forensic psychologist, apart from those routinely used in clinical psychology. Data are provided from several questionnaires for the assessment of disinhibited and antisocial personality, obtained from Spanish samples of offenders and non-offenders; in light of the theories that underpin them, these can serve as a reference for the assessment of offenders for forensic expert purposes.