882 results for further steps


Relevance:

20.00%

Publisher:

Abstract:

PURPOSE We have previously shown that retinal stem cells (RSCs) can be isolated from the radial glia population of the newborn mouse retina (Angénieux et al., 2006). These RSCs have a great capacity for renewal and generate a large number of neurons, including cells differentiated towards the photoreceptor lineage (Mehri-Soussi et al., 2006). However, recently published results from our lab revealed that such cells have a poor integration and survival rate after grafting. The uncontrolled environment of the retina seems to prevent good integration and survival after grafting in vivo. To bypass this problem, we are evaluating the possibility of generating a hemi-retinal tissue in vitro before transplantation. METHODS RSCs were expanded, and cells at fewer than 10 passages were seeded in a solution containing poly(ethylene glycol) (PEG) polymer-based hydrogels crosslinked with peptides chosen to be substrates for matrix metalloproteinases. Various doses of the crosslinker peptides connecting the PEG polymers were tested. Different growth factors were studied to stimulate cell proliferation and differentiation. RESULTS Cells survived only in the presence of EGF and FGF-2 and generated sphere-shaped colonies. No cells migrated within the gel. To improve the migration and distribution of the cells in the gels, the integrin ligand RGDSP was added to the gel. In the presence of FGF-2 and EGF, newly formed cell clusters appeared by cell proliferation within several days, but again no outspreading of cells was observed. No difference was seen even when the stiffness of the hydrogels or the concentration of the integrin ligand RGDSP was changed. However, our preliminary results show that RSCs still form spheres when laminin is entrapped in the gel, but the cells started to spread out with a neuronal morphology after around 2 weeks. The neuronal population was assessed by the presence of the neuronal marker β-tubulin III. This differentiation was achieved after successive stimulation steps, first with FGF-2 and EGF and then with FGF-2 alone. Glial cells were also present. Further characterizations are in progress. CONCLUSIONS RSCs can be grown in 3D. Preliminary results show that neuronal phenotype acquisition can be instructed by exogenous stimulation and by factors linked to the gel. Further developments are necessary to form a homogeneous tissue containing retinal cells.

Relevance:

20.00%

Publisher:

Abstract:

Summary: The effects on the EU sugar market of the import tariff reduction options proposed in the WTO trade negotiations.

Relevance:

20.00%

Publisher:

Abstract:

To complement the existing treatment guidelines for all tumour types, ESMO organises consensus conferences to focus on specific issues in each tumour type. The 2nd ESMO Consensus Conference on Lung Cancer was held on 11-12 May 2013 in Lugano. A total of 35 experts met to address several questions on non-small-cell lung cancer (NSCLC) in each of four areas: pathology and molecular biomarkers; first-line, second and further lines of treatment in advanced disease; early-stage disease; and locally advanced disease. For each question, recommendations were made, including reference to the grade of recommendation and level of evidence. This consensus paper focuses on first-line, second and further lines of treatment in advanced disease.

Relevance:

20.00%

Publisher:

Abstract:

The flourishing number of publications on the use of isotope ratio mass spectrometry (IRMS) in forensic science denotes the enthusiasm and the attraction generated by this technology. IRMS has demonstrated its potential to distinguish chemically identical compounds coming from different sources. Despite the numerous applications of IRMS to a wide range of forensic materials, its implementation in a forensic framework is less straightforward than it appears. In addition, each laboratory has developed its own strategy of analysis regarding calibration, sequence design, standards utilisation and data treatment, without a clear consensus.

Through the experience acquired from research undertaken in different forensic fields, we propose a methodological framework for the whole process of using IRMS methods. We emphasize the importance of considering isotopic results as part of a whole approach when applying this technology to a particular forensic issue. The process is divided into six different steps, which should be considered for a thoughtful and relevant application. The dissection of this process into fundamental steps, further detailed, enables a better understanding of the essential, though not exhaustive, factors that have to be considered in order to obtain results of quality, sufficiently robust to proceed to retrospective analyses or interlaboratory comparisons.
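At the heart of the IRMS data treatment and calibration discussed above is the delta notation, which expresses a measured isotope ratio relative to an international standard. The sketch below is illustrative only (the sample ratio is invented; the VPDB value is a commonly cited reference ratio, not a number from this paper):

```python
# Illustrative sketch: converting measured isotope ratios to per-mil
# delta values, the standard quantity reported in IRMS work.

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000, in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Commonly cited 13C/12C ratio of the VPDB reference standard
R_VPDB = 0.0111802

r_sample = 0.0108950  # invented measured 13C/12C ratio
d13c = delta_per_mil(r_sample, R_VPDB)
print(f"d13C = {d13c:.2f} per mil")
```

In practice, laboratories normalise raw delta values against two or more certified reference materials bracketing the sample, which is exactly the calibration and sequence-design step on which the abstract notes a lack of consensus.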

Relevance:

20.00%

Publisher:

Abstract:

A final degree project (TFC) is one of the last steps of many study programmes, such as engineering degrees. Students usually present the results of their final project in a public presentation, where a committee evaluates their work. In this activity, students exercise competences such as giving effective oral presentations in public settings, in a stressful situation. But can this competence be acquired in a virtual environment? At the Universitat Oberta de Catalunya (UOC), a 100% virtual university, a solution is proposed based on video presentations uploaded to a tool, Present@, which lets students share their presentations, ask questions and write answers in an open environment. Present@ offers an improved video-upload tool that simplifies the process and makes it suitable for students from any field of knowledge. Working with video, however, goes a step further and also requires the right technology to support this type of file. A streaming service, Kaltura, has therefore been added to Present@. With this service, students can easily upload almost any video format, and video comments are possible. 131 students have used Present@ over 4 semesters. To evaluate the tool, the students answered a questionnaire; from the responses received, it is concluded that this approach allows online students to acquire most of the competences related to the TFC and, in particular, to the dissertation defence in virtual environments. Perhaps the only aspect they do not face is the tension of the questions. It is worth noting that, thanks to the improvements introduced, Present@ is currently used in more than 100 classrooms at the UOC, not only for the TFC but also to explain course material and help students.

Relevance:

20.00%

Publisher:

Abstract:

The activities carried out by Ubuntu from 2000 to 2002 are described.

Relevance:

20.00%

Publisher:

Abstract:

A highly sensitive ultra-high performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS) method was developed for the quantification of buprenorphine and its major metabolite norbuprenorphine in human plasma. In order to speed up the process and decrease costs, sample preparation was performed by simple protein precipitation with acetonitrile. To the best of our knowledge, this is the first application of this extraction technique to the quantification of buprenorphine in plasma. Matrix effects were strongly reduced and selectivity increased by using an efficient chromatographic separation on a sub-2 μm column (Acquity UPLC BEH C18, 1.7 μm, 2.1 × 50 mm) in 5 min, with a gradient of 20 mM ammonium formate pH 3.05 and acetonitrile as mobile phase at a flow rate of 0.4 ml/min. Detection was performed with a tandem quadrupole mass spectrometer operating in positive electrospray ionization mode, using multiple reaction monitoring. The procedure was fully validated according to the latest Food and Drug Administration guidelines and those of the Société Française des Sciences et Techniques Pharmaceutiques. Very good results were obtained by using a stable isotope-labeled internal standard for each analyte to compensate for the variability due to the extraction and ionization steps. The method was very sensitive, with lower limits of quantification of 0.1 ng/ml for buprenorphine and 0.25 ng/ml for norbuprenorphine. The upper limit of quantification was 250 ng/ml for both drugs. Trueness (98.4-113.7%), repeatability (1.9-7.7%), intermediate precision (2.6-7.9%) and internal standard-normalized matrix effects (94-101%) were in accordance with international recommendations. The procedure was successfully used to quantify plasma samples from patients included in a clinical pharmacogenetic study and can be transferred for routine therapeutic drug monitoring in clinical laboratories without further development.
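The validation figures quoted above (trueness, repeatability) come down to simple recovery and relative-standard-deviation arithmetic over replicate measurements. A minimal sketch, with invented replicate concentrations rather than the study's data:

```python
# Illustrative validation arithmetic: trueness as recovery (%) against a
# nominal concentration, and repeatability as the coefficient of
# variation (%) of replicates. Values below are invented.
import statistics

def trueness_pct(measured: list[float], nominal: float) -> float:
    """Mean measured concentration as a percentage of the nominal value."""
    return 100.0 * statistics.mean(measured) / nominal

def cv_pct(measured: list[float]) -> float:
    """Repeatability expressed as relative standard deviation (%)."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

# Invented replicates at a 0.25 ng/ml quality-control level
replicates = [0.247, 0.256, 0.251, 0.243, 0.259]
print(f"trueness: {trueness_pct(replicates, 0.25):.1f}%")
print(f"repeatability (CV): {cv_pct(replicates):.1f}%")
```

The reported intermediate precision is the same CV computed over runs performed on different days, and the internal-standard normalisation mentioned above divides each analyte response by its labeled-standard response before these statistics are taken.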

Relevance:

20.00%

Publisher:

Abstract:

The importance of envisioning the future with the help of weak signals has grown considerably in recent years, because changes in a company's business environment have become increasingly difficult to predict from history. Signs of many changes have been visible in the business environment, but they have been hard to detect. By identifying and collecting weak signals, and by reacting to the situation early enough, it is possible to achieve superior competitive advantage. The literature review focuses on the challenges of identifying weak signals in the business environment, the evolution of signals and information, and information management in the organisation. The interest in these topics stems from the need to define the process required for identifying weak signals, so that weak signals can be taken into account in decision-making at M-real Oyj. The literature review clearly shows that weak signals exist and can be identified in the business environment. Signals can be enriched with the knowledge available in the company and further exploited in decision-making. Comparing the literature review with the empirical study clearly revealed the diversity of information: its quantity, its quality and the timeliness of its availability for decision-making. During the study, a process model was developed for filtering and classifying information and identifying weak signals. As the work progressed, the process model became part of the 'Weak Signal Capturing' tool developed in this thesis. By replicating the tool, weak signals can be collected from different areas of M-real's business. By systematically compiling this information, the future can be mapped for M-real as a whole.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The thiomethyl group of S-adenosylmethionine is often recycled as methionine from methylthioadenosine. The corresponding pathway has been unravelled in Bacillus subtilis. However, methylthioadenosine is subject to alternative degradative pathways depending on the organism. RESULTS: This work uses in silico genome analysis to propose methionine salvage pathways for Klebsiella pneumoniae, Leptospira interrogans, Thermoanaerobacter tengcongensis and Xylella fastidiosa. Experiments performed with mutants of B. subtilis and Pseudomonas aeruginosa substantiate the hypotheses proposed. The enzymes that catalyze the reactions are recruited from a variety of origins. The first, ubiquitous, enzyme of the pathway, MtnA (methylthioribose-1-phosphate isomerase), belongs to a family of proteins related to eukaryotic initiation factor 2B alpha. mtnB codes for a methylthioribulose-1-phosphate dehydratase. Two reactions follow: that of an enolase and that of a phosphatase. While in B. subtilis these are performed by two distinct polypeptides, in the other organisms analyzed here an enolase-phosphatase yields 1,2-dihydroxy-3-keto-5-methylthiopentene. In the presence of dioxygen, an aci-reductone dioxygenase yields the immediate precursor of methionine, ketomethylthiobutyrate. Under some conditions this enzyme produces carbon monoxide in B. subtilis, suggesting a route for a new gaseous mediator in bacteria. Ketomethylthiobutyrate is finally transaminated by an aminotransferase that usually exists as a broad-specificity enzyme (often able to transaminate aromatic amino acid keto-acid precursors or histidinol-phosphate). CONCLUSION: A functional methionine salvage pathway was experimentally demonstrated, for the first time, in P. aeruginosa. Apparently, methionine salvage pathways are frequent in Bacteria (and in Eukarya), with recruitment of different polypeptides to perform the needed reactions (an ancestor of a translation initiation factor, and RuBisCO as an enolase in some Firmicutes). Many are highly dependent on the presence of oxygen, suggesting that the ecological niche may play an important role in the existence and/or metabolic steps of the pathway, even in phylogenetically related bacteria. Further work is needed to uncover the corresponding steps when dioxygen is scarce or absent (this is important to explore the presence of the pathway in Archaea). The thermophile T. tengcongensis, which thrives in the absence of oxygen, appears to possess the pathway. It will be an interesting link for uncovering the missing reactions in anaerobic environments.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this thesis is to produce a new working method for a management consulting company, with which it can lead development projects that improve the industrial service processes of client companies. The process improvements should deliver clear benefits to the customers, as well as to the service provider's personnel and management, soon after the modernised processes have been successfully taken into use. The creation of the method starts with a literature review covering topics such as services, industrial services and business process re-engineering. The requirements for creating the method are defined. The client project in which the method is piloted is presented. The method itself is then presented: it is a top-down guide for the leader of the development process. The main goals were set first, followed by the sub-goals that support them. Work instructions were created so that achieving the goals would become possible. At the same time, tools supporting the method were developed. The preliminary work instructions and tools were refined into their current form during the pilot use of the method. The quality of the method is evaluated after the pilot by comparing the goals set with the results achieved. Further development measures for the method, to be carried out after completion, are presented.

Relevance:

20.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need human management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as computer tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps.

First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies.

However, population dynamics is driven not only by demographic factors but also by dispersal and colonisation of new areas. Habitat suitability and obstacle modelling therefore had to be addressed. Thus, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, results validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The ENFA application domain was then explored by means of a simulated species distribution. It was compared to a commonly used habitat suitability assessment method, the Generalised Linear Model (GLM), and proved better suited for spreading or cryptic species.

Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction/survival and dispersal dynamics, modified by density-dependence and stochasticity. A software package named HexaSpace was developed, which achieves two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); and (2) running simulations. It allows the study of the spread of an invading species across a complex landscape made of variously suitable areas and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland).

SIM-Ibex is now used by wildlife managers and governmental inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to proceed from raw data to a complex, realistic model, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
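The age-structured core described for SIM-Ibex can be sketched as a Leslie matrix projection with a simple density-dependent brake on fecundity. The snippet below is a hypothetical illustration only, not SIM-Ibex code: the parameter values are invented, and environmental stochasticity and culling are omitted for brevity.

```python
# Minimal sketch of an age-structured Leslie matrix projection with
# density-dependent fecundity. All parameter values are invented.
import numpy as np

fecundity = np.array([0.0, 0.4, 0.8, 0.6])  # offspring per individual, by age class
survival = np.array([0.6, 0.8, 0.7])        # survival rate to the next age class
K = 500.0                                    # carrying capacity (illustrative)

def leslie(fec: np.ndarray, surv: np.ndarray) -> np.ndarray:
    """Build a Leslie matrix: fecundities on the first row, survival on the subdiagonal."""
    n = len(fec)
    L = np.zeros((n, n))
    L[0, :] = fec
    L[np.arange(1, n), np.arange(n - 1)] = surv
    return L

def project(pop: np.ndarray, years: int) -> np.ndarray:
    """Project the age vector forward, damping births as total density nears K."""
    for _ in range(years):
        density_factor = max(0.0, 1.0 - pop.sum() / K)  # logistic-style brake
        L = leslie(fecundity * density_factor, survival)
        pop = L @ pop
    return pop

pop0 = np.array([50.0, 30.0, 20.0, 10.0])  # initial counts per age class
print(project(pop0, 25).round(1))
```

A stochastic version would draw the survival and fecundity rates from distributions each year, and culling would subtract a managed number of individuals from chosen age and sex classes after each projection step.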

Relevance:

20.00%

Publisher:

Abstract:

The markets for biomass for energy are developing rapidly and becoming more international. A remarkable increase in the use of biomass for energy needs parallel, positive development in several areas, and there will be plenty of challenges to overcome. The main objective of the study was to clarify alternative future scenarios for the international biomass market until the year 2020 and, based on the scenario process, to identify the underlying steps needed towards a vital, working and sustainable biomass market for energy purposes. Two scenario processes were conducted for this study. The first was carried out with a group of Finnish experts and the second involved an international group. A heuristic, semi-structured approach, including the use of preliminary questionnaires as well as manual and computerised group support systems (GSS), was applied in the scenario processes. The scenario processes reinforced the picture of the future of the international biomass and bioenergy markets as a complex and multi-layered subject. The scenarios estimated that the biomass market will develop, grow rapidly and diversify in the future. The results of the scenario process also opened up new discussion and provided new information and collective expert views for the purposes of policy makers. An overall conclusion of this scenario analysis is the enormous opportunity related to the utilisation of biomass as a resource for global energy use in the coming decades. The scenario analysis shows the key issues in the field: global economic growth, including the growing need for energy; environmental forces in global evolution; the possibilities of technological development to solve global problems; the capabilities of the international community to find solutions for global issues; and the complex interdependencies of all these driving forces. The results of the scenario processes provide a starting point for further research analysing the technological and commercial aspects related to the scenarios and foreseeing the scales and directions of biomass streams.

Relevance:

20.00%

Publisher:

Abstract:

Life cycle analysis (LCA) is a comprehensive method for assessing the environmental impact of a product or an activity over its entire life cycle. The purpose of conducting LCA studies varies from one application to another. In general, the main aim of using LCA is to reduce the environmental impact of products by guiding the decision-making process towards more sustainable solutions. The most critical phase in an LCA study is the Life Cycle Impact Assessment (LCIA), where the life cycle inventory (LCI) results for the substances considered in the study of a certain system are transformed into understandable impact categories that represent the impact on the environment. In this research work, a general structure clarifying the steps to be followed in order to conduct an LCA study effectively is presented. These steps are based on the ISO 14040 standard framework. In addition, a survey of the most widely used LCIA methodologies is provided. Recommendations about possible developments and suggestions for further research on the use of LCA and LCIA methodologies are discussed as well.
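The LCIA step described above (transforming LCI results into impact category indicators) amounts to weighting each inventory flow by a characterisation factor and summing per category. A minimal sketch, using illustrative GWP-style factors rather than values prescribed by ISO 14040 or any particular LCIA methodology:

```python
# Illustrative characterisation step of an LCIA: inventory amounts are
# multiplied by characterisation factors and summed into one impact
# category indicator (here a GWP-style kg CO2-equivalent). All numbers
# are illustrative.

# Inventory: emitted mass in kg per functional unit
inventory = {"CO2": 120.0, "CH4": 0.8, "N2O": 0.05}

# Characterisation factors: kg CO2-eq per kg of substance (illustrative)
gwp_factors = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

def characterise(lci: dict[str, float], factors: dict[str, float]) -> float:
    """Sum of amount * factor over substances with a known factor."""
    return sum(amount * factors.get(substance, 0.0)
               for substance, amount in lci.items())

gwp = characterise(inventory, gwp_factors)
print(f"GWP: {gwp:.2f} kg CO2-eq per functional unit")
```

A full LCIA repeats this calculation for each impact category (acidification, eutrophication, and so on), each with its own factor table, which is where the surveyed LCIA methodologies differ.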