946 results for Towards Seamless Integration of Geoscience Models and Data


Relevance: 100.00%

Abstract:

Iododeoxyuridine (IdUrd), labelled with 123I or 125I, is a potential Auger-radiation therapy agent. However, limitations restrict its DNA incorporation in proliferating cells. Fluorodeoxyuridine (FdUrd), which favours the incorporation of thymidine analogues, has therefore been studied by several groups as a way to increase radio-IdUrd DNA incorporation, but an increase in therapeutic efficacy could not be reached. In our approach, three human glioblastoma cell lines with different p53 expression and one ovarian cancer line were pre-treated under various FdUrd conditions. We observed a high percentage of cells accumulating in early S phase 16 to 24 h after a short, non-toxic FdUrd pre-treatment. More than an accumulation, this was a synchronization: the cells remained able to incorporate radio-IdUrd and re-entered the cell cycle. Furthermore, the S-phase cells accumulated after FdUrd pre-treatment were more radiosensitive. After the same delay of 16 to 24 h post FdUrd pre-treatment, all four cell lines incorporated higher amounts of radio-IdUrd than untreated cells. A time correlation between S-phase accumulation and high radio-IdUrd incorporation was thus revealed 16 to 24 h after FdUrd pre-treatment. Auger radiation treatment experiments performed on S-phase-enriched cells showed a significantly higher killing efficacy of 125I-IdUrd compared with cells not pre-treated with FdUrd. A first estimation further indicates that about 100 125I decays per cell were required to kill the targeted cells. Moreover, p53 might play a role in the direct induction of cell-death pathways after Auger radiation treatment, as indicated by the differential apoptosis and necrosis induction measured by FACS 24 and 48 h after treatment initiation. Concerning the in vivo results, we observed a marked increase in DNA incorporation of radio-IdUrd after FdUrd pre-treatment in an ovarian peritoneal carcinomatosis model in SCID mice. An even higher increase was observed after intra-tumoural injection of radio-IdUrd into subcutaneous glioblastoma transplants in nude mice. These tumour models may be useful for further studies of radio-IdUrd diffusion and Auger radiation therapy. In conclusion, these data show a first successful application of thymidine-synthesis inhibition to increase the efficacy of radio-IdUrd Auger radiation treatment. The S-phase synchronization, combined with the high percentage of radio-IdUrd DNA incorporation delayed after FdUrd pre-treatment, provided the expected therapeutic gain in vitro. Further in vivo studies are indicated following the encouraging radio-IdUrd uptake observed in glioblastoma subcutaneous xenografts and in an ovarian peritoneal carcinomatosis model.

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded spectacularly since its reintroduction in Switzerland at the beginning of the 20th century, was used as the example species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix, extended with density-dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census-data maintenance, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and dispersal obstacles therefore had to be modelled as well, so a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat-suitability map computation, and result validation and processing; one module also maps dispersal barriers and corridors. The application domain of ENFA was then explored by means of a simulated species distribution. Compared with a commonly used habitat-suitability method, the Generalised Linear Model (GLM), ENFA proved better suited to spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of modelling large populations, a cellular-automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, population density. The latter varies according to local reproduction, survival and dispersal, under the influence of density-dependence and stochasticity. A software package named HexaSpace was developed to perform two functions: (1) calibrating the automaton from local population dynamics models (e.g. computed by SIM-Ibex) and a habitat-suitability map (e.g. computed by Biomapper); (2) running simulations. It allows studying the spread of an invading species across a complex landscape made of areas of varying suitability and of dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these tools were designed to build a complex, realistic model from raw data, and as they benefit from intuitive user interfaces, they have many potential applications in conservation biology. Moreover, these approaches can also address theoretical questions in population and landscape ecology.
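The age-structured Leslie-matrix projection with density-dependence described above can be sketched as follows. This is a toy illustration only: the fecundities, survival rates, carrying capacity, and the exact form of density-dependence are invented, not SIM-Ibex's actual parameters.

```python
import numpy as np

# Hypothetical vital rates for four age classes (not Ibex data).
fecundity = np.array([0.0, 0.4, 0.8, 0.8])  # offspring per individual, by class
survival = np.array([0.6, 0.8, 0.9])        # survival into the next age class

L = np.zeros((4, 4))
L[0, :] = fecundity                          # top row: reproduction
L[np.arange(1, 4), np.arange(3)] = survival  # sub-diagonal: ageing

K = 500.0  # assumed carrying capacity

def project(n, years, rng=None):
    """Project abundances; optional lognormal environmental stochasticity."""
    for _ in range(years):
        n = L @ n                            # deterministic Leslie step
        n = n / (1.0 + n.sum() / K)          # assumed density-dependence
        if rng is not None:
            n = n * rng.lognormal(0.0, 0.1, size=n.shape)
    return n

n10 = project(np.array([100.0, 50.0, 30.0, 20.0]), 10)
print(n10.round(1))
```

Culling could be added as a further per-class removal term inside the loop, which is essentially what a regulation scenario in such a model amounts to.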

Abstract:

Automated genome sequencing and annotation, as well as large-scale gene-expression measurement methods, generate a massive amount of data for model organisms. Searching for gene-specific or organism-specific information throughout all the different databases has become a very difficult task, and often results in fragmented and unrelated answers. A database that federates and integrates genomic and transcriptomic data will greatly improve both search speed and the quality of the results by allowing a direct comparison of expression results obtained by different techniques. The main goal of this project, called the CleanEx database, is thus to provide access to public gene-expression data via unique gene names and to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-dataset comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single gene-expression experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving gene catalogues of model organisms, such as human and mouse. The fully automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression-target database of CleanEx provides gene-mapping and quality-control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The Affymetrix mapping files are accessible as text files, for further use in external applications, and as individual entries via the web-based interfaces. The CleanEx web-based query interfaces offer access to individual entries via text-string searches or quantitative expression criteria, as well as cross-dataset analysis tools and cross-chip gene comparison. These tools have proven very efficient in expression-data comparison and even, to a certain extent, in the detection of differentially expressed splice variants. The CleanEx flat files and tools are available online at http://www.cleanex.isb-sib.ch/.

Abstract:

This work deals with the cooling of high-speed electric machines, such as motors and generators, through an air gap. It consists of numerical and experimental modelling of gas flow and heat transfer in an annular channel. Velocity and temperature profiles are modelled in the air gap of a high-speed test machine. Local and mean heat transfer coefficients and total friction coefficients are obtained for a smooth rotor-stator combination over a large velocity range. The aim is to solve the heat transfer numerically and experimentally. The FINFLO software, developed at Helsinki University of Technology, has been used for the flow solution, and the commercial IGG and FieldView programs for grid generation and post-processing. The annular channel is discretized as a sector mesh. Calculation is performed with constant mass flow rate at six rotational speeds. The effect of turbulence is calculated using three turbulence models. The friction coefficient and velocity factor are obtained via the total friction power. The first part of the experimental section consists of finding the proper sensors and calibrating them in a straight pipe. After preliminary tests, an RdF sensor is glued on the stator and rotor surfaces. Telemetry is needed to measure the heat transfer coefficients at the rotor. The mean heat transfer coefficients are measured in a test machine at four cooling-air mass flow rates over a wide Couette Reynolds number range. The calculated friction and heat transfer coefficients are compared with measured and semi-empirical data. Heat is transferred from the hotter stator and rotor surfaces to the cooler air flow in the air gap, not from the rotor to the stator via the air gap, even though the stator temperature is lower than the rotor temperature. The calculated friction coefficients fit well with the semi-empirical equations and preceding measurements. At constant mass flow rate the rotor heat transfer coefficient reaches a saturation point at higher rotational speeds, while the heat transfer coefficient of the stator grows uniformly. The magnitudes of the heat transfer coefficients are almost constant across the different turbulence models. The calibration of sensors in a straight pipe is only an advisory step in the selection process. Telemetry is tested in the pipe conditions and compared with the same measurements made with a plain sensor. For the heat transfer coefficients, the magnitudes of the measured data and of the semi-empirical equation are higher than the numerical data over the velocity range considered. Friction and heat transfer coefficients are presented over a large velocity range in the report. The goals are reached acceptably using numerical and experimental research. The next challenge is to obtain results for grooved stator-rotor combinations. The work also contains results for an air gap with a grooved stator with 36 slots. The velocity field given by the numerical method does not match the estimated flow mode in every respect. The absence of secondary Taylor vortices is evident when using time-averaged numerical simulation.
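The Couette Reynolds number used to characterise the air-gap flow can be illustrated with a back-of-the-envelope computation. The rotor radius, gap width, rotational speed and air viscosity below are assumed round numbers, not the dimensions of the actual test machine.

```python
import math

def couette_reynolds(rpm, rotor_radius_m, gap_m, nu_m2s):
    """Re = u * delta / nu, with u the rotor surface speed and delta the gap."""
    u = 2.0 * math.pi * (rpm / 60.0) * rotor_radius_m  # surface speed, m/s
    return u * gap_m / nu_m2s

# Assumed machine: 30 000 rpm, 50 mm rotor radius, 2 mm gap, air at ~20 degC.
print(round(couette_reynolds(30000, 0.05, 0.002, 1.5e-5)))
```

Sweeping the rotational speed in such a helper is a quick way to see which part of the laminar-to-turbulent Taylor-Couette regime a given operating point falls into.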

Abstract:

PURPOSE: To determine whether a mono-, bi- or tri-exponential model best fits the intravoxel incoherent motion (IVIM) diffusion-weighted imaging (DWI) signal of normal livers. MATERIALS AND METHODS: The pilot and validation studies were conducted in 38 and 36 patients with normal livers, respectively. The DWI sequence was performed using single-shot echo-planar imaging with 11 (pilot study) and 16 (validation study) b values. In each study, data from all patients were used to model the IVIM signal of normal liver. Diffusion coefficients (Di ± standard deviation) and their fractions (fi ± standard deviation) were determined for each model. The models were compared using the extra sum-of-squares test and information criteria. RESULTS: The tri-exponential model provided a better fit than both the bi- and mono-exponential models. The tri-exponential IVIM model determined three diffusion compartments: a slow (D1 = 1.35 ± 0.03 × 10⁻³ mm²/s; f1 = 72.7 ± 0.9 %), a fast (D2 = 26.50 ± 2.49 × 10⁻³ mm²/s; f2 = 13.7 ± 0.6 %) and a very fast (D3 = 404.00 ± 43.7 × 10⁻³ mm²/s; f3 = 13.5 ± 0.8 %) diffusion compartment (results from the validation study). The very fast compartment contributed to the IVIM signal only for b values ≤ 15 s/mm². CONCLUSION: The tri-exponential model provided the best fit for IVIM signal decay in the liver over the 0-800 s/mm² range. In IVIM analysis of normal liver, a third, very fast (pseudo)diffusion component might be relevant. KEY POINTS: • For normal liver, a tri-exponential IVIM model might be superior to a bi-exponential one. • A very fast compartment (D = 404.00 ± 43.7 × 10⁻³ mm²/s; f = 13.5 ± 0.8 %) is determined from the tri-exponential model. • This compartment contributes to the IVIM signal only for b ≤ 15 s/mm².
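A tri-exponential fit of this kind can be sketched with a standard least-squares routine. The "signal" below is synthetic, generated from the reported parameter estimates rather than from real DWI data, and the b-value grid and starting guesses are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def triexp(b, f1, f2, d1, d2, d3):
    """Normalized signal S(b)/S0 for a tri-exponential IVIM model; f3 = 1 - f1 - f2."""
    f3 = 1.0 - f1 - f2
    return f1 * np.exp(-b * d1) + f2 * np.exp(-b * d2) + f3 * np.exp(-b * d3)

# b values (s/mm^2) and a noiseless signal built from the reported estimates.
b = np.array([0, 5, 10, 15, 25, 50, 100, 200, 400, 600, 800], dtype=float)
signal = triexp(b, 0.727, 0.137, 1.35e-3, 26.5e-3, 404e-3)

# Fit the five free parameters back from the decay curve.
popt, _ = curve_fit(triexp, b, signal, p0=(0.7, 0.15, 1e-3, 2e-2, 3e-1))
f1, f2, d1, d2, d3 = popt
print(round(d1 * 1e3, 2))  # slow diffusion coefficient, in 10^-3 mm^2/s
```

Note that resolving the very fast compartment requires several b values below ~25 s/mm², consistent with the observation that it contributes only at b ≤ 15 s/mm².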

Abstract:

We study general models of holographic superconductivity parametrized by four arbitrary functions of a neutral scalar field of the bulk theory. The models can accommodate several features of real superconductors, like arbitrary critical temperatures and critical exponents in a certain range, and perhaps impurities or boundary or thickness effects. We find analytical expressions for the critical exponents of the general model and show that they satisfy the Rushbrooke identity. An important subclass of models exhibits second-order phase transitions. A study of the specific heat shows that general models can also describe holographic superconductors undergoing first-, second- and third- (or higher-) order phase transitions. We discuss how small deformations of the HHH model can lead to the appearance of resonance peaks in the conductivity, which increase in number and become narrower as the temperature is gradually decreased, without the need to tune the mass of the scalar close to the Breitenlohner-Freedman bound. Finally, we investigate the inclusion of a generalized "theta term" producing a Hall effect without a magnetic field.
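For reference, the Rushbrooke identity mentioned above relates the standard critical exponents of the specific heat (α), order parameter (β) and susceptibility (γ); in the simplest mean-field case it is saturated as an equality:

```latex
\alpha + 2\beta + \gamma = 2,
\qquad \text{e.g. mean field: } \alpha = 0,\ \beta = \tfrac{1}{2},\ \gamma = 1
\ \Rightarrow\ 0 + 2\cdot\tfrac{1}{2} + 1 = 2.
```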

Abstract:

This thesis examines the free zone concept as part of companies' international supply chains. The aim is to find ways to increase the attractiveness of a free zone from the companies' perspective, and to determine what kinds of business operations companies can carry out in a free zone. The thesis identifies factors that contribute to the success of a free zone and that could be applicable to the border region of South-East Finland and Russia, taking the prevailing conditions and legislation into account as constraints. Success factors and business models are sought by briefly examining and analysing a few existing, operational free zones. The harmonisation of EU customs law and the liberalisation of international trade are reducing the traditional role of free zones as duty-free areas. Instead, free zones increasingly operate as logistics centres in international trade and offer services with which companies can improve their logistic competitiveness. Networking, satellite solutions and cooperation are means by which the various logistics service providers in the South-East Finland region can improve their performance and flexibility in the international supply chain.

Abstract:

Several methods and approaches for measuring parameters to determine fecal sources of pollution in water have been developed in recent years. No single microbial or chemical parameter has proved sufficient to determine the source of fecal pollution. Combinations of parameters involving at least one discriminating indicator and one universal fecal indicator offer the most promising solutions for qualitative and quantitative analyses. The universal (nondiscriminating) fecal indicator provides quantitative information regarding the fecal load. The discriminating indicator contributes to the identification of a specific source. The relative values of the parameters derived from both kinds of indicators could provide information regarding the contribution to the total fecal load from each origin. It is also essential that both parameters characteristically persist in the environment for similar periods. Numerical analysis, such as inductive learning methods, could be used to select the most suitable and the lowest number of parameters to develop predictive models. These combinations of parameters provide information on factors affecting the models, such as dilution, specific types of animal source, persistence of microbial tracers, and complex mixtures from different sources. The combined use of the enumeration of somatic coliphages and the enumeration of Bacteroides-phages using different host specific strains (one from humans and another from pigs), both selected using the suggested approach, provides a feasible model for quantitative and qualitative analyses of fecal source identification.
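The two-indicator idea (one universal indicator quantifying the fecal load, one discriminating indicator attributing it to a source) can be sketched as a toy classifier. The counts, the ratio-based decision rule and the thresholds below are invented for illustration; they are not the predictive models developed in the work.

```python
# Hypothetical sketch: somatic coliphages act as the universal indicator,
# host-specific Bacteroides phages as the discriminating indicators.
def classify(somatic_coliphages, human_phages, pig_phages):
    load = somatic_coliphages            # universal indicator: fecal load
    if load == 0:
        return "no detectable fecal pollution"
    human_ratio = human_phages / load    # discriminating ratios per source
    pig_ratio = pig_phages / load
    if human_ratio > pig_ratio:
        return "predominantly human"
    if pig_ratio > human_ratio:
        return "predominantly porcine"
    return "mixed or undetermined"

print(classify(1000, 40, 5))
```

In the approach described above, an inductive learning method would select which parameters and ratios actually discriminate, rather than fixing the rule by hand as done here.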

Abstract:

The main objective of this thesis was to examine the effect of price and competition on the diffusion of mobile communications. The empirical part examined the effect of mobile subscription prices on subscription diffusion, and how competition in the sector has affected mobile communications price levels. The competitive situation of the Finnish mobile communications market was also analysed. The empirical data were collected from secondary sources, such as the EMC database. The study was quantitative in nature. The models used in the empirical part were derived from earlier studies. Regression analysis was used to estimate the effect of price on the diffusion speed and on the number of potential adopters; a non-linear model was applied in the regression analysis. The results showed that the steadily declining prices of mobile subscriptions and handsets have no significant effect on the diffusion of mobile communications. Nor has the competitive situation had much effect on the general price level of mobile communications. Based on the results, some recommendations for further research could also be given.

Abstract:

The main objective of this thesis was to study mobile services and wireless applications in the Finnish healthcare sector. The study identifies the key areas where mobile services and wireless applications can add value to traditional medical practice, the main problems and threats related to this development, and, based on the results, the possible services and applications in 5-10 years' time. The study was qualitative in nature, carried out as futures research using, in particular, the Delphi method. The data were collected in two semi-structured interview rounds. The empirical part of the thesis describes the Finnish healthcare sector, its ongoing projects and the technical obstacles, and addresses the main research question. The results showed that the important areas on which wireless communication will have a significant impact are emergency care, remote monitoring of chronic patients, the development of wireless communication tools to improve home care and to create new operating models, and medical collaboration through shared healthcare information resources. Based on the results, some recommendations for further research could also be given.

Abstract:

Background: Combining different sources of information to improve the available biological knowledge is a current challenge in bioinformatics. Among the most powerful methods for integrating heterogeneous data types are kernel-based methods. Kernel-based data integration approaches consist of two basic steps: first, a suitable kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables belonging to each dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify the samples with higher or lower values of the variables analyzed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables together give a better understanding of biological knowledge.
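The two-step scheme can be sketched in a few lines of numpy: one kernel per source, an unweighted sum to combine them, then kernel PCA via double-centering and eigendecomposition. The data are synthetic, and the RBF bandwidths and the unweighted sum are illustrative assumptions (real applications would tune both).

```python
import numpy as np

rng = np.random.default_rng(0)
X1 = rng.normal(size=(30, 5))    # data source 1 (synthetic)
X2 = rng.normal(size=(30, 8))    # data source 2 (synthetic)

def rbf(X, gamma):
    """RBF kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2)."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Step 1: one kernel per source; step 2: combine (here, unweighted sum).
K = rbf(X1, 0.1) + rbf(X2, 0.1)

# Kernel PCA: double-center K, eigendecompose, keep the top 2 components.
n = K.shape[0]
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J
w, V = np.linalg.eigh(Kc)
idx = np.argsort(w)[::-1][:2]
scores = V[:, idx] * np.sqrt(np.maximum(w[idx], 0))  # 2-D embedding
print(scores.shape)
```

The variable representation described in the abstract would be layered on top of this embedding; the sketch covers only the dimensionality-reduction step.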

Relevância:

100.00%

Publicador:

Resumo:

OBJECTIVE: To determine the risk of stroke associated with subclinical hypothyroidism. DATA SOURCES AND STUDY SELECTION: Published prospective cohort studies were identified through a systematic search of several databases through November 2013, without restrictions. Unpublished studies were identified through the Thyroid Studies Collaboration. We collected individual participant data on thyroid function and stroke outcome. Euthyroidism was defined as TSH levels of 0.45-4.49 mIU/L, and subclinical hypothyroidism as TSH levels of 4.5-19.9 mIU/L with normal T4 levels. DATA EXTRACTION AND SYNTHESIS: We collected individual participant data on 47 573 adults (3451 with subclinical hypothyroidism) from 17 cohorts followed up from 1972-2014 (489 192 person-years). Age- and sex-adjusted pooled hazard ratios (HRs) for participants with subclinical hypothyroidism compared with euthyroidism were 1.05 (95% confidence interval [CI], 0.91-1.21) for stroke events (combined fatal and nonfatal stroke) and 1.07 (95% CI, 0.80-1.42) for fatal stroke. Stratified by age, the HR for stroke events was 3.32 (95% CI, 1.25-8.80) for individuals aged 18-49 years. There was an increased risk of fatal stroke in the age groups 18-49 and 50-64 years, with HRs of 4.22 (95% CI, 1.08-16.55) and 2.86 (95% CI, 1.31-6.26), respectively (p for trend = 0.04). We found no increased risk for those 65-79 years old (HR, 1.00; 95% CI, 0.86-1.18) or ≥ 80 years old (HR, 1.31; 95% CI, 0.79-2.18). There was a pattern of increased risk of fatal stroke with higher TSH concentrations. CONCLUSIONS: Although no overall effect of subclinical hypothyroidism on stroke could be demonstrated, an increased risk was observed in subjects younger than 65 years and in those with higher TSH concentrations.
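The study pooled individual participant data, which is richer than combining published estimates; still, the standard way hazard ratios are combined across cohorts can be illustrated with fixed-effect inverse-variance pooling on the log-HR scale. The per-cohort numbers below are made up for the sketch, not estimates from the Thyroid Studies Collaboration:

```python
import math

# Hypothetical per-cohort (HR, CI lower, CI upper) triples.
cohorts = [
    (1.10, 0.85, 1.42),
    (0.98, 0.70, 1.37),
    (1.20, 0.90, 1.60),
]

def pooled_hr(estimates):
    """Fixed-effect inverse-variance pooling on the log-HR scale."""
    wsum = 0.0
    wlog = 0.0
    for hr, lo, hi in estimates:
        # SE recovered from the 95% CI: (log(hi) - log(lo)) / (2 * 1.96)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2           # inverse-variance weight
        wsum += w
        wlog += w * math.log(hr)
    log_pooled = wlog / wsum
    se_pooled = math.sqrt(1.0 / wsum)
    return (math.exp(log_pooled),
            math.exp(log_pooled - 1.96 * se_pooled),
            math.exp(log_pooled + 1.96 * se_pooled))

hr, lo, hi = pooled_hr(cohorts)
print(f"pooled HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Working on the log scale keeps the weights symmetric around HR = 1; the pooled CI is then exponentiated back to the hazard-ratio scale.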

Relevância:

100.00%

Publicador:

Resumo:

Past temperature variations are usually inferred from proxy data or estimated using general circulation models. Comparisons between climate estimates derived from proxy records and from model simulations help to better understand the mechanisms driving climate variations, and also offer the possibility of identifying deficiencies in both approaches. This paper presents regional temperature reconstructions based on tree-ring maximum density series in the Pyrenees, and compares them with the output of global climate simulations and of regional climate model simulations conducted for the target region. An ensemble of 24 reconstructions of May-to-September regional mean temperature was derived from 22 maximum density tree-ring site chronologies distributed over the larger Pyrenees area. Four different tree-ring series standardization procedures were applied, combining two detrending methods: a 300-yr spline and the regional curve standardization (RCS). Additionally, different methodological variants of the regional chronology were generated by using three different aggregation methods. Calibration-verification trials were performed on split periods using two methods: regression and simple variance matching. The resulting set of temperature reconstructions was compared with climate simulations performed with global (ECHO-G) and regional (MM5) climate models. The 24 variants of the May-to-September temperature reconstruction reveal a generally coherent pattern of inter-annual to multi-centennial temperature variations in the Pyrenees region over the last 750 yr. However, some reconstructions display a marked positive trend over the entire length of the reconstruction, indicating that applying the RCS method to a suboptimal set of samples may lead to unreliable results. 
Climate model simulations agree with the tree-ring based reconstructions at multi-decadal time scales, suggesting solar variability and volcanism as the main factors controlling preindustrial mean temperature variations in the Pyrenees. Nevertheless, the comparison also highlights differences with the reconstructions, mainly in the amplitude of past temperature variations and in the 20th-century trends. Neither proxy-based reconstructions nor model simulations are able to perfectly track the temperature variations of the instrumental record, suggesting that both approaches still need further improvement.
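The "simple variance matching" calibration mentioned in the abstract amounts to rescaling the proxy series so that its mean and variance match the instrumental target over the calibration period. A minimal sketch, with fabricated overlap-period values standing in for the tree-ring index and the instrumental May-to-September temperatures:

```python
import statistics

# Hypothetical calibration-period series (made-up values):
# a standardized proxy index and instrumental temperatures (degrees C).
proxy = [0.2, -0.1, 0.5, 0.0, -0.4, 0.3, 0.1, -0.2]
instr = [13.1, 12.6, 13.8, 12.9, 12.2, 13.4, 13.0, 12.5]

def variance_matching(proxy, instr):
    """Rescale the proxy to the mean and standard deviation of the
    instrumental target over the calibration period."""
    mp, sp = statistics.mean(proxy), statistics.stdev(proxy)
    mi, si = statistics.mean(instr), statistics.stdev(instr)
    return [(x - mp) * (si / sp) + mi for x in proxy]

recon = variance_matching(proxy, instr)
# The calibrated series now has the instrumental mean and spread.
print(round(statistics.mean(recon), 3), round(statistics.stdev(recon), 3))
```

Unlike regression calibration, variance matching preserves the full amplitude of the proxy variability instead of shrinking it toward the calibration-period mean, which is why the two methods are often compared in split-period verification trials.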

Relevância:

100.00%

Publicador:

Resumo:

OBJECTIVE: To determine the effect of nonadherence to antiretroviral therapy (ART) on virologic failure and mortality in treatment-naive individuals starting ART. DESIGN: Prospective observational cohort study. METHODS: Eligible individuals enrolled in the Swiss HIV Cohort Study, started ART between 2003 and 2012, and provided adherence data at one or more biannual clinical visits. Adherence was defined as missed doses (none, one, two, or more than two) and percentage adherence (>95%, 90-95%, and <90%) in the previous 4 weeks. Inverse probability weighting of marginal structural models was used to estimate the effect of nonadherence on viral failure (HIV-1 viral load >500 copies/ml) and mortality. RESULTS: Of 3150 individuals followed for a median of 4.7 years, 480 (15.2%) experienced viral failure and 104 (3.3%) died; 1155 (36.6%) reported missing one dose, 414 (13.1%) two doses, and 333 (10.6%) more than two doses of ART. The risk of viral failure increased with each missed dose (one dose: hazard ratio [HR] 1.15, 95% confidence interval 0.79-1.67; two doses: 2.15, 1.31-3.53; more than two doses: 5.21, 2.96-9.18). The risk of death increased with more than two missed doses (HR 4.87, 2.21-10.73). Missing one to two doses of ART increased the risk of viral failure in those starting once-daily regimens (HR 1.67, 1.11-2.50) compared with those starting twice-daily regimens (HR 0.99, 0.64-1.54; interaction P = 0.09). Consistent results were found for percentage adherence. CONCLUSION: Self-report of two or more missed doses of ART is associated with an increased risk of both viral failure and death. A simple adherence question helps identify patients at risk of negative clinical outcomes and offers opportunities for intervention.
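The inverse-probability-weighting idea behind the marginal structural models can be illustrated in miniature: each record is weighted by the inverse of its probability of the observed exposure given a confounder, creating a pseudo-population in which the confounder is balanced. The records below are fabricated toy data with one binary exposure (nonadherence) and one binary confounder, not Swiss HIV Cohort Study data, and the actual analysis used Cox-type marginal structural models rather than weighted risks:

```python
# Toy records: (nonadherent, confounder_present, viral_failure).
records = [
    (1, 1, 1), (1, 1, 0), (1, 0, 1), (1, 0, 0), (1, 1, 1),
    (0, 1, 0), (0, 0, 0), (0, 0, 1), (0, 1, 0), (0, 0, 0),
]

def ip_weights(records):
    """Stabilized inverse-probability-of-exposure weights
    sw = P(A=a) / P(A=a | L) for a single binary confounder L."""
    n = len(records)
    p_a = {a: sum(1 for r in records if r[0] == a) / n for a in (0, 1)}
    p_a_given_l = {}
    for l in (0, 1):
        stratum = [r for r in records if r[1] == l]
        for a in (0, 1):
            p_a_given_l[(a, l)] = (
                sum(1 for r in stratum if r[0] == a) / len(stratum)
            )
    return [p_a[a] / p_a_given_l[(a, l)] for a, l, _ in records]

w = ip_weights(records)

def weighted_risk(a):
    """Risk of failure in exposure group a within the pseudo-population."""
    num = sum(wi * y for wi, (ai, _, y) in zip(w, records) if ai == a)
    den = sum(wi for wi, (ai, _, y) in zip(w, records) if ai == a)
    return num / den

print(round(weighted_risk(1), 3), round(weighted_risk(0), 3))
```

With stabilized weights the weighted sample size stays close to the original n, which keeps the pseudo-population estimates well behaved; the same weights can feed a weighted Cox model to obtain hazard ratios like those reported above.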