903 results for multi-language environment


Relevance: 30.00%

Abstract:

Airborne microbial products have been reported to promote immune responses that suppress asthma, yet how these beneficial effects arise remains controversial and poorly understood. We have found that pulmonary exposure to the bacterium Escherichia coli leads to a suppression of allergic airway inflammation, characterized by reduced airway hyperresponsiveness, eosinophilia and cytokine production by T cells in the lung. This immune modulation was mediated neither by the induction of a Th1 response nor by regulatory T cells; it was dependent on TLR-4 but did not involve TLR desensitization. Dendritic cell migration to the draining lymph nodes and subsequent activation of T cells were unaffected by prior exposure to E. coli, indicating that the immunomodulation was limited to the lung environment. In non-treated control mice, ovalbumin was primarily presented by airway CD11b+ CD11c+ DCs expressing high levels of MHC class II molecules, whereas the DCs in E. coli-treated mice displayed a less activated phenotype and had impaired antigen presentation capacity. Consequently, in situ Th2 cytokine production by ovalbumin-specific effector T cells recruited to the airways was significantly reduced. The suppression of airway hyperresponsiveness was mediated through the recruitment of IL-17-producing γδ T cells; the suppression of dendritic cells and T cells, however, was mediated through a distinct mechanism that could not be overcome by the local administration of activated dendritic cells or by the in vivo administration of TNF-alpha. Taken together, these data reveal a novel multi-component immunoregulatory pathway that acts to protect the airways from allergic inflammation.

Relevance: 30.00%

Abstract:

General summary: Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and the most appropriate empirical techniques. The thesis can roughly be divided into two parts: the first, corresponding to the first two chapters, investigates the link between trade and the environment; the second, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might be productivity enhancing. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize the results in Google Earth. A short summary of each of the five chapters is provided below.

The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH: comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE: comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients provided by the World Bank (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries, 29 Southern and 19 Northern, and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, for which we have no a priori expectations about the signs of these effects. Popular fears about the trade effects of differences in environmental regulations might therefore be exaggerated.

The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labour that vary across countries, periods and manufacturing sectors. We then use these original data (covering 31 developed and 31 developing countries) to decompose worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition). We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. We next construct, in a first experiment, a hypothetical world where no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (ignoring price effects) to compute a static first-order trade effect. This effect increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, the effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible levels of SO2 emissions. These hypothetical levels are obtained by reallocating labour across sectors within each country (under country-employment and world industry-production constraints). Using linear programming techniques, we show that actual emissions are 90% below the worst case, but that they could still be reduced by another 80% if emissions were minimized. The findings of this chapter go together with those of chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past.

Turning to the economic geography part of the thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", is a short note that derives the theoretical model estimated in the fourth chapter. The derivation is based directly on the multi-regional framework of Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to formally write present productivity as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity using regional data (245 NUTS-2 regions over the period 1980-2003). Dynamic panel techniques allow us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of about 13% for the effect of density on labour productivity. At the sectoral level, positive cross-sector and negative own-sector externalities appear to be present in manufacturing, while financial services display strong positive own-sector effects.

The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of the center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia.

To sum up, this thesis makes three main contributions. First, it provides new estimates of the orders of magnitude of the role of trade in the globalisation-and-environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
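
The center-of-gravity calculation of the fifth chapter lends itself to a compact illustration. Below is a minimal Python sketch of the physical center-of-mass idea applied to weighted city locations; the cities and weights are invented for illustration and are not the thesis data.

```python
import numpy as np

# Hypothetical city data: (latitude deg, longitude deg, economic weight)
cities = [
    (40.7, -74.0, 1.5),   # New York
    (51.5,  -0.1, 1.0),   # London
    (35.7, 139.7, 1.6),   # Tokyo
    (31.2, 121.5, 0.9),   # Shanghai
    (28.6,  77.2, 0.5),   # Delhi
]

lat, lon, w = map(np.asarray, zip(*cities))
lat, lon = np.radians(lat), np.radians(lon)

# Convert each city to a 3D unit vector on the sphere
xyz = np.column_stack([np.cos(lat) * np.cos(lon),
                       np.cos(lat) * np.sin(lon),
                       np.sin(lat)])

# Weighted center of mass (a point inside the Earth) ...
com = (w[:, None] * xyz).sum(axis=0) / w.sum()

# ... projected back onto the surface to get a map location
surf = com / np.linalg.norm(com)
print("center of gravity: lat %.1f, lon %.1f"
      % (np.degrees(np.arcsin(surf[2])),
         np.degrees(np.arctan2(surf[1], surf[0]))))
```

Tracking this point year by year, with weights updated from city production data, traces the kind of shift toward Asia the chapter measures.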

Relevance: 30.00%

Abstract:

We describe a novel dissimilarity framework to analyze spatial patterns of species diversity and illustrate it with alien plant invasions in Northern Portugal. We used this framework to test the hypothesis that patterns of alien invasive plant species richness and composition are differently affected by differences in climate, land use and landscape connectivity (i.e. geographic distance as a proxy, and vectorial objects that facilitate dispersal, such as roads and rivers) between pairs of localities at the regional scale. We further evaluated possible effects of plant life strategies (Grime's C-S-R) and residence time. Each locality consisted of a 1 km² landscape mosaic in which all alien invasive species were recorded by visiting all habitat types. Multi-model inference revealed that dissimilarity in species richness is more influenced by environmental distance (particularly climate), whereas geographic distance (a proxy for dispersal limitations) is more important in explaining dissimilarity in species composition, with a prevailing role for ecotones and roads. However, only minor differences were found in the responses of the three C-S-R strategies. Some effect of residence time was found, but only for dissimilarity in species richness. Our results also indicate that environmental conditions (e.g. climate) limit the number of alien species invading a given site, but that the presence of dispersal corridors determines the paths of invasion and therefore the pool of species reaching each site. The fact that geographic distances (e.g. along ecotones and roads) tend to explain invasion at our regional scale highlights the need to consider the management of alien invasions in the context of integrated landscape planning. Alien species management should include (but not be limited to) the mitigation of dispersal pathways along linear infrastructures. Our results therefore highlight potentially useful applications of this novel multi-model framework to the anticipation and management of plant invasions.
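
As a concrete illustration of the two response variables the framework contrasts, the following toy sketch (invented presence/absence data, not the Portuguese survey data) computes, for each pair of sites, dissimilarity in composition (Jaccard) and in richness; in the study these pairwise values are then regressed on environmental and geographic distances.

```python
import numpy as np

# Rows: localities, columns: alien species (1 = present, 0 = absent)
P = np.array([[1, 1, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]])

def jaccard_dissimilarity(a, b):
    """Composition dissimilarity: 1 - shared species / total species."""
    shared = np.sum((a == 1) & (b == 1))
    total = np.sum((a == 1) | (b == 1))
    return 1 - shared / total

n = len(P)
for i in range(n):
    for j in range(i + 1, n):
        d_comp = jaccard_dissimilarity(P[i], P[j])
        d_rich = abs(P[i].sum() - P[j].sum())   # richness dissimilarity
        print(f"sites {i}-{j}: composition {d_comp:.2f}, richness {d_rich}")
```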

Relevance: 30.00%

Abstract:

Organisations in Multi-Agent Systems (MAS) have proven successful in regulating agent societies. Nevertheless, changes in agents' behaviour or in the dynamics of the environment may lead to poor fulfilment of the system's purposes, so the entire organisation needs to be adapted. In this paper we focus on endowing the organisation with adaptation capabilities, instead of expecting agents to be capable of adapting the organisation by themselves. We regard this organisational adaptation as an assisting service provided by what we call the Assistance Layer. Our generic Two-Level Assisted MAS Architecture (2-LAMA) incorporates such a layer. We empirically evaluate this approach by means of an agent-based simulator we have developed for the P2P sharing-network domain. The simulator implements the 2-LAMA architecture and supports comparison between different adaptation methods, as well as with the standard BitTorrent protocol. In particular, we present two alternatives for performing norm adaptation and one method for adapting agents' relationships. The results show improved performance and demonstrate that the cost of introducing an additional layer in charge of the system's adaptation is lower than its benefits.
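
To make the idea of an assistance layer concrete, here is a heavily simplified sketch (our own illustration under invented names, not the authors' 2-LAMA implementation) of a meta-level component that observes system performance and adapts a single norm parameter in response:

```python
# Hypothetical assistance-layer feedback loop: the class name, norm,
# target and update rule are all illustrative assumptions.
class AssistanceLayer:
    def __init__(self, norm_value, target, step=0.1):
        self.norm_value = norm_value   # e.g. an upload-limit norm in a P2P domain
        self.target = target           # desired performance level
        self.step = step               # relative adjustment per observation

    def observe_and_adapt(self, performance):
        """Relax or tighten the norm depending on measured performance."""
        if performance < self.target:
            self.norm_value *= (1 + self.step)   # relax the norm
        else:
            self.norm_value *= (1 - self.step)   # tighten it
        return self.norm_value

layer = AssistanceLayer(norm_value=1.0, target=0.8)
for perf in [0.6, 0.7, 0.85]:          # hypothetical performance readings
    print(layer.observe_and_adapt(perf))
```

The point of the architecture is that this adaptation logic lives in a separate layer above the agents, so the agents themselves need no organisational-adaptation capabilities.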

Relevance: 30.00%

Abstract:

For people with disabilities, however, housing options have been limited. Today, state and federal laws are changing this. Who will benefit? All of us. For "accessibility" is an issue that, at one time or another, affects us all. This is true whether, temporarily or permanently, we use wheelchairs, need grab bars, cannot climb stairs, require easy-to-reach shelves, or rely on easy-to-navigate living spaces. The primary purpose of accessible housing law is to prevent discrimination against people with disabilities, but the end result is a living environment that is more usable for everyone. For example, both the very young and the very old will find an accessible dwelling more comfortable. People with temporary limitations due to injury or illness will find it easier to live in. Such a home will also be more welcoming to guests with disabilities.

Relevance: 30.00%

Abstract:

Introduction: The field of connectomic research is growing rapidly, owing to methodological advances in structural neuroimaging at many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al., 2008), yielding so-called connectomes (Hagmann, 2005; Sporns et al., 2005). These exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need has grown for a special-purpose software tool that supports both clinical researchers and neuroscientists in investigating such connectome data.

Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries.

Results: Thanks to a flexible plugin architecture, functionality can easily be extended for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib). More brain connectivity measures will be implemented in a future release (Rubinov et al., 2009).
* 3D view of networks, with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve further node information (ConnectomeWiki), and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Arbitrary metadata can be stored for networks, allowing e.g. group-based analysis or meta-analysis.
* A Python shell for scripting; application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al., 2008).
* An interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering.

The Connectome Mapping Pipeline (Hagmann et al., 2008) processed 20 healthy subjects into an average connectome dataset. The figures show the ConnectomeViewer user interface with this dataset, displaying the connections that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org).

Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
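
Because the networks are stored as GraphML and the tool is scriptable in Python, a typical scripted analysis might look like the following sketch. It uses plain NetworkX rather than the ConnectomeViewer API, and the file name and the 'weight' edge attribute are hypothetical:

```python
import networkx as nx

# Hypothetical GraphML network, such as one stored in a Connectome File
G = nx.read_graphml("average_connectome.graphml")

# Basic complex-network measures of the kind NetworkX provides
print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
print("density:", nx.density(G))
print("mean clustering:", nx.average_clustering(nx.Graph(G)))

# Threshold edges on a (hypothetical) 'weight' attribute, mirroring the
# interactive edge filtering described above
strong = [(u, v) for u, v, d in G.edges(data=True)
          if float(d.get("weight", 0)) > 0.5]
print("edges above threshold:", len(strong))
```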

Relevance: 30.00%

Abstract:

This document presents the results of a state-of-practice survey of transportation agencies that are installing intelligent transportation system (ITS) sensors and other devices alongside their environmental sensing stations (ESS), also referred to as road weather information system (RWIS) assets.

Relevance: 30.00%

Abstract:

This Master's thesis consists of two parts. The theoretical part describes what environmental management is, what a multi-site organisation and a multi-site management system are, and what requirements these place on a company. The thesis presents a model with which quality, environmental, health and safety systems built according to international management system standards can be combined into a single entity, a multi-site management system. The model consists of three levels: the local, the country and the group level. Examples illustrate how, starting from different baselines, one can proceed through these levels towards a single integrated management system. Arguments for and against adopting a multi-site management system are also presented. The practical part of the thesis is the implementation of the local level of the management system model: building an environmental management system compliant with the requirements of EN ISO 14001:2004 for the Finnish sites of Kvaerner Power Oy, and integrating this system with the certified quality system based on the EN ISO 9001 standard. The thesis describes how the environmental management system was built and how the quality and environmental systems were merged into one. The results of the work are a model for integrating management systems and a certified environmental management system, whose integration with the quality system was carried out as planned.

Relevance: 30.00%

Abstract:

The aim of this thesis was to investigate the possibilities of using a Linux environment for the real-time simulation of mechatronic machines. The thesis studied running a real-time simulation model written in C on the Linux operating system using the RTLinux real-time extension. Real-time simulation with RTLinux proved efficient and, within the limits of the modelling methods, accurate. Adding I/O functionality through separate I/O cards was not examined in this work. Real-time Linux has not previously been used for the simulation of mechatronic machines, so no ready-made tools exist. Consequently, the Linux environment is not very well suited to general machine design because of the laboriousness of the modelling.

Relevance: 30.00%

Abstract:

This Master's thesis examines threaded programming at the upper hierarchy level of parallel programming, focusing on hyper-threading technology. The thesis considers the advantages and disadvantages of hyper-threading and its effects on parallel algorithms. The goal was to understand the hyper-threading implementation of the Intel Pentium 4 processor and to exploit it where it offers a performance benefit. Performance data was collected and analysed by running a large set of benchmarks under different conditions (memory handling, compiler settings, environment variables, and so on). Two types of algorithms were examined: matrix operations and sorting. These applications have a regular memory access pattern, which is a double-edged sword: it is an advantage in arithmetic-logic processing, but it degrades memory performance. The reason is that modern processors have very good raw performance when processing regular data, whereas the memory architecture is limited by cache sizes and various buffers. When the problem size exceeds a certain limit, actual performance can drop to a fraction of peak performance.
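
The performance cliff described in the last sentences is easy to reproduce. The sketch below (our illustration, not the thesis benchmark suite) times the same elementwise operation at growing array sizes; the per-element cost typically jumps once the working set no longer fits in cache, though the exact sizes and timings are machine dependent:

```python
import time
import numpy as np

# Time an elementwise multiply at increasing sizes; per-element cost
# rises once the arrays exceed the cache capacity of the machine.
for n in [10**4, 10**5, 10**6, 10**7]:
    a = np.random.rand(n)
    t0 = time.perf_counter()
    for _ in range(20):
        a = a * 1.000001
    dt = time.perf_counter() - t0
    print(f"n={n:>8}: {dt / (20 * n) * 1e9:.2f} ns per element")
```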

Relevance: 30.00%

Abstract:

Coronary artery magnetic resonance imaging (MRI) has the potential to provide the cardiologist with relevant diagnostic information on coronary artery disease in patients. The major challenge of cardiac MRI, though, is dealing with all the sources of motion that can corrupt the images and degrade the diagnostic information they provide. This thesis therefore focused on the development of new MRI techniques that change the standard approach to cardiac motion compensation in order to increase the efficiency of cardiovascular MRI, to provide more flexibility and robustness, and to deliver new temporal and tissue information. The proposed approaches help advance coronary magnetic resonance angiography (MRA) in the direction of an easy-to-use, multipurpose tool that can be translated to the clinical environment.

The first part of the thesis studied coronary artery motion in patients using the gold-standard imaging technique (x-ray angiography), in order to measure the precision with which the coronary arteries return to the same position beat after beat (coronary artery repositioning). We learned that intervals with minimal coronary artery repositioning occur at peak systole and in mid diastole, and we responded with a new pulse sequence (T2-post) that enables peak-systolic imaging. This sequence was tested in healthy volunteers; the image-quality comparison showed that the proposed approach provides coronary artery visualization and a contrast-to-noise ratio (CNR) comparable with the standard acquisition approach, but with increased signal-to-noise ratio (SNR).

The second part of the thesis explored a completely new paradigm for whole-heart cardiovascular MRI. The proposed technique acquires the data continuously (free-running) instead of being triggered, thus increasing the efficiency of the acquisition and providing four-dimensional (4D) images of the whole heart, while respiratory self-navigation allows the scan to be performed in free breathing. This enabling technology allows anatomical and functional evaluation in four dimensions, with high spatial and temporal resolution and without the need for contrast agent injection. The enabling step is the use of a golden-angle based 3D radial trajectory, which allows continuous sampling of k-space and retrospective selection of the timing parameters of the reconstructed dataset. The free-running 4D acquisition was then combined with a compressed sensing reconstruction algorithm that further increases the temporal resolution of the 4D dataset while at the same time improving overall image quality by removing undersampling artifacts. The resulting 4D images provide visualization of the whole coronary artery tree in each phase of the cardiac cycle and, at the same time, allow assessment of cardiac function with a single free-breathing scan. The quality of the coronary arteries in the frames of the free-running 4D acquisition is in line with that obtained with the standard ECG-triggered acquisition, and the cardiac function evaluation matched that measured with gold-standard stacks of 2D cine acquisitions.

Finally, the last part of the thesis focused on the development of an ultrashort echo time (UTE) acquisition scheme for the in vivo detection of calcification in the coronary arteries. Recent studies showed that UTE imaging can detect coronary artery plaque calcification ex vivo, since it captures the short-T2 components of the calcification. Heart motion, though, has so far prevented this technique from being applied in vivo. An ECG-triggered, self-navigated, 3D radial triple-echo UTE acquisition was therefore developed and tested in healthy volunteers. The proposed sequence combines a 3D self-navigation approach with a 3D radial UTE acquisition, enabling data collection during free breathing. Three echoes are acquired simultaneously to extract the short-T2 components of the calcification, while a water-fat separation technique allows proper visualization of the coronary arteries. Even though the results are still preliminary, the proposed sequence shows great potential for the in vivo visualization of coronary artery calcification.

In conclusion, this thesis presents three novel MRI approaches aimed at improved characterization and assessment of atherosclerotic coronary artery disease. These approaches provide new anatomical and functional information in four dimensions, and support tissue characterization of coronary artery plaques.
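
The golden-angle 3D radial idea that enables the retrospective binning can be sketched compactly. The following Python fragment uses one common formulation based on two multidimensional golden means (approximately 0.4656 and 0.6823) to generate readout directions; it is a generic illustration under these assumptions, not the thesis implementation, and the spoke count is arbitrary:

```python
import numpy as np

# Multidimensional golden means commonly used for 3D radial sampling
PHI1, PHI2 = 0.4656, 0.6823

def spoke_directions(n_spokes):
    """Readout directions for a golden-angle 3D radial acquisition."""
    n = np.arange(n_spokes)
    kz = 2.0 * np.mod(n * PHI1, 1.0) - 1.0     # z-component, spread over [-1, 1]
    phi = 2.0 * np.pi * np.mod(n * PHI2, 1.0)  # azimuthal angle
    r = np.sqrt(1.0 - kz**2)
    return np.column_stack([r * np.cos(phi), r * np.sin(phi), kz])

dirs = spoke_directions(1000)
# Any contiguous subset of spokes covers k-space near-uniformly, which is
# what permits choosing the timing of the reconstructed frames a posteriori.
print(dirs[:3])
```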

Relevance: 30.00%

Abstract:

In recent years, massive protostars have turned out to be a possible population of high-energy emitters. Among the best candidates is IRAS 16547-4247, a protostar that presents a powerful outflow with clear signatures of interaction with its environment. It has been revealed as a potential high-energy source because it displays non-thermal radio emission of synchrotron origin, which is evidence of relativistic particles. To improve our understanding of IRAS 16547-4247 as a high-energy source, we analyzed XMM-Newton archival data and found that IRAS 16547-4247 is a hard X-ray source. We discuss these results in the context of a refined one-zone model and previous radio observations. From our study we find that it may be difficult to explain the X-ray emission as non-thermal radiation coming from the interaction region; it might instead be produced by thermal bremsstrahlung (plus photo-electric absorption) from a fast shock at the jet end. In the high-energy range, the source might be detectable by the present generation of Cherenkov telescopes, and may eventually be detected by Fermi in the GeV range.

Relevance: 30.00%

Abstract:

This Master's thesis examines knowledge creation and transfer processes in an iterative project environment. The first aim is to understand how knowledge is created and transferred during an actual iterative implementation project that took place at International Business Machines (IBM). The second aim is to create and develop new working methods that support more effective knowledge creation and transfer in future iterative implementation projects. The research methodology of this thesis is qualitative: focus group interviews provide qualitative information and capture the experiences of the individuals participating in the project. This study found that the following factors affect knowledge creation and transfer in an iterative, multinational, and multi-organizational implementation project: a shared vision and common goal, trust, open communication, social capital, and network density. All of these received both theoretical and empirical support. As for future projects, strengthening these factors was found to be the key to more effective knowledge creation and transfer.

Relevance: 30.00%

Abstract:

Due to the existence of free software and pedagogical guides, the use of Data Envelopment Analysis (DEA) has become further democratized in recent years. Nowadays it is quite common for practitioners and decision makers with little or no knowledge of operational research to run their own efficiency analyses. Within DEA, several alternative models allow for an environmental adjustment. Four such models, each user-friendly and easily accessible to practitioners and decision makers, are applied to empirical data from 90 primary schools in the State of Geneva, Switzerland. The results show that the majority of the alternative models deliver divergent results. From a political and a managerial standpoint, these diverging results could lead to potentially ineffective decisions. As no consensus emerges on the best model to use, practitioners and decision makers may be tempted to select the model that is right for them, in other words, the model that best reflects their own preferences. Further studies should investigate how an appropriate multi-criteria decision analysis method could help decision makers select the right model.
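
For readers unfamiliar with what running a DEA model involves, here is a minimal sketch of one standard formulation, the input-oriented CCR model, solved as a linear program per decision-making unit (DMU). The school data are invented, and this is not necessarily one of the four environmental-adjustment models compared in the study:

```python
import numpy as np
from scipy.optimize import linprog

# Made-up data: X = inputs (staff, budget), Y = outputs (test score)
X = np.array([[20., 300.], [25., 280.], [18., 350.], [30., 400.]])
Y = np.array([[94.], [88.], [90.], [96.]])

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k: minimize theta subject to
    sum_j lambda_j x_ij <= theta x_ik and sum_j lambda_j y_rj >= y_rk."""
    n, m = X.shape                      # n DMUs, m inputs
    s = Y.shape[1]                      # s outputs
    c = np.r_[1.0, np.zeros(n)]         # variables: [theta, lambda_1..n]
    A_in = np.c_[-X[k][:, None], X.T]   # inputs:  sum lam*x - theta*x_k <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]  # outputs: -sum lam*y <= -y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun                      # optimal theta in (0, 1]

for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(X, Y, k):.3f}")
```

An efficient school obtains theta = 1; a score of, say, 0.85 means its inputs could in principle be scaled down to 85% while maintaining its outputs. The environmental-adjustment models the study compares modify this baseline program in different ways, which is precisely why their results can diverge.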

Relevance: 30.00%

Abstract:

The purpose of this study was to increase understanding of the role and nature of trust in asymmetric technology partnership formation. In the knowledge-based "learning race", knowledge is considered a primary source of competitive advantage. In the emerging ICT sector, the high pace of technological change, the convergence of technologies and industries, and increasing complexity and uncertainty have forced even the largest players to seek cooperation for complementary knowledge and capabilities. Small technology firms need the complementary resources and legitimacy of large firms to grow and compete in the global marketplace. Most of the earlier research indicates, however, that partnerships with asymmetric size, managerial resources and cultures have failed. A basic assumption supported by earlier research was that trust is a critical factor in asymmetric technology partnership formation. Asymmetric technology partnership formation is a dynamic and multi-dimensional process, and consequently a holistic research approach was selected. The research issue was approached from different levels: the individual decision-maker, the firm, and the relationship between the parties. The impact of the dynamic environment and of the technology content was also analyzed. A multi-theoretical approach and a qualitative research method, with in-depth interviews in five large ICT companies and eight small ICT companies, enabled a holistic and rich view of the research issue. The study contributes to the scarce understanding of the nature and evolution of trust in asymmetric technology partnership formation, and it also sheds light on the specific nature of asymmetric technology partnerships. The partnerships were found to be tentative, and the diverse strategic intent of small and large technology firms appeared as a major challenge. The role of the boundary spanner was highlighted as a way to reconcile the incompatible organizational cultures. A shared vision was found to be a precondition for individual-based fast trust, leading to intuitive decision-making and experimentation. The relationships were tentative and were continuously re-evaluated through the key actors' sensemaking of the technology content, the asymmetry and the dynamic environment. A multi-dimensional conceptualization of trust was created, and propositions on the role and nature of trust are given for further research.