946 results for Towards Seamless Integration of Geoscience Models and Data


Relevance:

100.00%

Publisher:

Abstract:

A simplified CFD wake model based on the actuator disk concept is used to simulate the wind turbine, represented by a disk on which a distribution of forces, defined as axial momentum sources, is applied to the incoming non-uniform flow. The rotor is assumed to be uniformly loaded, with the exerted forces a function of the incident wind speed, the thrust coefficient and the rotor diameter. The model is tested under different parameterizations of turbulence models and validated against experimental measurements downwind of a wind turbine in terms of wind speed deficit and turbulence intensity.
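For a uniformly loaded rotor, the force distribution reduces to the classical thrust relation T = ½ρA·Cₜ·U∞². The sketch below shows how such an axial momentum source could be computed; the sea-level air density and the spreading of the thrust uniformly over the disk volume are textbook assumptions, not details taken from the study:

```python
import math

AIR_DENSITY = 1.225  # kg/m^3, standard sea-level value (assumption)

def actuator_disk_thrust(u_inf, rotor_diameter, ct, rho=AIR_DENSITY):
    """Total axial force (N) on a uniformly loaded actuator disk:
    T = 0.5 * rho * A * Ct * U_inf^2, with A the rotor swept area."""
    area = math.pi * (rotor_diameter / 2.0) ** 2
    return 0.5 * rho * area * ct * u_inf ** 2

def axial_momentum_source(u_inf, rotor_diameter, ct, disk_thickness, rho=AIR_DENSITY):
    """Force per unit volume (N/m^3) to apply over the disk cells,
    spreading the thrust uniformly over the disk volume A * dx."""
    area = math.pi * (rotor_diameter / 2.0) ** 2
    return actuator_disk_thrust(u_inf, rotor_diameter, ct, rho) / (area * disk_thickness)
```

In a CFD solver the source would enter the axial momentum equation with a negative sign (a momentum sink) over the cells occupied by the disk.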

Relevance:

100.00%

Publisher:

Abstract:

Survey Engineering curricula involve the integration of many formal disciplines at a high level of proficiency. The Escuela de Ingenieros en Topografía, Cartografía y Geodesia at Universidad Politécnica de Madrid (Survey Engineering) has developed intensive, in-depth teaching of the so-called Applied Land Sciences and Technologies, or Land Engineering. However, new approaches are encouraged by the European Higher Education Area (EHEA), which requires a review of traditional teaching content and methods. Furthermore, globalization and a more international outlook give this discipline new ways to teach and learn how to bridge the gap between cultures and regions. This work is based on two main needs: on one hand, the integration of the basic knowledge and disciplines involved in typical Survey Engineering within Land Management; on the other, the urgent need to consider territory on a social and ethical basis, as a part of society, culture, idiosyncrasy and economy. The integration of knowledge in Land Management is typically dominated by civil engineers and urban planners, yet it is entirely possible to integrate Survey Engineering and Cooperation for Development within the framework of the Land Management disciplines. Cooperation for Development is a concept that has changed since it first came into use. Development projects leave an impact on society, respond to their beneficiaries and are directed towards self-sustainability. Furthermore, such cooperation is a true bridge for reducing the gap between societies when differences are immeasurable. The concept of development has also been changing, and nowadays it is not a purely economic one: education, science and technology play an increasingly large role in what is meant by development. Moreover, it is commonly accepted that universities should transfer knowledge to society, and that this transfer should be open to the countries most in need of development. If the importance of a country's development rests on education, science and technology, knowledge transfer becomes one of the clearest forms of Cooperation for Development. University cooperation is therefore one of the most powerful tools to achieve it, placing universities as agents of development. In Spain, the role of universities as agents of development and cooperation has been greatly strengthened. This work addresses how to implement both Cooperation for Development and Land Management within Survey Engineering in the EHEA framework.

Relevance:

100.00%

Publisher:

Abstract:

Research into the brain is a young science; its beginnings date back to Santiago Ramón y Cajal in 1888. Since then, neuroscience has made great progress in developing techniques for its study. Cognitive neuroscience today offers many models that bring complex cognitive capacities closer to our understanding. Even so, it remains a science almost in its infancy, with a long road ahead. One of the keys to success in the study of brain function has been its evolution into a discipline that combines knowledge from several areas: physics, mathematics, statistics and psychology. This is why, throughout this work, concepts from different fields are interwoven with the aim of advancing knowledge of a subject as complex as the one at hand: the understanding of the human mind. Specifically, this thesis has been directed at the multimodal integration of magnetoencephalography (MEG) and diffusion-weighted magnetic resonance imaging (dMRI). These techniques are sensitive, respectively, to the magnetic fields emitted by neuronal currents and to the microstructure of the brain's white matter. Throughout this work we have seen that the combination of these techniques makes it possible to uncover structural-functional synergies in information processing, both in the healthy brain and in the course of neurological pathologies. More specifically, this work studies the relationship between functional and structural connectivity and how to fuse them. To this end, functional connectivity was quantified by studying phase synchronization or amplitude correlation between time series, yielding an index that measures the similarity between neuronal groups or brain regions. In addition, quantifying structural connectivity from diffusion-weighted magnetic resonance images made it possible to obtain indices of white-matter integrity or of the strength of the structural connections between regions. These measures were combined in chapters 3, 4 and 5 of this work following three approaches, ranging from the lowest to the highest level of integration. Finally, the fused MEG and dMRI information was used to characterize groups of subjects with mild cognitive impairment; detecting this condition is relevant to the early identification of Alzheimer's disease. This thesis is divided into six chapters. Chapter 1 establishes a context for the introduction of connectomics within the fields of neuroimaging and neuroscience, and then describes the objectives of the thesis and the specific objectives of each of the scientific publications that resulted from this work. Chapter 2 describes the methods for each technique employed: structural connectivity, resting-state functional connectivity, complex brain networks and graph theory; finally, it describes the condition of mild cognitive impairment and the current state of the search for new diagnostic biomarkers. Chapters 3, 4 and 5 include the scientific articles produced during this thesis. They are included in the format of the journals in which they were published, divided into introduction, materials and methods, results and discussion. All the methods employed in the articles are described in chapter 2 of the thesis. Finally, chapter 6 draws the general conclusions of the thesis and discusses the results of each article in detail.

ABSTRACT

In this thesis I apply concepts from mathematics, physics and statistics to the neurosciences.
This field benefits from the collaborative work of multidisciplinary teams in which physicians, psychologists, engineers and other specialists work towards a common goal: the understanding of the brain. Research in this field is still in its early years, its birth being attributed to the neuronal theory of Santiago Ramón y Cajal in 1888. In more than one hundred years only a very small percentage of brain functioning has been discovered, and much more remains to be explored. Isolated techniques aim at unraveling the system that supports our cognition; nevertheless, in order to provide solid evidence in such a field, multimodal techniques have arisen, and with them we will be able to improve current knowledge about human cognition. Here we focus on the multimodal integration of magnetoencephalography (MEG) and diffusion-weighted magnetic resonance imaging. These techniques are sensitive to the magnetic fields emitted by the neuronal currents and to the white matter microstructure, respectively. The combination of such techniques could reveal evidence about structural-functional synergies in the brain's information processing, and about which part of this synergy fails in specific neurological pathologies. In particular, we are interested in the relationship between functional and structural connectivity, and in how to integrate this information. We quantify functional connectivity by studying the phase synchronization or the amplitude correlation between time series obtained by MEG, and so obtain an index indicating similarity between neuronal entities, i.e. brain regions. In addition, we quantify structural connectivity by performing diffusion tensor estimation from the diffusion-weighted images, thus obtaining an indicator of the integrity of the white matter or, if preferred, the strength of the structural connections between regions.
These quantifications are then combined following three different approaches, from the lowest to the highest level of integration, in chapters 3, 4 and 5. We finally apply the fused information to the characterization or prediction of mild cognitive impairment, a clinical entity considered an early step in the continuous pathological process of dementia. The dissertation is divided into six chapters. In chapter 1 I introduce connectomics within the fields of neuroimaging and neuroscience. Later in this chapter we describe the objectives of this thesis, and the specific objectives of each of the scientific publications that were produced as a result of this work. In chapter 2 I describe the methods for each of the techniques that were employed, namely structural connectivity, resting-state functional connectivity, complex brain networks and graph theory, and finally, I describe the clinical condition of mild cognitive impairment and the current state of the art in the search for early biomarkers. In chapters 3, 4 and 5 I have included the scientific publications that were generated along this work. They have been included in their original format and they contain introduction, materials and methods, results and discussion. All methods that were employed in these papers have been described in chapter 2. Finally, in chapter 6 I summarize all the results from this thesis, both locally for each of the scientific publications and globally for the whole work.
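The phase-synchronization index mentioned above can be illustrated with the standard phase-locking value (PLV) between two phase time series; this is a textbook definition, not necessarily the thesis's exact estimator, and the phases would normally be extracted from the MEG signals by a Hilbert transform or wavelet decomposition (not shown):

```python
import cmath
import math

def phase_locking_value(phases_x, phases_y):
    """PLV = | (1/N) * sum_t exp(i * (phi_x(t) - phi_y(t))) |, in [0, 1].

    1 means the phase difference is constant over time (perfect locking);
    values near 0 mean the phase differences are uniformly scattered.
    """
    n = len(phases_x)
    acc = sum(cmath.exp(1j * (px - py)) for px, py in zip(phases_x, phases_y))
    return abs(acc) / n

# Two signals locked with a constant phase lag give PLV = 1.
locked = [0.1 * k for k in range(100)]
lagged = [p + 0.7 for p in locked]
```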

Relevance:

100.00%

Publisher:

Abstract:

We discuss linear Ricardo models with a range of parameters. We show that the exact boundary of the region of equilibria of these models is obtained by solving a simple integer programming problem. We show that there is also an exact correspondence between many of the equilibria resulting from families of linear models and the multiple equilibria of economies of scale models.

Relevance:

100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06

Relevance:

100.00%

Publisher:

Abstract:

The U.S. railroad companies spend billions of dollars every year on railroad track maintenance in order to ensure the safety and operational efficiency of their railroad networks. Besides maintenance costs, other costs such as train accident costs, train and shipment delay costs and rolling stock maintenance costs are also closely related to track maintenance activities. Optimizing the track maintenance process on these extensive railroad networks is a very complex problem with major cost implications. Currently, the decision-making process for track maintenance planning is largely manual and relies primarily on the knowledge and judgment of experts. There is considerable potential to improve the process by using operations research techniques to develop solutions to the optimization problems in track maintenance. In this dissertation we propose a range of mathematical models and solution algorithms for three network-level scheduling problems in track maintenance: the track inspection scheduling problem (TISP), the production team scheduling problem (PTSP) and the job-to-project clustering problem (JTPCP). TISP involves a set of inspection teams which travel over the railroad network to identify track defects. It is a large-scale routing and scheduling problem in which thousands of tasks are to be scheduled subject to many difficult side constraints, such as periodicity constraints and discrete working time constraints. A vehicle routing problem formulation was proposed for TISP, and a customized heuristic algorithm was developed to solve the model. The algorithm iteratively applies a constructive heuristic and a local search algorithm in an incremental scheduling horizon framework. The proposed model and algorithm have been adopted by a Class I railroad in its decision-making process. Real-world case studies show that the proposed approach outperforms the manual approach in short-term scheduling and can be used to conduct long-term what-if analyses to yield managerial insights.
PTSP schedules capital track maintenance projects, which are the largest track maintenance activities and account for the majority of railroad capital spending. A time-space network model was proposed to formulate PTSP. More than ten types of side constraints were considered in the model, including very complex constraints such as mutual exclusion constraints and consecution constraints. A multiple neighborhood search algorithm, including a decomposition and restriction search and a block-interchange search, was developed to solve the model. Various performance enhancement techniques, such as data reduction, augmented cost function and subproblem prioritization, were developed to improve the algorithm. The proposed approach has been adopted by a Class I railroad for two years. Our numerical results show the model solutions are able to satisfy all hard constraints and most soft constraints. Compared with the existing manual procedure, the proposed approach is able to bring significant cost savings and operational efficiency improvement. JTPCP is an intermediate problem between TISP and PTSP. It focuses on clustering thousands of capital track maintenance jobs (based on the defects identified in track inspection) into projects so that the projects can be scheduled in PTSP. A vehicle routing problem based model and a multiple-step heuristic algorithm were developed to solve this problem. Various side constraints such as mutual exclusion constraints and rounding constraints were considered. The proposed approach has been applied in practice and has shown good performance in both solution quality and efficiency.
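The control flow described for TISP — a constructive heuristic alternated with local search inside a growing scheduling horizon — can be caricatured on a toy one-task-per-day model. The task representation, the lateness-penalty cost function and the pairwise-swap neighborhood below are illustrative stand-ins, not the dissertation's actual formulation:

```python
def construct(tasks, horizon, schedule):
    """Constructive step: greedily place each unscheduled task whose due day
    falls within the current horizon at the earliest free day (one task/day)."""
    busy = set(schedule.values())
    for task, due in sorted(tasks.items(), key=lambda kv: kv[1]):
        if task in schedule or due > horizon:
            continue
        day = next(d for d in range(1, due + 1) if d not in busy)
        schedule[task] = day
        busy.add(day)

def local_search(schedule, cost):
    """Pairwise-swap improvement: keep a swap only if it lowers the cost."""
    improved = True
    while improved:
        improved = False
        items = list(schedule)
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                a, b = items[i], items[j]
                before = cost(schedule)
                schedule[a], schedule[b] = schedule[b], schedule[a]
                if cost(schedule) < before:
                    improved = True
                else:  # revert a non-improving swap
                    schedule[a], schedule[b] = schedule[b], schedule[a]

def incremental_schedule(tasks, max_horizon, step):
    """Incremental scheduling horizon: grow the horizon, construct, improve."""
    schedule = {}
    cost = lambda s: sum(1000 if day > tasks[t] else tasks[t] - day
                         for t, day in s.items())
    for horizon in list(range(step, max_horizon, step)) + [max_horizon]:
        construct(tasks, horizon, schedule)
        local_search(schedule, cost)
    return schedule
```

The real algorithm operates on thousands of routed tasks with periodicity and working-time constraints; the point of the sketch is only the construct/improve loop over an expanding horizon.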

Relevance:

100.00%

Publisher:

Abstract:

Hadrontherapy employs high-energy beams of charged particles (protons and heavier ions) to treat deep-seated tumours: these particles have a favourable depth-dose distribution in tissue, characterized by a low dose in the entrance channel and a sharp maximum (Bragg peak) near the end of their path. In these treatments nuclear interactions have to be considered: beam particles can fragment in the human body, releasing a non-zero dose beyond the Bragg peak, while fragments of human body nuclei can modify the dose released in healthy tissues. These effects are still in question given the scarcity of relevant cross-section data. Space radioprotection can also profit from fragmentation cross-section measurements: interest in long-term manned space missions beyond Low Earth Orbit has grown in recent years, but it has to cope with major health risks due to space radiation. To this end, risk models are under study; however, huge gaps in fragmentation cross-section data currently prevent an accurate benchmark of deterministic and Monte Carlo codes. To fill these gaps, the FOOT (FragmentatiOn Of Target) experiment was proposed. It is composed of two independent and complementary setups: an Emulsion Cloud Chamber and an electronic setup composed of several subdetectors providing redundant measurements of the kinematic properties of the fragments produced in nuclear interactions between a beam and a target. FOOT aims to measure double-differential cross sections in both angle and kinetic energy, which is the most complete information available to address the existing questions. In this Ph.D. thesis, the development of the Trigger and Data Acquisition system for the FOOT electronic setup and a first analysis of 400 MeV/u 16O beam on Carbon target data, acquired in July 2021 at GSI (Darmstadt, Germany), are presented. When possible, a comparison with other available measurements is also reported.
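The double-differential cross section FOOT targets can be written, in a generic thin-target, binned form (a standard textbook estimator, not necessarily the experiment's exact one):

```latex
\frac{d^{2}\sigma}{d\Omega\, dE_{k}}(\theta, E_{k}) \simeq
  \frac{N_{\mathrm{frag}}(\theta, E_{k})}
       {N_{\mathrm{beam}}\; n_{\mathrm{tgt}}\; \Delta\Omega\; \Delta E_{k}\; \varepsilon(\theta, E_{k})}
```

where N_frag is the number of fragments reconstructed in the angular and kinetic-energy bin, N_beam the number of primaries on target, n_tgt the areal density of target nuclei, ΔΩ and ΔE_k the bin widths, and ε the detection efficiency.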

Relevance:

100.00%

Publisher:

Abstract:

The application of modern ICT is radically changing many fields, pushing toward more open and dynamic value chains and fostering the cooperation and integration of many connected partners, sensors and devices. A valuable example is the emerging Smart Tourism field, derived from the application of ICT to tourism to create richer and more integrated experiences, making them more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of those sources exacerbate the complexity of developing integration solutions, with consequently high effort and costs for the partners seeking them. Taking a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims at facilitating the blending of data and services. At its core, APERTO relies on APERTO FaaS, a Serverless platform allowing fast prototyping of the business logic, lowering the barrier of entry and development costs for newcomers, fine-grained (down-to-zero) scaling of the resources servicing end users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communications between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments.
In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions, enabling smarter management of complex tasks and improved multiplexing capabilities; ii) the creation of end-to-end differentiated QoS slices minimizing interference among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; iv) a decentralized approach to the verification of access rights to resources.
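The function-composition idea in point i) can be illustrated with a minimal sketch: simple, ready-to-use functions chained into one workload. The function names and the sequential composition operator are illustrative; APERTO's real API is not shown in the abstract:

```python
from functools import reduce

def compose(*stages):
    """Chain ready-to-use functions into one workload: each stage's
    output feeds the next stage's input."""
    def pipeline(payload):
        return reduce(lambda value, stage: stage(value), stages, payload)
    return pipeline

# Hypothetical building blocks for a Smart Tourism workload.
def fetch_pois(city):
    """Pretend lookup of points of interest for a city."""
    return {"city": city, "pois": ["museum", "park"]}

def filter_accessible(data):
    """Pretend accessibility filter (drops 'park' for illustration)."""
    data["pois"] = [p for p in data["pois"] if p != "park"]
    return data

def render(data):
    return f"{data['city']}: {', '.join(data['pois'])}"

itinerary = compose(fetch_pois, filter_accessible, render)
```

In a FaaS setting each stage would be a separately deployed function and the composition would be executed by the platform rather than in-process, but the data flow is the same.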

Relevance:

100.00%

Publisher:

Abstract:

A network can be analyzed at different topological scales, ranging from single nodes to motifs, communities, up to the complete structure. We propose a novel approach which extends from single nodes to the whole network level by considering non-overlapping subgraphs (i.e. connected components) and their interrelationships and distribution through the network. Though such subgraphs can be completely general, our methodology focuses on the cases in which the nodes of these subgraphs share some special feature, such as being critical for the proper operation of the network. The methodology of subgraph characterization involves two main aspects: (i) the generation of histograms of subgraph sizes and distances between subgraphs and (ii) a merging algorithm, developed to assess the relevance of nodes outside subgraphs by progressively merging subgraphs until the whole network is covered. The latter procedure complements the histograms by taking into account the nodes lying between subgraphs, as well as the relevance of these nodes to the overall subgraph interconnectivity. Experiments were carried out using four types of network models and five instances of real-world networks, in order to illustrate how subgraph characterization can help complement complex network-based studies.
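The first descriptor — the histogram of sizes of the connected components formed by nodes sharing a special feature — can be sketched as follows. The adjacency-list representation and the choice of "special" nodes are illustrative; the distance histograms and the merging algorithm are not reproduced here:

```python
from collections import Counter, deque

def components(adj, special):
    """Connected components of the subgraph induced by the `special` nodes,
    found by BFS restricted to edges between special nodes."""
    special, seen, comps = set(special), set(), []
    for start in special:
        if start in seen:
            continue
        comp, queue = set(), deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            comp.add(u)
            for v in adj[u]:
                if v in special and v not in seen:
                    seen.add(v)
                    queue.append(v)
        comps.append(comp)
    return comps

def size_histogram(comps):
    """Histogram of subgraph sizes: size -> number of subgraphs of that size."""
    return Counter(len(c) for c in comps)
```

On a 5-node path 0-1-2-3-4 with special nodes {0, 1, 3, 4}, node 2 separates the special nodes into two components of size two, so the histogram is {2: 2}.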

Relevance:

100.00%

Publisher:

Abstract:

For the first time, a glassy carbon electrode (GCE) modified with novel N-doped carbon nanotubes (CNT-N) functionalized with MnFe2O4 nanoparticles (MnFe2O4@CNT-N) has been prepared and applied to the electrochemical determination of caffeine (CF), acetaminophen (AC) and ascorbic acid (AA). The electrochemical behaviour of CF, AC and AA on the bare GCE, CNT-N/GCE and MnFe2O4@CNT-N/GCE was carefully investigated using cyclic voltammetry (CV) and square-wave voltammetry (SWV). Compared to the bare GCE and the CNT-N modified electrode, the MnFe2O4@CNT-N modified electrode remarkably improves the electrocatalytic activity towards the oxidation of CF, AC and AA, with increases in the anodic peak currents of 52%, 50% and 55%, respectively. Also, the SWV anodic peaks of these molecules could be distinguished from each other at the MnFe2O4@CNT-N modified electrode, with enhanced oxidation currents. The linear response ranges for the square-wave voltammetric determination of CF, AC and AA were 1.0 × 10−6 to 1.1 × 10−3 mol dm−3, 1.0 × 10−6 to 1.0 × 10−3 mol dm−3 and 2.0 × 10−6 to 1.0 × 10−4 mol dm−3, with detection limits (S/N = 3) of 0.83 × 10−6, 0.83 × 10−6 and 1.8 × 10−6 mol dm−3, respectively. The sensitivity values at the MnFe2O4@CNT-N/GCE for the individual determination of AC, AA and CF, and in the presence of the other molecules, showed that the quantification of AA and CF suffers no interference from the other molecules; however, AA and CF interfered in the determination of AC, with the latter molecule showing the strongest interference. Nevertheless, the obtained results show that the MnFe2O4@CNT-N composite material acts as an efficient electrochemical sensor towards the selected biomolecules.
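The detection limits quoted above follow the usual S/N = 3 convention, which for a linear calibration reduces to LOD = 3·s_blank/m, with s_blank the standard deviation of the blank signal and m the calibration slope. A minimal sketch with made-up numbers (not values from the paper):

```python
def detection_limit(sd_blank, slope, k=3.0):
    """LOD = k * sd_blank / slope: the smallest concentration whose signal
    exceeds k times the standard deviation of the blank (S/N = 3 for k = 3)."""
    return k * sd_blank / slope
```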

Relevance:

100.00%

Publisher:

Abstract:

Approximate Quickselect, a simple modification of the well known Quickselect algorithm for selection, can be used to efficiently find an element with rank k in a given range [i..j], out of n given elements. We study basic cost measures of Approximate Quickselect by computing exact and asymptotic results for the expected number of passes, comparisons and data moves during the execution of this algorithm. The key element appearing in the analysis of Approximate Quickselect is a trivariate recurrence that we solve in full generality. The general solution of the recurrence proves to be very useful, as it allows us to tackle several related problems, besides the analysis that originally motivated us. In particular, we have been able to carry out a precise analysis of the expected number of moves of the ith element when selecting the jth smallest element with standard Quickselect, where we are able to give both exact and asymptotic results. Moreover, we can apply our general results to obtain exact and asymptotic results for several parameters in binary search trees, namely the expected number of common ancestors of the nodes with rank i and j, the expected size of the subtree rooted at the least common ancestor of the nodes with rank i and j, and the expected distance between the nodes of ranks i and j.
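Approximate Quickselect itself is compact: partition as in standard Quickselect, but stop as soon as the pivot's rank falls anywhere in the target range [i..j] instead of at one exact rank. A minimal sketch with 0-indexed ranks and distinct elements assumed (the paper's cost accounting for passes, comparisons and moves is not reproduced):

```python
import random

def approx_quickselect(a, i, j):
    """Return some element whose rank (0-indexed in sorted order) lies in [i..j]."""
    a = list(a)                      # work on a copy
    lo, hi = 0, len(a) - 1
    while True:
        # Lomuto partition around a random pivot.
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]
        pivot, store = a[hi], lo
        for k in range(lo, hi):
            if a[k] < pivot:
                a[k], a[store] = a[store], a[k]
                store += 1
        a[store], a[hi] = a[hi], a[store]
        if i <= store <= j:
            return a[store]          # pivot rank already falls inside [i..j]
        if store < i:                # target range lies right of the pivot
            lo = store + 1
        else:                        # target range lies left of the pivot
            hi = store - 1
```

With i = j this degenerates to standard Quickselect; widening [i..j] lets the recursion stop earlier, which is the source of the savings the analysis quantifies.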

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Carnitine is a key molecule in energy metabolism that helps transport activated fatty acids into the mitochondria. Its homeostasis is achieved through oral intake, renal reabsorption and de novo biosynthesis. Unlike dietary intake and renal reabsorption, the importance of the de novo biosynthesis pathway in carnitine homeostasis remains unclear, due to the lack of animal models and the description of only a single patient defective in this pathway. CASE PRESENTATION: We identified, by array comparative genomic hybridization, a 42-month-old girl homozygous for a 221 kb interstitial deletion at 11p14.2 that overlaps the genes encoding Fibin and butyrobetaine-gamma 2-oxoglutarate dioxygenase 1 (BBOX1), an enzyme essential for the de novo biosynthesis of carnitine. She presented with microcephaly, speech delay, growth retardation and minor facial anomalies. The levels of almost all evaluated metabolites were normal. Her serum level of free carnitine was at the lower limit of the reference range, while her acylcarnitine to free carnitine ratio was normal. CONCLUSIONS: We present an individual with completely defective de novo carnitine biosynthesis. This condition results in a mildly decreased free carnitine level, but not in the clinical manifestations characteristic of carnitine deficiency disorders, suggesting that dietary carnitine intake and renal reabsorption are sufficient to maintain carnitine homeostasis. Our results also demonstrate that haploinsufficiency of BBOX1 and/or Fibin is not associated with Primrose syndrome, as previously suggested.

Relevance:

100.00%

Publisher:

Abstract:

Life sciences are yielding huge data sets that underpin scientific discoveries fundamental to improvements in human health, agriculture and the environment. In support of these discoveries, a plethora of databases and tools are deployed, in technically complex and diverse implementations, across a spectrum of scientific disciplines. The corpus of documentation of these resources is fragmented across the Web, with much redundancy, and has lacked a common standard of information. The outcome is that scientists must often struggle to find, understand, compare and use the best resources for the task at hand. Here we present a community-driven curation effort, supported by ELIXIR (the European infrastructure for biological information), that aspires to a comprehensive and consistent registry of information about bioinformatics resources. The sustainable upkeep of this Tools and Data Services Registry is assured by a curation effort driven by and tailored to local needs, and shared amongst a network of engaged partners. As of November 2015, the registry includes 1785 resources, with depositions from 126 individual registrations including 52 institutional providers and 74 individuals. With community support, the registry can become a standard for the dissemination of information about bioinformatics resources: we welcome everyone to join us in this common endeavour. The registry is freely available at https://bio.tools.

Relevance:

100.00%

Publisher:

Abstract:

The object of analysis in the present text is the issue of operational control and data retention in Poland. The analysis of this issue follows from a critical stance taken by NGOs and state institutions on the scope of operational control wielded by the Polish police and special services – it concerns, in particular, the employment of “itemized phone bills and the so-called phone tapping.” Besides the quantitative analysis of operational control and the scope of data retention, the text features the conclusions of the Human Rights Defender referred to the Constitutional Tribunal in 2011. It must be noted that the main problems concerned with the employment of operational control and data retention are caused by: (1) a lack of specification of technical means which can be used by individual services; (2) a lack of specification of what kind of information and evidence is in question; (3) an open catalogue of information and evidence which can be clandestinely acquired in an operational mode. Furthermore, with regard to the access granted to teleinformation data by the Telecommunications Act, attention should be drawn to a wide array of data submitted to particular services. Also, the text draws on the so-called open interviews conducted mainly with former police officers with a view to pointing to some non-formal reasons for “phone tapping” in Poland. This comes in the form of a summary.

Relevance:

100.00%

Publisher:

Abstract:

The Indian author Rabindranath Tagore was received like royalty during his visits to the West after winning the Nobel Prize in 1913. Dreams of foreign cultures offered a retreat from a complicated age. At a time when the West appeared to be living under threat of disintegration and industrialism seemed like a cul-de-sac, he appeared to offer the promise of a return to a lost paradise, a spiritual abode superior to the restless Western culture. However, Tagore's popularity faded rapidly, most notably in England, the main target of his criticism. Soon after Tagore had won the Nobel Prize, the English became indignant at his anti-colonial attitude. Tagore visited Sweden in 1921 and 1926 and was given a warm reception. His visits to Sweden can be seen as an episode in a longer chain of events: they brought to life old conceptions of India as the abode of spirituality on earth. Nevertheless, interest in him was a relatively short-lived phenomenon in Sweden, and only a few of his admirers there appreciated the complexity of Tagore's achievements. His "anathema of mammonism", as a Swedish newspaper called it, was not properly received. After a steady stream of translations, his popularity flagged towards the end of the 1920s and then almost disappeared entirely. Tagore's visits to Sweden gave an indication that India was on the way to liberating itself from its colonial legacy, which consequently contributed to the waning of his popularity in the West. In the long run, his criticism of the drawbacks of the Western world became too obvious to maintain permanent interest. The Russian author Fyodor Dostoyevskiy's Crime and Punishment (1866) has invited numerous interpretations, such as the purely biographical approach. In the nervous main character of the novel, the young student Raskolnikov, one easily recognizes Dostoyevskiy himself. The novel can also be seen as a masterpiece of realistic fiction.
It gives a broad picture of Saint Petersburg, a metropolis in decay. Crime and Punishment can also be seen as one of the first examples of a modern psychological novel, since it is focused on the inner drama of its main character, Raskolnikov. His actions seem to be governed by mere coincidences, dreams and the spur of the moment, so it seems fruitful to study the novel from a psychoanalytical approach. In his book Raskolnikov: the way of the divided towards unity in Crime and Punishment (1982), the Swedish scholar Owe Wikström has followed this line of interpretation all the way to Freud's disciple C. G. Jung. In addition to this, the novel functions as an exciting crime story. To a large extent it is Viktor Sjklovskij and other Russian formalists, from the 1920s onwards, who have taught the Western audience to understand the specific nature of the crime story. The novel can also be seen as a story about religious conversion. Like Lazarus in the Bible (whose story attracts a lot of attention in the novel), Raskolnikov is awakened from the dead, and together with Sonja he starts a completely new life. The theme of conversion has a special meaning for Dostoyevskiy: for him, conversion meant an acknowledgement of the specific nature of Russia itself. Crime and Punishment mirrors the conflict between traditional Russian values and Western influences that has been evident in Russia throughout the country's history. The novel reflects a dialogue that still continues in Russian society. The Russian literary historian Mikhail Bakhtin, probably the most famous interpreter of the works of Dostoyevskiy, became famous precisely by emphasizing the importance of dialogue in novels like Crime and Punishment. According to Bakhtin, this novel is characterized by its multitude of voices: various ideas are confronted with each other, and each of them is personified by one of the characters in the novel.
The author has resigned from his position as the superior monitor of the text, leaving it to the reader to decide which interpretation is the correct one. The aim of the present study is thus to analyze the complex reactions in the West to Tagore's visits to Sweden and to Fyodor Dostoyevskiy's novel Crime and Punishment. This leads to more general conclusions on communication between cultures.