23 results for RESEARCH SCIENTIFIC

at Universidad Politécnica de Madrid


Relevance:

40.00%

Publisher:

Abstract:

Technofusion is the scientific and technical installation for fusion research in Spain, based on three pillars: it is a facility open to European users; it provides instrumentation not accessible to small research groups; and it is designed to be closely coordinated with the European Fusion Program. With a budget of 80-100 M€ over five years, several top laboratories will be constructed.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the development of a scientific-technological knowledge transfer model in Mexico, as a means to strengthen the limited relations between the scientific and industrial environments. The proposal is based on a case-study analysis of eight organizations (research centers and firms) with varying degrees of skill in the practice of scientific-technological knowledge transfer. The analysis highlights the synergistic use of each organization's organizational and technological capabilities as a means of identifying the knowledge transfer mechanisms best suited to establishing cooperative processes and achieving results in R&D and innovation activities.

Relevance:

30.00%

Publisher:

Abstract:

The Bioinstrumentation Laboratory belongs to the Centre for Biomedical Technology (CTB) of the Technical University of Madrid; its main objective is to provide the scientific community with devices and techniques for the characterization of micro- and nanostructures and, consequently, for finding their best biomedical applications. Hyperthermia (from the Greek for "overheating") is defined as the phenomenon that occurs when a body is exposed to an energy-generating source that produces a rise in temperature (42-45 ºC) for a given time [1]. Specifically, the hyperthermia methods used in the Bioinstrumentation Laboratory aim to develop thermal therapies, some of them using different kinds of nanoparticles, to kill cancer cells while reducing damage to healthy tissue. Optical hyperthermia is based on noble-metal nanoparticles and laser irradiation. These nanoparticles have immense potential for the development of cancer therapies on account of their Surface Plasmon Resonance (SPR) enhanced light scattering and absorption. In a short period of time the absorbed light is converted into localized heat, which can be exploited to heat tumor cells and induce cell death [2]. The laboratory has an optical hyperthermia device based on a continuous-wave laser, used to kill glioblastoma cell lines (1321N1) in the presence of gold nanorods (Figure 1a). The laser wavelength is 808 nm, because light penetrates tissue more deeply in the near-infrared region. The first optical hyperthermia results show that laser irradiation produces cell death in experimental glioblastoma samples containing gold nanorods, but is unable to decrease the viability of cancer cells in samples without the nanorods (Figure 1b) [3].

Magnetic hyperthermia is generated through changes in the magnetic induction of magnetic nanoparticles (MNPs) embedded in a viscous medium. Figure 2 shows a schematic design of the AC induction hyperthermia device for magnetic fluids, manufactured at the Bioinstrumentation Laboratory. The first block comprises two stages: signal selection, with frequency adjustable from 9 kHz to 2 MHz, and a linear output of up to 1500 W. The second block is where the magnetic field is generated (5 mm, 10 turns). Finally, the third block is control software in which the user sets the initial parameters and which also shows the temperature response of the MNPs to the applied magnetic field [4-8]. The Bioinstrumentation Laboratory, in collaboration with the Mexican company MRI-DT, has recently started a new research line on Nuclear Magnetic Resonance hyperthermia, sustained by patent US 7,423,429 B2 owned by that company. This investigation is based on using clinical MRI equipment not only for diagnosis but also for therapy [9]. The idea rests on two main facts: magnetic resonance imaging can cause focal heating [10], and healthy and cancer cells differ in resonant frequency [11]. To heat only the cancer cells when the whole body is irradiated, the specific resonant frequency of the target must be determined from the spectra of the area of interest. A special RF pulse sequence is then applied to produce a fast excitation and relaxation mechanism that raises the temperature of the tumor, causing cell death or a metabolic malfunction that stops cell division.
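As an illustration of how such magnetic-heating experiments are commonly quantified (the abstract does not state the laboratory's own analysis method), the specific absorption rate (SAR) of a nanoparticle suspension can be estimated from the initial slope of its temperature curve. The parameter values below are hypothetical:

```python
# Hypothetical sketch: estimating the specific absorption rate (SAR) of a
# magnetic-nanoparticle suspension from its initial temperature rise under
# an AC field, SAR = (c_fluid / m_Fe) * dT/dt.

def estimate_sar(times_s, temps_c, c_fluid=4185.0, mass_fraction_fe=0.005):
    """Least-squares slope of T(t) over the initial interval, scaled to W/kg Fe.

    times_s: seconds; temps_c: degrees Celsius; c_fluid: J/(kg K) (water);
    mass_fraction_fe: kg of iron per kg of fluid (illustrative value).
    """
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_temp = sum(temps_c) / n
    num = sum((t - mean_t) * (T - mean_temp) for t, T in zip(times_s, temps_c))
    den = sum((t - mean_t) ** 2 for t in times_s)
    slope = num / den  # K/s
    return c_fluid * slope / mass_fraction_fe  # W per kg of Fe

# Example: a 0.02 K/s initial heating rate in a 0.5 wt% ferrofluid
sar = estimate_sar([0, 10, 20, 30], [37.0, 37.2, 37.4, 37.6])
```

A real measurement would also correct for heat losses to the environment; this sketch only shows the initial-slope arithmetic.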

Relevance:

30.00%

Publisher:

Abstract:

At present, almost all map libraries on the Internet are image collections generated by digitizing early maps. These graphics files give researchers access to historical cartographic information, bearing in mind that its quality depends on factors such as the accuracy of the digitization process and proprietary constraints (e.g. visualization and resolution, downloading options, copyright, use restrictions). In most cases, access to these map libraries is useful only as a first approach: the maps cannot be used for scientific work because of the sparse tools available to measure, match, analyze and/or combine those resources with other kinds of cartography. This paper presents a method to enrich virtual map rooms and provide historians and other professionals with a tool that lets them make the most of map libraries in the digital era.
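As a sketch of the kind of measurement tool the paper calls for (not the paper's actual method), a scanned map can be made measurable by fitting an affine pixel-to-map transform from ground control points. The coordinates below are fictitious:

```python
# Hypothetical sketch: fitting an affine transform (pixel -> map coordinates)
# from ground control points via least squares, a basic step towards
# measuring and combining a scanned early map with modern cartography.
import numpy as np

def fit_affine(pixel_pts, map_pts):
    """Solve [x', y'] = A @ [x, y, 1] for the 2x3 affine matrix A."""
    src = np.hstack([np.asarray(pixel_pts, float),
                     np.ones((len(pixel_pts), 1))])
    dst = np.asarray(map_pts, float)
    A, *_ = np.linalg.lstsq(src, dst, rcond=None)
    return A.T  # shape (2, 3)

def apply_affine(A, pt):
    x, y = pt
    return tuple(A @ np.array([x, y, 1.0]))

# Three control points: image corners matched to invented map coordinates
A = fit_affine([(0, 0), (1000, 0), (0, 800)],
               [(430000.0, 4480000.0),
                (431000.0, 4480000.0),
                (430000.0, 4479200.0)])
```

With more than three control points, the least-squares fit also yields residuals that indicate how accurate the digitization is.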

Relevance:

30.00%

Publisher:

Abstract:

A Kuhnian approach to research assessment requires us to consider that the important scientific breakthroughs that drive scientific progress are infrequent, and that the progress of science does not depend on normal research. Consequently, indicators of research performance based on the total number of papers do not accurately measure scientific progress. Similarly, the universities with the best reputations for scientific progress differ widely from other universities in the scale of their investment in research and in their higher concentrations of outstanding scientists, but less so in the total number of papers or citations. This study argues that indicators for the 1% high-citation tail of the citation distribution reveal the contribution of universities to the progress of science and provide quantifiable justification for the large investments in research made by elite research universities. In this tail, which follows a power law, the number of the less frequent, highly cited, important breakthroughs can be predicted from the frequencies of papers in the upper part of the tail. This study quantifies the false impression of excellence produced by multinational papers and by other types of papers that do not contribute to the progress of science. Many such papers are concentrated in, and dominate, lists of highly cited papers, especially at lower-ranked universities. The h-index obscures the differences between higher- and lower-ranked universities because the proportion of h-core papers in the 1% high-citation tail is not proportional to the value of the h-index.
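The contrast the study draws can be illustrated with toy citation counts: two universities may share the same h-index while differing sharply in their high-citation tails. This is purely illustrative; the study's actual indicators and percentile thresholds are more elaborate:

```python
# Illustrative sketch (not the study's code): the same h-index can hide very
# different high-citation tails.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cs = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cs, start=1):
        if c >= i:
            h = i
    return h

def tail_count(citations, threshold):
    """Papers at or above a citation threshold (e.g. a global top-1%
    percentile supplied externally)."""
    return sum(1 for c in citations if c >= threshold)

# Two toy universities: identical h-index, very different tails
uni_a = [900, 500, 300, 120, 80, 60, 40, 10, 5, 2]
uni_b = [90, 80, 70, 60, 50, 40, 30, 8, 4, 1]
```

Here both toy profiles yield h = 8, yet with a (hypothetical) top-1% threshold of 100 citations, `uni_a` has four tail papers and `uni_b` has none.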

Relevance:

30.00%

Publisher:

Abstract:

The integration of scientific knowledge about possible climate change impacts on water resources has direct implications for the way water policies are implemented and evolve. This is particularly true for the various technical steps embedded in EU Water Framework Directive river basin management planning, such as risk characterisation, monitoring, the design and implementation of action programmes, and evaluation of the achievement of the "good status" objective (in 2015). The need to incorporate climate change considerations into the implementation of EU water policy is currently being discussed with a wide range of experts and stakeholders at EU level. Research is also ongoing, striving to support policy developments and examining how scientific findings and recommendations could best be taken on board by policy-makers and water managers in the forthcoming years. This paper provides a snapshot of policy discussions about climate change in the context of WFD river basin management planning, and of specific advances in related EU-funded research projects. Perspectives for strengthening links between the scientific and policy-making communities in this area are also highlighted.

Relevance:

30.00%

Publisher:

Abstract:

Looking at the history of electricity and electromagnetism in Spain, we find that until the 20th century the most important Spanish researchers generally worked outside official institutions and stable research groups [1][2]. In the 20th century most scientific research moved into stable research institutions and universities, and the most important electromagnetism research centres in Spain are located in the physics faculties of the major universities, the National Scientific Research Council (CSIC) and the School for Telecommunication Engineering created in 1923. The greatest impulse to research in antennas and radiowave propagation came after 1960, culminating in the first national URSI conference in 1980. Since then, the number of research groups and the collaboration between them have grown continuously, as have relations with industry. When Spain joined the European research organizations (COST, ERC...) and the European Union in 1985, research support and participation in European research structures grew rapidly. In the antenna design field there are some specializations, although most groups have done specific projects in almost all areas of antenna analysis and design. Here we have selected the most important and characteristic area for each research group and institution. The easiest way to classify antenna research is to distinguish antenna analysis, design and measurement; beyond that, the frequency-band technology, the type of antennas and the related circuits provide a good criterion for describing the variety and specialization of the groups' work.

Relevance:

30.00%

Publisher:

Abstract:

In the domain of eScience, investigations are increasingly collaborative. Most scientific and engineering domains benefit from building on the outputs of other research, by sharing information to reason over and data to incorporate in the modelling task at hand. This raises the need to preserve and share entire eScience workflows and processes for later reuse: we must define which information is to be collected, create means to preserve it, and develop approaches to enable and validate the re-execution of a preserved process. This includes, and goes beyond, preserving the data used in the experiments, as the process underlying their creation and use is essential. This tutorial therefore introduces the problem domain and discusses solutions for the curation of eScience processes.

Relevance:

30.00%

Publisher:

Abstract:

Nanotechnology represents an area of particular promise and significant opportunity across multiple scientific disciplines. Ongoing nanotechnology research ranges from the characterization of nanoparticles and nanomaterials to the analysis and processing of experimental data, seeking correlations between nanoparticles and their functionalities and side effects. Due to their special properties, nanoparticles are suitable for cellular-level diagnostics and therapy, offering numerous applications in medicine, e.g. the development of biomedical devices, tissue repair, drug delivery systems and biosensors. In nanomedicine, recent studies are producing large amounts of structural and property data, highlighting the role of computational approaches in information management. While in vitro and in vivo assays are expensive, the cost of computing is falling. Furthermore, improvements in the accuracy of computational methods (e.g. data mining, knowledge discovery, modeling and simulation) have enabled effective tools to automate the extraction, management and storage of these vast data volumes. Since this information is widely distributed, one major issue is how to locate and access data where it resides (which also poses data-sharing limitations). The novel discipline of nanoinformatics addresses the information challenges related to nanotechnology research. In this paper, we summarize the needs and challenges in the field and present an overview of extant initiatives and efforts.

Relevance:

30.00%

Publisher:

Abstract:

New digital artifacts are emerging in data-intensive science. For example, scientific workflows are executable descriptions of scientific procedures that define the sequence of computational steps in an automated data analysis, supporting reproducible research and the sharing and replication of best-practice and know-how through reuse. Workflows are specified at design time and interpreted through their execution in a variety of situations, environments, and domains. Hence it is essential to preserve both their static and dynamic aspects, along with the research context in which they are used. To achieve this, we propose the use of multidimensional digital objects (Research Objects) that aggregate the resources used and/or produced in scientific investigations, including workflow models, provenance of their executions, and links to the relevant associated resources, along with the provision of technological support for their preservation and efficient retrieval and reuse. In this direction, we specified a software architecture for the design and implementation of a Research Object preservation system, and realized this architecture with a set of services and clients, drawing together practices in digital libraries, preservation systems, workflow management, social networking and Semantic Web technologies. In this paper, we describe the backbone system of this realization, a digital library system built on top of dLibra.
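The aggregation idea behind Research Objects can be sketched as a minimal manifest. The field names below are illustrative only and do not follow the actual Research Object specification or the dLibra API:

```python
# Hypothetical sketch of a minimal Research Object manifest: an aggregation
# of a workflow, its execution provenance, and linked resources.
import json

def make_research_object(ro_id, creator, resources):
    """Build a manifest aggregating (uri, role) resource pairs."""
    return {
        "id": ro_id,
        "creator": creator,
        "aggregates": [{"uri": uri, "role": role} for uri, role in resources],
    }

ro = make_research_object(
    "ro-0001",
    "example-lab",
    [("workflows/analysis.t2flow", "workflow"),
     ("provenance/run-2013-05-01.ttl", "provenance"),
     ("data/input.csv", "dataset")],
)
serialized = json.dumps(ro, indent=2)  # what a preservation service might store
```

The point of the structure is that the workflow model, the provenance of its runs, and the associated data travel together as one preservable, retrievable object.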

Relevance:

30.00%

Publisher:

Abstract:

Research on assessing the effects of conservation/restoration treatments on stone has grown significantly in recent years, focusing on the early observation of decay caused by the application of these treatments. In the case of archaeological sites, however, research is still scarce and few studies on the subject have been published. Restoration, like everything else, has changed with trends, mainly guided by the release of new products and technologies: an experimental field where scientific pre-evaluation of the suitability, efficacy and durability of treatments is not always conducted. Some efforts have been made to address this problem in the architectural field, where functional needs and technical requirements force clear standards to be set. Unfortunately, archaeological sites, unlike historic buildings, have specific features that preclude extrapolating those results. A critical review of methodologies, products and restoration materials is necessary, coupled with deeper research into the degradation mechanisms these treatments cause in the mid and long term. The aim of this paper is to introduce research on the above issues using Merida as a case study.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a task-oriented approach to telemanipulation for maintenance in large scientific facilities, with specific focus on the particle accelerator facilities at the European Organization for Nuclear Research (CERN) in Geneva, Switzerland and the GSI Helmholtz Centre for Heavy Ion Research (GSI) in Darmstadt, Germany. It examines how telemanipulation can be used in these facilities and reviews how this differs from the representation of telemanipulation tasks in the literature. It provides methods to assess and compare telemanipulation procedures, as well as a test suite to compare telemanipulators themselves from a dexterity perspective. It presents a formalisation of telemanipulation procedures into a hierarchical model which can then be used as a basis to aid maintenance engineers in assessing tasks for telemanipulation, and as a basis for future research. The model introduces the new concept of Elemental Actions as the building blocks of telemanipulation movements, and incorporates the dependent factors for procedures at a higher level of abstraction. To gain insight into realistic tasks performed by telemanipulation systems in both industrial and research environments, a survey of teleoperation experts is presented. Analysis of the responses concludes that the robotics community needs physical benchmarking tests geared towards evaluating and comparing the dexterity of telemanipulators. A three-stage test suite is presented, designed to allow maintenance engineers to assess different telemanipulators for dexterity. It incorporates general characteristics of the system, a method to compare the kinematic reachability of multiple telemanipulators, and physical test setups to assess dexterity both qualitatively and measurably, using performance metrics.

Finally, experimental results are provided for the application of the proposed test suite to two telemanipulation systems, one from a research setting and the other at CERN. The procedure performed is described, comparisons between the two systems are discussed, and input from the expert operator of the CERN system is provided.
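The hierarchical decomposition described above, with Elemental Actions as the leaves of a procedure tree, can be sketched as follows. The procedure and action names are invented for illustration and are not taken from the thesis:

```python
# Illustrative sketch: a telemanipulation procedure as a tree whose leaves
# are Elemental Actions (names hypothetical).

def count_elemental_actions(node):
    """An Elemental Action is a leaf of the procedure tree."""
    if not node["children"]:
        return 1
    return sum(count_elemental_actions(c) for c in node["children"])

procedure = {
    "name": "replace seal",
    "children": [
        {"name": "unbolt flange",
         "children": [{"name": "grasp", "children": []},
                      {"name": "rotate", "children": []}]},
        {"name": "extract seal",
         "children": [{"name": "grasp", "children": []},
                      {"name": "pull", "children": []}]},
    ],
}
n_actions = count_elemental_actions(procedure)
```

A maintenance engineer could use such counts, together with per-action difficulty factors, to compare candidate procedures before committing to a telemanipulator.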

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a corpus-based analysis of the humanizing metaphor and argues that constitutive metaphor in science and technology can be highly metaphorical and active. The study, grounded in Lakoff's theory of metaphor and in Langacker's relational networks, consists of two phases. First, Earth Science metaphorical terms were extracted from databases and dictionaries and then contextualized, by means of the "Wordsmith" tool, in a digitalized corpus created to establish their productivity. Second, the terms were classified to disclose the main conceptual metaphors underlying them, and the mappings and relational networks of the metaphor were described. The results confirm the systematicity and productivity of the metaphor in this field, show that the metaphoricity of scientific terms is gradable, and support the view that Earth Science metaphors are created not only in terms of concrete salient properties and attributes, but also of abstract anthropocentric projections.
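The contextualization step, retrieving each occurrence of a term with its surrounding words as a concordancer such as "Wordsmith" does, can be sketched as keyword-in-context (KWIC) retrieval. This toy example is not the study's pipeline:

```python
# Illustrative sketch of keyword-in-context (KWIC) retrieval over a
# tokenized corpus, the basic operation of a concordancer.

def kwic(corpus_tokens, keyword, window=3):
    """Return (left context, keyword, right context) for each match."""
    hits = []
    for i, tok in enumerate(corpus_tokens):
        if tok.lower() == keyword.lower():
            left = corpus_tokens[max(0, i - window):i]
            right = corpus_tokens[i + 1:i + 1 + window]
            hits.append((left, tok, right))
    return hits

# "mouth" of a river is a typical humanizing Earth Science metaphor
tokens = "the mouth of the river feeds the head of the valley".split()
matches = kwic(tokens, "mouth", window=2)
```

Counting such matches across a corpus is one simple way to estimate how productive a metaphorical term is.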

Relevance:

30.00%

Publisher:

Abstract:

Nanotechnology is a research area of recent development that deals with the manipulation and control of matter with dimensions ranging from 1 to 100 nanometers. At the nanoscale, materials exhibit singular physical, chemical and biological phenomena, very different from those manifested at the conventional scale. In medicine, nanosized compounds and nanostructured materials offer improved drug targeting and efficacy with respect to traditional formulations, and reveal novel diagnostic and therapeutic properties. Nevertheless, the complexity of information at the nano level is much higher than at the conventional biological levels (from populations down to the cell), so any nanomedical research workflow inherently demands advanced information management. Unfortunately, Biomedical Informatics (BMI) has not yet provided the necessary framework to deal with such information challenges, nor adapted its methods and tools to the new research field. In this context, the novel area of nanoinformatics aims to build new bridges between medicine, nanotechnology and informatics, allowing the application of computational methods to solve informational issues at the wide intersection between biomedicine and nanotechnology.

The above observations determine the context of this doctoral dissertation, which focuses on analyzing the nanomedical domain in depth and on developing nanoinformatics strategies and tools to map across disciplines, data sources, computational resources, and information extraction and text mining techniques, with the final goal of leveraging available nanomedical data. Through real-life case studies, the author analyzes research tasks in nanomedicine that require, or could benefit from, the use of nanoinformatics methods and tools, illustrating the present drawbacks and limitations of BMI approaches when dealing with data belonging to the nanomedical domain. Three scenarios, comparing the biomedical and nanomedical contexts, are discussed as examples of activities researchers perform while conducting their research: i) searching the Web for data sources and computational resources supporting their research; ii) searching the literature for experimental results and publications related to their research; and iii) searching clinical trial registries for clinical results related to their research. These activities depend on informatics tools and services such as web browsers, citation databases indexing the biomedical literature, and web-based clinical trial registries, respectively. For each scenario, this document provides a detailed analysis of the potential information barriers that could hamper the successful development of the research tasks in both fields (biomedicine and nanomedicine), emphasizing the existing challenges for nanomedical research, where the major barriers have been found. The author illustrates how the application of BMI methodologies to these scenarios proves successful in the biomedical domain, whereas the same methodologies present severe limitations when applied to the nanomedical context.

To address these limitations, the author proposes an original nanoinformatics approach specifically designed to deal with the special characteristics of information at the nano level. The approach consists of an in-depth analysis of the scientific literature and of available clinical trial registries to extract relevant information about experiments and results in nanomedicine (textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.), followed by the development of mechanisms to automatically structure and analyze this information. This analysis resulted in the generation of a gold standard, a manually annotated training and test set, which was applied to the automatic classification of clinical trial summaries, distinguishing studies focused on nanodrugs and nanodevices from those aimed at testing traditional pharmaceuticals. The present work aims to provide the necessary methods for organizing, curating, filtering and validating existing nanomedical data on a scale suitable for decision-making. Similar analyses for other nanomedical research tasks would help to detect which nanoinformatics resources are required to meet current goals in the field, and to generate densely populated, machine-interpretable reference datasets from the literature and other unstructured sources for testing novel algorithms and inferring new valuable information for nanomedicine.
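The classification task the gold standard supports can be illustrated with a naive keyword baseline. The lexicon and example summaries below are invented, and this is not the thesis's actual classifier, only a sketch of what the annotated reference set would be used to evaluate:

```python
# Hypothetical baseline: flagging clinical-trial summaries as
# nanomedicine-related via a small keyword lexicon, then scoring the
# baseline against gold-standard labels.
NANO_TERMS = {"nanoparticle", "nanoparticles", "liposomal", "nanoshell",
              "nanotube", "quantum dot", "dendrimer"}

def is_nano_trial(summary):
    """True if any lexicon term appears in the summary (case-insensitive)."""
    text = summary.lower()
    return any(term in text for term in NANO_TERMS)

def evaluate(labeled_summaries):
    """Accuracy of the keyword baseline against gold-standard labels."""
    correct = sum(is_nano_trial(s) == label for s, label in labeled_summaries)
    return correct / len(labeled_summaries)

# Invented gold-standard examples: (summary, is_nanomedicine)
gold = [("Phase II trial of liposomal doxorubicin in breast cancer", True),
        ("Aspirin versus placebo for primary prevention", False),
        ("Gold nanoshell photothermal ablation of solid tumors", True)]
acc = evaluate(gold)
```

A learned classifier trained on the annotated set would replace the fixed lexicon; the gold standard's role is the same in both cases, supplying the labels against which accuracy is measured.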

Relevance:

30.00%

Publisher:

Abstract:

Carbon (C) and nitrogen (N) process-based models are important tools for estimating and reporting greenhouse gas emissions and changes in soil C stocks. These models need continuous evaluation, development and adaptation to improve scientific understanding, national inventories and the assessment of mitigation options across the world. To date, much of the information needed to describe processes such as transpiration, photosynthesis, plant growth and maintenance, above- and below-ground carbon dynamics, decomposition and nitrogen mineralization in ecosystem models remains inaccessible to the wider community, being stored within model source code or held internally by modelling teams. Here we describe the Global Research Alliance Modelling Platform (GRAMP), a web-based platform to link researchers with appropriate datasets, models and training material. It will provide access to model source code and an interactive platform for researchers to form a consensus on existing methods and to synthesize new ideas, helping to advance progress in this area. The platform will eventually support a variety of models, but to trial the platform and test its architecture and functionality it was piloted with variants of the DNDC model. The intention is to form a worldwide collaborative network (a virtual laboratory) via an interactive website with access to models and best-practice guidelines; appropriate datasets for testing, calibrating and evaluating models; online tutorials; and links to modelling and data-provider research groups and their associated publications. A graphical user interface has been designed to view the model development tree and to access all of the above functions.