847 results for Technical advances


Relevance: 60.00%

Abstract:

A large body of published work shows that proton (hydrogen 1 [¹H]) magnetic resonance (MR) spectroscopy has evolved from a research tool into a clinical neuroimaging modality. Herein, the authors present a summary of brain disorders in which MR spectroscopy has an impact on patient management, together with a critical consideration of common data acquisition and processing procedures. The article documents the impact of ¹H MR spectroscopy in the clinical evaluation of disorders of the central nervous system. The clinical usefulness of ¹H MR spectroscopy has been established for brain neoplasms, neonatal and pediatric disorders (hypoxia-ischemia, inherited metabolic diseases, and traumatic brain injury), demyelinating disorders, and infectious brain lesions. The growing list of disorders for which ¹H MR spectroscopy may contribute to patient management extends to neurodegenerative diseases, epilepsy, and stroke. To facilitate expanded clinical acceptance and standardization of MR spectroscopy methodology, guidelines are provided for data acquisition and analysis, quality assessment, and interpretation. Finally, the authors offer recommendations to expedite the use of robust MR spectroscopy methodology in the clinical setting, including incorporation of technical advances on clinical units. © RSNA, 2014. Online supplemental material is available for this article.

Relevance: 60.00%

Abstract:

The advent of single molecule fluorescence microscopy has allowed experimental molecular biophysics and biochemistry to transcend traditional ensemble measurements, in which the behavior of individual proteins could not be precisely sampled. The recent explosion in popularity of new super-resolution and super-localization techniques, coupled with technical advances in optical designs and fast, highly sensitive cameras with single-photon sensitivity and millisecond time resolution, has made it possible to track key motions, reactions, and interactions of individual proteins with high temporal resolution and with spatial resolution well beyond the diffraction limit. Within the purview of membrane proteins and ligand-gated ion channels (LGICs), these outstanding advances in single molecule microscopy allow for the direct observation of discrete biochemical states and their fluctuation dynamics. Such observations are fundamentally important for understanding molecular-level mechanisms governing these systems. Examples reviewed here include the effects of allostery on the stoichiometry of ligand binding in the presence of fluorescent ligands; the observation of subdomain partitioning of membrane proteins due to microenvironment effects; and the use of single particle tracking experiments to elucidate characteristics of membrane protein diffusion and to directly measure the thermodynamic properties that govern the free energy landscape of protein dimerization. The review of such characteristic topics represents a snapshot of efforts to push the boundaries of fluorescence microscopy of membrane proteins to the absolute limit.
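
As an illustration of the single particle tracking analyses mentioned above, the following is a minimal sketch, not drawn from the reviewed work, of how a lateral diffusion coefficient can be estimated from a 2-D trajectory via the time-averaged mean squared displacement, assuming simple Brownian motion (MSD(τ) = 4Dτ); the simulated track, frame interval and diffusion coefficient are hypothetical.

```python
import numpy as np

def mean_squared_displacement(track, max_lag):
    """Time-averaged MSD of a single 2-D trajectory (shape: n_frames x 2)."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = track[lag:] - track[:-lag]          # displacements at this lag
        msd[lag - 1] = np.mean(np.sum(disp**2, axis=1))
    return msd

# Hypothetical example: simulate a Brownian track, then recover D from MSD = 4*D*tau.
rng = np.random.default_rng(0)
dt, d_true, n_frames = 0.01, 0.05, 2000            # frame interval (s), D (um^2/s), frames
steps = rng.normal(scale=np.sqrt(2 * d_true * dt), size=(n_frames, 2))
track = np.cumsum(steps, axis=0)                    # positions in um

lags = np.arange(1, 21)
msd = mean_squared_displacement(track, lags[-1])
slope = np.polyfit(lags * dt, msd, 1)[0]            # linear fit over short lags
print(f"Estimated D = {slope / 4:.3f} um^2/s (true {d_true})")
```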

Relevance: 60.00%

Abstract:

African trypanosomes, which divide their life cycle between mammals and tsetse flies, are confronted with environments that differ widely in temperature, nutrient availability and host responses to infection. In particular, since trypanosomes cannot predict when they will be transmitted between hosts, it is vital for them to be able to sense and adapt to their milieu. Thanks to technical advances, significant progress has been made in understanding how the parasites perceive external stimuli and react to them. There is also a growing awareness that trypanosomes use a variety of mechanisms to exchange information with each other, thereby enhancing their chances of survival.

Relevance: 60.00%

Abstract:

In the classical paradigm, the biological effects of ionizing radiation are attributed to DNA damage induced in each irradiated cell. The demonstration of radiation-induced bystander effects has brought about a profound change in the current conception of radiobiology. Bystander effects are effects of radiation that occur in cells that have not themselves been irradiated. Several technical advances, in particular the use of microbeams, have made it possible to study these effects in vitro. Two routes are known by which irradiated cells can communicate with non-irradiated ones: through specialized junctions (gap junctions) that connect the cytoplasms of adjacent cells, and through the secretion of soluble factors into the extracellular medium. These factors include several cytokines as well as reactive oxygen and nitrogen species. The signaling pathways in the affected cells involve, in particular, the activation of mitogen-activated protein kinases (MAPK) and of the transcription factor NF-κB, with downstream induction of cyclooxygenase 2, nitric oxide synthase 2 and NAD(P)H oxidase. Bystander effects can cause point mutations and epigenetic changes. The effects on signaling pathways can persist indefinitely and can even be transmitted to progeny. Paradoxically, under certain conditions bystander effects can be adaptive, that is, they render the affected cells more resistant to radiation. Adaptation requires protein synthesis and improves the cell's capacity to repair DNA and to withstand oxidative stress. Bystander effects have also been demonstrated in vivo. They may therefore have important implications in radiotherapy, both for improving therapeutic efficacy and for reducing the incidence of adverse effects. Likewise, a better understanding of these effects may influence international radiation protection standards.

Relevance: 60.00%

Abstract:

In the middle of the twentieth century, Rafael Lorente de Nó (1902–1990) introduced the fundamental concept of the "elementary cortical unit of operation," proposing that the cerebral cortex is formed of small cylinders containing vertical chains of neurons (Lorente de Nó, 1933, 1938). On the basis of this idea, the hypothesis of the columnar organization of the cerebral cortex was later developed, primarily following the physiological and anatomical studies of Vernon Mountcastle, David Hubel, Torsten Wiesel, János Szentágothai, Ted Jones, and Pasko Rakic (for a review of these early studies, see Mountcastle, 1998). The columnar organization hypothesis is currently the most widely adopted explanation of the cortical processing of information, making its study of potential interest to any researcher interested in this tissue, in both healthy and pathological states. However, it is frequently remarked that the nomenclature surrounding this hypothesis often generates problems, as the term "column" is used freely and promiscuously to refer to multiple, distinguishable entities, such as cellular or dendritic minicolumns or afferent macrocolumns, with respective diameters of <50 and 200–500 μm. Another problem is the degree to which the classical criteria (shared response properties, shared input, and common output) may need to be modified and, if so, how. Moreover, similar problems arise when we consider the need to define area-specific and species-specific variations. Finally, and what is more an ultimate goal than a problem, it is still necessary to achieve a better fundamental understanding of what columns are and how they are used in cortical processes. Accordingly, it is now very important to translate recent technical advances and new findings in the neurosciences into practical applications for neuroscientists, clinicians, and for those interested in comparative anatomy and brain evolution.

Relevance: 60.00%

Abstract:

The hypothesis of this thesis is that there is a direct influence between the technique developed during the second period of the nineteenth century and the substantial change in architectural concepts and theory that took place in those same years. That influence generated new constructive models which would become the basis of modern architecture. To confirm the hypothesis, three main objectives are set out, corresponding to the three main chapters of the thesis. First, the theoretical conditions of the architectural debate between the dates studied are established. The concepts of model and "style" are analysed to assess whether a change actually occurred, together with the question of whether architects were aware of the need for a new architecture that would respond to the functional demands of progress. To verify that this change took place at every level, to the point of redefining the constructive model, a practical example is chosen that meets the conditions needed to support or refute the thesis, and it is investigated whether the change occurs there. Next, the state of the technique and of technological advances in the period studied is analysed. It is important to establish that there was genuine knowledge and awareness of change among the theorists and the architects who were building at the time. Confirming that this connection exists is vital for the research; to that end, the connections between architecture and engineering (or technological progress) are examined in depth in order to understand the implication of each in the other. For this purpose, the periodical publications of the era have been studied, above all the most relevant and the first to be published, La revue générale de l'architecture, on which this work relies heavily. It is a monthly record of changes, advances and debates, and it circulated very widely: every teacher, architect and engineer of the time had easy and direct access to it. Finally, starting from an ideal construction projected by one of the great theoretical architects of the period, the thesis reflects on whether that architecture really implies an understanding and awareness of the new architectural models defined by a technique in progress, or whether it responds solely to an architectural utopia like so many others sketched earlier by the great theorists. To define the geometry of this structural element, a three-dimensional model was built that allows the structure to be calculated by a finite element procedure. The purpose of this calculation is not only to establish whether the structure was viable at the time, but also to understand whether the definition of these original structural elements implied a new conception of three-dimensional structures, which would mean that a substantial change in architectural theory was beginning to germinate, in the search for a new constructive model derived from technical advances. It is thus shown that the constructive model changed definitively into one that is far more versatile and, above all, highly adaptable, combined in turn with a proliferation of patents that seek a standardization of the constructive model but are not, in any case, a model in themselves. As a final conclusion, it is argued that a great structural intuition already existed at the time: although mathematical calculation was still relatively new, load paths, combinations of materials and new forms were handled instinctively, and these would help create the models of the following years and form the basis of subsequent numerical calculation.
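
The finite element calculation described above is not reproduced here, since the geometry of the projected structural element is not given in this abstract. As a rough, hypothetical sketch of the core step such a calculation involves, the following assembles member stiffness matrices for a small pin-jointed truss into a global system K u = f and solves for the nodal displacements; the node coordinates, member properties, supports and loads are all invented for illustration.

```python
import numpy as np

def truss_solve(nodes, elements, loads, fixed_dofs, E=210e9, A=1e-3):
    """Minimal 2-D pin-jointed truss: assemble K, apply supports, solve K u = f."""
    ndof = 2 * len(nodes)
    K = np.zeros((ndof, ndof))
    for i, j in elements:
        dx, dy = nodes[j] - nodes[i]
        L = np.hypot(dx, dy)
        c, s = dx / L, dy / L
        k_local = (E * A / L) * np.outer([-c, -s, c, s], [-c, -s, c, s])
        dofs = [2 * i, 2 * i + 1, 2 * j, 2 * j + 1]
        K[np.ix_(dofs, dofs)] += k_local            # scatter member stiffness into K
    f = np.zeros(ndof)
    for node, (fx, fy) in loads.items():
        f[2 * node], f[2 * node + 1] = fx, fy
    free = [d for d in range(ndof) if d not in fixed_dofs]
    u = np.zeros(ndof)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
    return u

# Hypothetical three-node triangular truss, loaded at the apex.
nodes = np.array([[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]])    # coordinates in m
elements = [(0, 1), (1, 2), (0, 2)]
loads = {2: (0.0, -10e3)}                                  # 10 kN downward at the apex
fixed = {0, 1, 3}                                          # pin at node 0, roller at node 1
u = truss_solve(nodes, elements, loads, fixed)
print("Apex displacement (m):", u[4], u[5])
```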

Relevance: 60.00%

Abstract:

Synaptic plasticity is the dynamic regulation of the strength of synaptic communication between nerve cells. It is central to neuronal development as well as experience-dependent remodeling of the adult nervous system as occurs during memory formation. Aberrant forms of synaptic plasticity also accompany a variety of neurological and psychiatric diseases, and unraveling the biological basis of synaptic plasticity has been a major goal in neurobiology research. The biochemical and structural mechanisms underlying different forms of synaptic plasticity are complex, involving multiple signaling cascades, reconfigurations of structural proteins and the trafficking of synaptic proteins. As such, proteomics should be a valuable tool in dissecting the molecular events underlying normal and disease-related forms of plasticity. In fact, progress in this area has been disappointingly slow. We discuss the particular challenges associated with proteomic interrogation of synaptic plasticity processes and outline ways in which we believe proteomics may advance the field over the next few years. We pay particular attention to technical advances being made in small sample proteomics and the advent of proteomic imaging in studying brain plasticity.

Relevance: 60.00%

Abstract:

In the absence of external stimuli, the mammalian brain continues to display a rich variety of spontaneous activity. Such activity is often highly stereotypical, is invariably rhythmic, and can occur with periodicities ranging from a few milliseconds to several minutes. Recently, there has been a particular resurgence of interest in fluctuations in brain activity occurring at <0.1 Hz, commonly referred to as very slow or infraslow oscillations (ISOs). Whilst this is primarily due to the emergence of functional magnetic resonance imaging (fMRI) as a technique which has revolutionized the study of human brain dynamics, it is also a consequence of the application of full-band electroencephalography (fbEEG). Despite these technical advances, the precise mechanisms which lead to ISOs in the brain remain unclear. In a host of animal studies, one brain region that consistently shows oscillations at <0.1 Hz is the thalamus. Importantly, similar oscillations can also be observed in slices of isolated thalamic relay nuclei maintained in vitro. Here, we discuss the nature and mechanisms of these oscillations, paying particular attention to a potential role for astrocytes in their genesis. We also highlight the relationship between this activity and ongoing local network oscillations in the alpha (α; ~8–13 Hz) band, drawing clear parallels with observations made in vivo. Lastly, we consider the relevance of these thalamic ISOs to the pathological activity that occurs in certain types of epilepsy.
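
As a concrete illustration of how an infraslow (<0.1 Hz) component of the kind discussed above can be isolated from a recorded time series, here is a minimal sketch, not taken from the reviewed studies, using a zero-phase Butterworth band-pass filter; the sampling rate and synthetic signal are hypothetical.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 2.0                                     # Hz; e.g. a BOLD series with TR = 0.5 s (hypothetical)
t = np.arange(0, 1200, 1 / fs)               # 20 minutes of data

# Synthetic series: a 0.05 Hz infraslow oscillation buried in broadband noise.
x = np.sin(2 * np.pi * 0.05 * t) + np.random.randn(t.size)

# Zero-phase band-pass over the infraslow range (~0.01-0.1 Hz).
sos = butter(4, [0.01, 0.1], btype="bandpass", fs=fs, output="sos")
iso = sosfiltfilt(sos, x)
print("Variance of extracted ISO component:", iso.var())
```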

Relevance: 60.00%

Abstract:

Concerns over dwindling oil reserves, carbon dioxide emissions from fossil fuel sources and associated climate change are driving the urgent need for clean, renewable energy supplies. The conversion of triglycerides to biodiesel via catalytic transesterification remains an energetically efficient and attractive means to generate transportation fuel. However, current biodiesel manufacturing routes employing soluble alkali-based catalysts are very energy inefficient, producing copious amounts of contaminated water waste during fuel purification. Technical advances in catalyst and reactor design and the introduction of non-food-based feedstocks are thus required to ensure that biodiesel remains a key player in the renewable energy sector for the 21st century. This presentation will give an overview of some recent developments in the design of solid acid and base catalysts for biodiesel synthesis. A particular focus will be on the benefits of designing materials with interconnected hierarchical macro-mesoporous networks to enhance mass transport of viscous plant oils during reaction.
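
For reference, the catalytic transesterification mentioned above converts one triglyceride and three methanol molecules into three fatty acid methyl esters (FAME, the biodiesel product) and glycerol; writing the three fatty acid chains generically as R, the overall reaction can be sketched as:

```latex
% Requires amsmath. Overall transesterification of a triglyceride with methanol:
% 1 triglyceride + 3 methanol -> 3 fatty acid methyl esters (biodiesel) + glycerol.
\begin{equation*}
  \underbrace{\mathrm{C_3H_5(OOCR)_3}}_{\text{triglyceride}}
  + 3\,\mathrm{CH_3OH}
  \xrightarrow{\text{acid or base catalyst}}
  \underbrace{3\,\mathrm{RCOOCH_3}}_{\text{FAME}}
  + \underbrace{\mathrm{C_3H_5(OH)_3}}_{\text{glycerol}}
\end{equation*}
```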

Relevance: 60.00%

Abstract:

The combination of dwindling oil reserves and growing concerns over carbon dioxide emissions and associated climate change is driving the urgent development of clean, sustainable energy supplies. Biodiesel is non-toxic and biodegradable, with the potential for closed CO2 cycles and thus vastly reduced carbon footprints compared with petroleum fuels. However, current manufacturing routes employing soluble catalysts are very energy inefficient and produce copious amounts of contaminated water waste. This review highlights the significant progress made in recent years towards developing solid acid and base catalysts for biodiesel synthesis. Issues to be addressed in the future are also discussed, including the introduction of non-edible oil feedstocks, as well as technical advances in catalyst and reactor design to ensure that biodiesel remains a key player in the renewable energy sector for the 21st century.

Relevance: 60.00%

Abstract:

The combination of dwindling oil reserves and growing concerns over carbon dioxide emissions and associated climate change is driving the urgent development of clean, sustainable energy supplies. Biodiesel is a non-toxic and biodegradable fuel, with the potential for closed CO2 cycles and thus vastly reduced carbon footprints compared with petroleum. However, current manufacturing routes employing soluble catalysts are very energy inefficient, with their removal necessitating an energy-intensive separation to purify the biodiesel, which in turn produces copious amounts of contaminated aqueous waste. The introduction of non-food-based feedstocks and technical advances in heterogeneous catalyst and reactor design are required to ensure that biodiesel remains a key player in the renewable energy sector for the 21st century. Here we report on the development of tuneable solid acids and bases for biodiesel synthesis, which offer several process advantages by eliminating the quenching step and allowing operation in a continuous reactor. Significant progress has been made towards developing tuneable solid base catalysts for biodiesel synthesis, including Li/CaO [1], Mg-Al hydrotalcites [2] and calcined dolomite [3], which exhibit excellent activity for triglyceride transesterification. However, the effects of solid base strength on catalytic activity in biodiesel synthesis remain poorly understood, hampering material optimisation and commercial exploitation. To improve our understanding of the factors influencing solid base catalysts for biodiesel synthesis, we have applied a simple spectroscopic method for the quantitative determination of surface basicity which is independent of adsorption probes. Such measurements reveal how the morphology and basicity of MgO nanocrystals correlate with their biodiesel synthesis activity [4]. While diverse solid acids and bases have been investigated for TAG transesterification, the micro- and mesoporous nature of the catalyst systems investigated to date is not optimal for the diffusion of the bulky and viscous C16-C18 TAGs typical of plant oils. The final part of this presentation will address the benefits of designing porous networks comprising interconnected hierarchical macroporous and mesoporous channels (Figure 1) to enhance the mass-transport properties of viscous plant oils during biodiesel synthesis [5]. References: [1] R.S. Watkins, A.F. Lee, K. Wilson, Green Chem., 2004, 6, 335. [2] D.G. Cantrell, L.J. Gillie, A.F. Lee, K. Wilson, Appl. Catal. A, 2005, 287, 183. [3] C. Hardacre, A.F. Lee, J.M. Montero, L. Shellard, K. Wilson, Green Chem., 2008, 10, 654. [4] J.M. Montero, P.L. Gai, K. Wilson, A.F. Lee, Green Chem., 2009, 11, 265. [5] J. Dhainaut, J.-P. Dacquin, A.F. Lee, K. Wilson, Green Chem., 2010, 12, 296.

Relevance: 60.00%

Abstract:

Transconjunctival microincision vitrectomy surgery (MIVS) has grown increasingly popular among vitreoretinal surgeons over the last few years. Technical advances have led to the development of cutting-edge vitrectomy systems and instruments that have contributed significantly to the success of MIVS. The evolution of trocars has made the technique safer and more effective. In the hands of an experienced surgeon, microincision vitrectomy trocars offer a new range of applications that can redefine surgical practices and facilitate otherwise complex surgical techniques.

Relevance: 60.00%

Abstract:

This handbook gathers original papers on the experiences, innovations, reviews, technical advances and trends in the theory and practice of professionals in information science (LIS, documentation, information, communication, etc.). Its purpose is to serve as a dissemination tool, a discussion forum, a means of supporting professional development and continuing education, a tool for sharing experience, and a window onto the changes that occur in the professional environment.

Relevance: 60.00%

Abstract:

Mass spectrometry (MS)-based proteomics has seen significant technical advances during the past two decades, and mass spectrometry has become a central tool in many biosciences. Despite the popularity of MS-based methods, the handling of systematic non-biological variation in the data remains a common problem. This biasing variation can arise from several sources, ranging from sample handling to differences caused by the instrumentation. Normalization is the procedure that aims to account for this biasing variation and make samples comparable. Many normalization methods commonly used in proteomics have been adapted from the DNA-microarray field. Studies exist that compare normalization methods on proteomics data sets using various variability measures. However, a more thorough comparison is lacking, one that examines the quantitative and qualitative differences in the performance of the different normalization methods and their ability to preserve the true differential expression signal of proteins. In this thesis, several popular and widely used normalization methods representing different normalization strategies (linear regression normalization, local regression normalization, variance stabilizing normalization, quantile normalization, median central tendency normalization, and variants of some of these methods) are compared and evaluated with a benchmark spike-in proteomics data set. The normalization methods are evaluated in several ways. Their performance is assessed qualitatively and quantitatively on a global scale and in pairwise comparisons of sample groups. In addition, it is investigated whether performing the normalization globally on the whole data set or pairwise for the comparison pairs examined affects how well a method normalizes the data and preserves the true differential expression signal. Both major and minor differences in the performance of the different normalization methods were found. The way in which the normalization was performed (global normalization of the whole data set or pairwise normalization of the comparison pair) also affected the performance of some of the methods in pairwise comparisons, and differences among variants of the same methods were observed.
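
To make two of the strategies named above concrete, the following is a minimal sketch, not the thesis's own code, of median central tendency normalization and quantile normalization applied to a matrix of log-intensities with proteins in rows and samples in columns; the toy data are hypothetical.

```python
import numpy as np

def median_normalize(X):
    """Median central tendency normalization: shift each sample (column) so its
    median log-intensity matches the mean of the column medians."""
    col_medians = np.median(X, axis=0)
    return X - col_medians + np.mean(col_medians)

def quantile_normalize(X):
    """Quantile normalization: force every sample (column) to share the same
    distribution, namely the mean of the sorted columns (ties ignored)."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank of each value within its column
    reference = np.sort(X, axis=0).mean(axis=1)         # mean distribution across samples
    return reference[ranks]

# Hypothetical log2-intensity matrix: 6 proteins x 3 samples; sample 2 has a global shift.
X = np.array([[10.1, 10.9, 10.2],
              [12.3, 13.2, 12.4],
              [ 9.8, 10.5,  9.7],
              [11.0, 11.8, 11.1],
              [13.5, 14.4, 13.6],
              [ 8.9,  9.6,  8.8]])
print(np.round(median_normalize(X), 2))
print(np.round(quantile_normalize(X), 2))
```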