70 results for TECHNOLOGICAL ADVANCES
at Université de Lausanne, Switzerland
Abstract:
Human genetics has progressed at an unprecedented pace during the past 10 years. DNA microarrays currently allow screening of the entire human genome with a high level of coverage, and we are now entering the era of high-throughput sequencing. These remarkable technical advances are influencing the way medical research is conducted and have boosted our understanding of the structure of the human genome as well as of disease biology. In this context, it is crucial for clinicians to understand the main concepts and limitations of modern genetics. This review will describe key concepts in genetics, including the different types of genetic markers in the human genome; review current methods to detect DNA variation; describe major online public databases in genetics; explain key concepts in statistical genetics; and finally present commonly used study designs in clinical and epidemiological research. This review therefore concentrates on the analysis of human genetic variation.
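As a hedged illustration of one of the statistical-genetics concepts such a review covers, the sketch below runs a basic allelic association test for a single SNP in a case-control design; the allele counts are invented for illustration and are not taken from the article.

    # Minimal sketch (illustrative only; counts are invented, not from the article):
    # allelic association test for one SNP, using an odds ratio and a Pearson
    # chi-square statistic on the 2x2 allele table.
    import math

    # Hypothetical allele counts (risk allele A vs. alternative allele a).
    cases_A, cases_a = 620, 380
    controls_A, controls_a = 540, 460

    # Allelic odds ratio.
    odds_ratio = (cases_A * controls_a) / (cases_a * controls_A)

    # Pearson chi-square statistic with 1 degree of freedom.
    n = cases_A + cases_a + controls_A + controls_a
    row_cases, row_controls = cases_A + cases_a, controls_A + controls_a
    col_A, col_a = cases_A + controls_A, cases_a + controls_a
    cells = [(cases_A, row_cases, col_A), (cases_a, row_cases, col_a),
             (controls_A, row_controls, col_A), (controls_a, row_controls, col_a)]
    chi2 = sum((obs - rt * ct / n) ** 2 / (rt * ct / n) for obs, rt, ct in cells)
    p_value = math.erfc(math.sqrt(chi2 / 2.0))  # upper tail of chi-square(1 df)

    print(f"odds ratio = {odds_ratio:.2f}, chi-square = {chi2:.2f}, p = {p_value:.1e}")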
Abstract:
Recent technological advances in remote sensing have enabled investigation of the morphodynamics and hydrodynamics of large rivers. However, measuring topography and flow in these very large rivers is time consuming and thus often constrains the spatial resolution and reach-length scales that can be monitored. Similar constraints exist for computational fluid dynamics (CFD) studies of large rivers, requiring maximization of mesh- or grid-cell dimensions and implying a reduction in the representation of bedform-roughness elements that are of the order of a model grid cell or less, even if they are represented in available topographic data. These "subgrid" elements must be parameterized, and this paper applies and considers the impact of roughness-length treatments that include the effect of bed roughness due to "unmeasured" topography. CFD predictions were found to be sensitive to the roughness-length specification. Model optimization was based on acoustic Doppler current profiler measurements and estimates of the water surface slope for a variety of roughness lengths. This proved difficult, as the metrics used to assess optimal model performance diverged due to the effects of large bedforms that are not well parameterized in roughness-length treatments. However, the general spatial flow patterns are effectively predicted by the model. Changes in roughness length were shown to have a major impact upon flow routing at the channel scale. The results also indicate an absence of secondary flow circulation cells in the reach studied, and suggest that simpler two-dimensional models may have great utility in the investigation of flow within large rivers. Citation: Sandbach, S. D., et al. (2012), Application of a roughness-length representation to parameterize energy loss in 3-D numerical simulations of large rivers, Water Resour. Res., 48, W12501, doi:10.1029/2011WR011284.
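For readers unfamiliar with roughness-length treatments, the sketch below shows the standard log-law velocity profile in which the roughness length z0 enters, with z0 derived from a Nikuradse-type relation z0 = ks/30; the numbers are hypothetical, and this is not the paper's specific parameterization of "unmeasured" topography.

    # Minimal sketch (hypothetical values; not the paper's exact parameterization):
    # log-law velocity profile underlying roughness-length treatments, with the
    # roughness length z0 taken as ks/30, where ks stands in for the height of
    # unresolved ("unmeasured") bedform roughness.
    import numpy as np

    KAPPA = 0.41          # von Karman constant
    u_star = 0.08         # friction velocity (m/s), hypothetical value
    ks = 0.3              # effective roughness height (m), hypothetical value
    z0 = ks / 30.0        # roughness length (m), Nikuradse-type relation

    z = np.linspace(2 * z0, 5.0, 50)        # heights above the bed (m)
    u = (u_star / KAPPA) * np.log(z / z0)   # log-law velocity profile (m/s)

    print(f"z0 = {z0:.4f} m, velocity at {z[-1]:.1f} m above the bed ~ {u[-1]:.2f} m/s")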
Abstract:
The lymphatic vascular system, the body's second vascular system in vertebrates, has emerged in recent years as a crucial player in normal and pathological processes. It participates in the maintenance of normal tissue fluid balance, in the immune functions of cellular and antigen trafficking, and in the absorption of fatty acids and lipid-soluble vitamins in the gut. Recent scientific discoveries have highlighted the role of the lymphatic system in a number of pathological conditions, including lymphedema, inflammatory diseases, and tumor metastasis. The development of genetically modified animal models and the identification of lymphatic endothelial-specific markers and regulators, coupled with technological advances such as high-resolution imaging and genome-wide approaches, have been instrumental in understanding the major steps controlling growth and remodeling of lymphatic vessels. This review highlights recent insights and developments in the field of lymphatic vascular biology.
Abstract:
While knowledge about standardization of skin protection against ultraviolet radiation (UVR) has progressed over the past few decades, there is no uniform and generally accepted standardized measurement for UV eye protection. The literature provides solid evidence that UV can induce considerable damage to structures of the eye. As well as damaging the eyelids and periorbital skin, chronic UV exposure may also affect the conjunctiva and lens. Clinically, this damage can manifest as skin cancer and premature skin ageing as well as the development of pterygia and premature cortical cataracts. Modern eye protection, used daily, offers the opportunity to prevent these adverse sequelae of lifelong UV exposure. A standardized, reliable and comprehensive label for consumers and professionals is currently lacking. In this review we (i) summarize the existing literature about UV radiation-induced damage to the eye and surrounding skin; (ii) review the recent technological advances in UV protection by means of lenses; (iii) review the definition of the Eye-Sun Protection Factor (E-SPF®), which describes the intrinsic UV protection properties of lenses and lens coating materials based on their capacity to absorb or reflect UV radiation; and (iv) propose a strategy for establishing the biological relevance of the E-SPF.
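As a hedged illustration only, the sketch below computes a generic, spectrally weighted UV protection factor from a lens transmittance curve; this simplified, transmittance-only definition is an assumption made for illustration and is not the E-SPF formula described in the review, which also accounts for UV reflected off the back surface of the lens.

    # Minimal sketch (illustrative assumption, not the E-SPF definition): a generic
    # spectrally weighted UV protection factor, the ratio of weighted solar UV
    # irradiance reaching the eye without and with the lens. All spectra are
    # hypothetical placeholders sampled on an even wavelength grid.
    import numpy as np

    wavelengths = np.arange(280, 381, 5)                                 # nm, UVB + UVA
    solar = np.interp(wavelengths, [280, 315, 380], [0.01, 0.4, 1.0])    # hypothetical relative solar UV spectrum
    action = np.interp(wavelengths, [280, 315, 380], [1.0, 0.1, 0.01])   # hypothetical biological weighting function
    transmittance = np.full(wavelengths.shape, 0.02)                     # hypothetical lens UV transmittance (2%)

    weighted = solar * action
    # Evenly spaced samples, so simple sums stand in for the spectral integrals.
    protection_factor = weighted.sum() / (weighted * transmittance).sum()
    print(f"illustrative UV protection factor ~ {protection_factor:.0f}")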
Abstract:
The aim of this work is to compare two methods used for determining the proper shielding of computed tomography (CT) rooms, taking into account recent technological advances in CT scanners. The approaches of the German Institute for Standardisation and the US National Council on Radiation Protection and Measurements were compared, and a series of radiation measurements were performed in several CT rooms at the Lausanne University Hospital. The following three-step procedure is proposed for assuring sufficient shielding of rooms hosting new CT units with spiral mode acquisition and various X-ray beam collimation widths: (1) calculate the ambient equivalent dose for a representative average weekly dose-length product at the position where shielding is required; (2) from the maximum permissible weekly dose at the location of interest, calculate the transmission factor F required to ensure proper shielding; and (3) convert the transmission factor into a thickness of lead shielding. A similar approach could be adopted when designing shielding for fluoroscopy rooms, where the basic quantity would be the dose-area product instead of the tube-current workload (milliampere-minutes).
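The sketch below walks through the stated three-step logic; all numerical coefficients (scatter coefficient, weekly workload, dose limit, and the Archer-type transmission fit parameters for lead) are hypothetical placeholders rather than values taken from the DIN or NCRP documents.

    # Minimal sketch of the three-step shielding logic (all coefficients are
    # hypothetical placeholders, not values from the DIN or NCRP reports).
    import math

    # Step 1: unshielded weekly ambient equivalent dose at the point of interest,
    # scaled from a representative weekly dose-length product (DLP) and the
    # distance d from the scanner (inverse-square assumption).
    k_scatter = 3.0e-4      # mSv per (mGy*cm) at 1 m -- hypothetical scatter coefficient
    dlp_weekly = 50000.0    # mGy*cm per week -- hypothetical workload
    d = 3.0                 # distance to the occupied location (m)
    h_weekly = k_scatter * dlp_weekly / d**2   # mSv/week, unshielded

    # Step 2: required transmission factor F from the maximum permissible weekly dose.
    p_weekly = 0.02         # mSv/week -- hypothetical design limit for the area
    F = min(1.0, p_weekly / h_weekly)

    # Step 3: convert F into a lead thickness with an Archer-type broad-beam
    # transmission model, B(x) = [(1 + b/a) * exp(a*g*x) - b/a]**(-1/g), inverted for x.
    a, b, g = 2.25, 5.73, 0.55   # hypothetical fit parameters for lead
    x_lead = (1.0 / (a * g)) * math.log((F**(-g) + b / a) / (1.0 + b / a))

    print(f"unshielded dose {h_weekly:.2f} mSv/week, F = {F:.3f}, required lead ~ {x_lead:.2f} mm")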
Abstract:
In recent years, analysis of the genomes of many organisms has received increasing international attention. The bulk of the effort to date has centred on the Human Genome Project and analysis of model organisms such as yeast, Drosophila and Caenorhabditis elegans. More recently, the revolution in genome sequencing and gene identification has begun to have an impact on infectious disease organisms. Initially, much of the effort was concentrated on prokaryotes, but small eukaryotic genomes, including the protozoan parasites Plasmodium, Toxoplasma and the trypanosomatids (Leishmania, Trypanosoma brucei and T. cruzi), as well as some multicellular organisms, such as Brugia and Schistosoma, are benefiting from the technological advances of the genome era. These advances promise a radical new approach to the development of novel diagnostic tools, chemotherapeutic targets and vaccines for infectious disease organisms, as well as to the more detailed analysis of cell biology and function. Several networks or consortia linking laboratories around the world have been established to support these parasite genome projects[1] (for more information, see http://www.ebi.ac.uk/parasites/paratable.html). Five of these networks were supported by an initiative launched in 1994 by the Special Programme for Research and Training in Tropical Diseases (TDR) of the WHO[2, 3, 4, 5, 6]. The Leishmania Genome Network (LGN) is one of these[3]. Its activities are reported at http://www.ebi.ac.uk/parasites/leish.html, and its current aim is to map and sequence the genome of Leishmania by the year 2002. All the mapping, hybridization and sequence data are also publicly available from LeishDB, an AceDB-based genome database (http://www.ebi.ac.uk/parasites/LGN/leissssoft.html).
Abstract:
Extracellular calcium participates in several key physiological functions, such as the control of blood coagulation, bone calcification and muscle contraction. Calcium homeostasis in humans is regulated in part by genetic factors, as illustrated by rare monogenic diseases characterized by hypo- or hypercalcaemia. Both serum calcium and urinary calcium excretion are heritable continuous traits in humans. Serum calcium levels are tightly regulated by two main hormonal systems, i.e. parathyroid hormone and vitamin D, which are themselves also influenced by genetic factors. Recent technological advances in molecular biology allow for the screening of the human genome at an unprecedented level of detail and using hypothesis-free approaches, such as genome-wide association studies (GWAS). GWAS have identified novel loci for calcium-related phenotypes (i.e. serum calcium and 25-OH vitamin D) that shed new light on the biology of calcium in humans. The substantial overlap (e.g. CYP24A1, CASR, GATA3, CYP2R1) between genes involved in rare monogenic diseases and genes located within loci identified in GWAS suggests a genetic and phenotypic continuum between monogenic diseases of calcium homeostasis and slight disturbances of calcium homeostasis in the general population. Future studies using whole-exome and whole-genome sequencing will further advance our understanding of the genetic architecture of calcium homeostasis in humans. These findings will likely provide new insight into the complex mechanisms involved in calcium homeostasis and hopefully lead to novel preventive and therapeutic approaches. Keywords: calcium, monogenic, genome-wide association studies, genetics.
Abstract:
Parallel tracks for clinical scientists, basic scientists, and pediatric imagers were the novel approach taken for the highly successful 8th Annual Scientific Sessions of the Society for Cardiovascular Magnetic Resonance, held in San Francisco, California, January 21 to 23, 2005. Attendees were immersed in information on the latest scientific advances in cardiovascular magnetic resonance (CMR), from mice to man, and on technological advances in systems with field strengths from 0.5 T to 11.7 T. State-of-the-art applications were reviewed, spanning a wide range from molecular imaging to predicting outcome with CMR in large patient populations.
Abstract:
The universal standard goniometer is an essential tool for measuring joint range of motion (ROM). In this era of technological advances and increasing smartphone use, new measurement tools have appeared in the form of dedicated smartphone applications. This article compares the iOS application "Knee Goniometer" with the universal standard goniometer for assessing knee ROM. To our knowledge, this is the first study to use a goniometer application in a clinical context. The purpose of this study is to determine whether this application could be used in clinical practice.
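Agreement between two measurement tools of this kind is commonly summarized with Bland-Altman bias and limits of agreement; the sketch below illustrates that generic approach on invented paired readings and is not necessarily the analysis used in this study.

    # Minimal sketch (illustrative only; invented data, not necessarily the study's
    # analysis): Bland-Altman bias and 95% limits of agreement for paired knee ROM
    # readings from the smartphone application and the universal goniometer.
    import numpy as np

    app_deg = np.array([92.0, 110.0, 135.0, 78.0, 120.0, 101.0])    # hypothetical app readings (degrees)
    gonio_deg = np.array([90.0, 112.0, 132.0, 80.0, 118.0, 103.0])  # hypothetical goniometer readings (degrees)

    diff = app_deg - gonio_deg
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd   # 95% limits of agreement

    print(f"bias = {bias:.1f} deg, limits of agreement = {loa_low:.1f} to {loa_high:.1f} deg")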
Abstract:
The energy demands of the brain are high: they account for at least 20% of the body's energy consumption. Evolutionary studies indicate that the emergence of higher cognitive functions in humans is associated with increased glucose utilization and increased expression of energy metabolism genes. Functional brain imaging techniques such as fMRI and PET, which are widely used in human neuroscience studies, detect signals that monitor energy delivery and use in register with neuronal activity. Recent technological advances in metabolic studies with cellular resolution have afforded decisive insights into the cellular and molecular bases of the coupling between neuronal activity and energy metabolism, and point to a key role for neuron-astrocyte metabolic interactions. This article reviews some of the most salient features emerging from recent studies and aims at providing an integration of brain energy metabolism across resolution scales.
Abstract:
Since the first implantation of an endograft in 1991, endovascular aneurysm repair (EVAR) has rapidly gained recognition. Historical trials showed lower early mortality rates, but these results were not maintained beyond 4 years. Despite newer-generation devices, higher rates of reintervention are associated with EVAR during follow-up. Therefore, the best therapeutic decision relies on many parameters that the physician has to take into consideration. Patients' preferences and characteristics are important, especially age and life expectancy, besides health status. Aneurysmal anatomical conditions probably remain the most predictive factor and should be carefully evaluated to offer the best treatment. Unfavorable anatomy has been observed to be associated with more complications, especially endoleaks, leading to more reinterventions and a higher risk of late mortality. Nevertheless, technological advances have led surgeons to move beyond the established barriers. Thus, more endografts are implanted outside the instructions for use, despite excellent results after open repair, especially in low-risk patients. When debating abdominal aortic aneurysm (AAA) repair, some other crucial points should be analysed. It has been shown that strict surveillance is mandatory after EVAR to offer durable results and prevent late rupture. Such a program is associated with additional costs and with an increased risk of radiation exposure. Moreover, a risk of loss of renal function exists when repetitive imaging and secondary procedures are required. The aim of this article is to review the data on AAA and its treatment in order to establish selection criteria for deciding between open and endovascular repair.
Abstract:
In recent years, technological advances have allowed manufacturers to implement dual-energy computed tomography (DECT) on clinical scanners. With its unique ability to differentiate basis materials by their atomic number, DECT has opened new perspectives in imaging. DECT has been used successfully in musculoskeletal imaging, with applications ranging from the detection, characterization, and quantification of crystal and iron deposits to the simulation of noncalcium images (improving the visualization of bone marrow lesions) or noniodine images. Furthermore, the data acquired with DECT can be postprocessed to generate monoenergetic images of varying kiloelectron volts, providing new methods for image contrast optimization as well as metal artifact reduction. The first part of this article reviews the basic principles and technical aspects of DECT, including radiation dose considerations. The second part focuses on applications of DECT to musculoskeletal imaging, including gout and other crystal-induced arthropathies, virtual noncalcium images for the study of bone marrow lesions, the study of collagenous structures, applications in computed tomography arthrography, as well as the detection of hemosiderin and metal particles.
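As a simplified, hedged illustration of the two ideas above, the sketch below performs an image-domain two-material decomposition from low- and high-kV attenuation values and then synthesizes a virtual monoenergetic value; the attenuation coefficients are made up, and clinical DECT relies on calibrated, vendor-specific implementations.

    # Minimal sketch (simplified image-domain illustration with made-up attenuation
    # coefficients; not a clinical, calibrated implementation): solve for two
    # basis-material fractions per voxel from low- and high-kV measurements, then
    # synthesize a virtual monoenergetic value at a target energy.
    import numpy as np

    # Hypothetical linear attenuation coefficients (1/cm) of two basis materials
    # at the low-kV spectrum, the high-kV spectrum, and a target energy (e.g. 70 keV).
    mu_basis_low = np.array([0.45, 0.25])    # [material 1, material 2] at low kV
    mu_basis_high = np.array([0.30, 0.22])   # [material 1, material 2] at high kV
    mu_basis_70keV = np.array([0.35, 0.23])  # [material 1, material 2] at 70 keV

    # Measured attenuation of one voxel at low and high kV (hypothetical).
    mu_voxel = np.array([0.33, 0.25])

    # Basis decomposition: solve the 2x2 linear system for the material fractions.
    A = np.array([mu_basis_low, mu_basis_high])   # rows: low kV, high kV; columns: materials
    fractions = np.linalg.solve(A, mu_voxel)

    # Virtual monoenergetic value at 70 keV as the same linear combination.
    mu_70keV = fractions @ mu_basis_70keV
    print(f"material fractions = {fractions}, attenuation at 70 keV = {mu_70keV:.3f} 1/cm")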
Abstract:
In recent years, technological advances have allowed manufacturers to implement dual-energy computed tomography (DECT) on clinical scanners. With its unique ability to differentiate basis materials by their atomic number, DECT has opened new perspectives in imaging. DECT has been successfully used in musculoskeletal imaging, with applications ranging from the detection, characterization, and quantification of crystal and iron deposits to the simulation of noncalcium images (improving the visualization of bone marrow lesions) or noniodine images. Furthermore, the data acquired with DECT can be postprocessed to generate monoenergetic images of varying kiloelectron volts, providing new methods for image contrast optimization as well as metal artifact reduction. The first part of this article reviews the basic principles and technical aspects of DECT, including radiation dose considerations. The second part focuses on applications of DECT to musculoskeletal imaging, including gout and other crystal-induced arthropathies, virtual noncalcium images for the study of bone marrow lesions, the study of collagenous structures, applications in computed tomography arthrography, as well as the detection of hemosiderin and metal particles.
Abstract:
BACKGROUND: Living in a multisensory world entails the continuous sensory processing of environmental information in order to enact appropriate motor routines. The interaction between our body and our brain is the crucial factor in achieving such sensorimotor integration. Several clinical conditions dramatically affect this constant body-brain exchange, but the latest developments in biomedical engineering provide promising solutions for overcoming this communication breakdown. NEW METHOD: The latest technological developments have succeeded in transforming neuronal electrical activity into computational input for robotic devices, giving birth to the era of so-called brain-machine interfaces. Combining rehabilitation robotics and experimental neuroscience, the introduction of brain-machine interfaces into clinical protocols has provided a technological solution for bypassing the neural disconnection and restoring sensorimotor function. RESULTS: Based on these advances, the recovery of sensorimotor functionality is progressively becoming a concrete reality. However, despite the success of several recent techniques, some open issues still need to be addressed. COMPARISON WITH EXISTING METHOD(S): Typical interventions for sensorimotor deficits include pharmaceutical treatments and manual/robotic assistance in passive movements. These procedures achieve symptom relief, but their applicability to more severe disconnection pathologies (e.g. spinal cord injury or amputation) is limited. CONCLUSIONS: Here we review how state-of-the-art solutions in biomedical engineering are continuously raising expectations in sensorimotor rehabilitation, as well as the current challenges, especially with regard to the translation of signals from brain-machine interfaces into sensory feedback and the incorporation of brain-machine interfaces into daily activities.
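To make concrete what transforming neuronal activity into computational input for a robotic device can involve, the sketch below fits a simple least-squares linear decoder from simulated firing rates to a 2-D velocity command; it is a generic, hypothetical illustration rather than any specific interface from the literature reviewed.

    # Minimal sketch (generic, hypothetical illustration; not a specific interface
    # from the reviewed literature): fit a linear decoder that maps binned firing
    # rates of simulated neurons to a 2-D effector velocity, then decode a new bin.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples, n_neurons = 500, 32

    # Simulated training data: each neuron's rate is a noisy linear function
    # of the true 2-D velocity (random tuning weights).
    true_velocity = rng.normal(size=(n_samples, 2))
    tuning = rng.normal(size=(2, n_neurons))
    rates = true_velocity @ tuning + 0.1 * rng.normal(size=(n_samples, n_neurons))

    # Least-squares decoder: velocity ~ rates @ W
    W, *_ = np.linalg.lstsq(rates, true_velocity, rcond=None)

    # Decode a new bin of neural activity into a velocity command for a robotic effector.
    new_rates = rng.normal(size=(1, 2)) @ tuning
    decoded_velocity = new_rates @ W
    print("decoded velocity command:", decoded_velocity.ravel())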