41 results for Hardware and Architecture
at Université de Lausanne, Switzerland
Abstract:
Coronary magnetic resonance angiography (MRA) is a technique aimed at establishing a noninvasive test for the assessment of significant coronary stenoses. Several technical and physiological constraints have hampered the clinical success of coronary MRA and coronary vessel wall imaging. Recent advances in hardware and software allow for consistent visualization of the proximal and mid portions of the native coronary arteries. Current research focuses on the use of intravascular MR contrast agents and black-blood coronary angiography. One common goal is to create a noninvasive test that would allow screening for major proximal and mid coronary artery disease. These novel approaches would represent a major step forward in diagnostic cardiology.
Abstract:
Odorous chemicals are detected in the mouse main olfactory epithelium (MOE) by about 1100 types of olfactory receptors (ORs) expressed by olfactory sensory neurons (OSNs). Each mature OSN is thought to express only one allele of a single OR gene. Major impediments to understanding the transcriptional control of OR gene expression are the lack of proper characterization of OR transcription start sites (TSSs) and promoters, and of regulatory transcripts at OR loci. We applied the nanoCAGE technology to profile the transcriptome and the active promoters in the MOE. nanoCAGE analysis revealed the map and architecture of promoters for 87.5% of the mouse OR genes, as well as the expression of many novel noncoding RNAs, including antisense transcripts. We identified candidate transcription factors for OR gene expression and, among them, confirmed by chromatin immunoprecipitation the binding of TBP, EBF1 (OLF1) and MEF2A to OR promoters. Finally, we showed that a short genomic fragment flanking the major TSS of the OR gene Olfr160 (M72) can drive OSN-specific expression in transgenic mice.
Abstract:
Background: To report a single-center experience in 19 patients (pts) with anal canal cancer treated with helical tomotherapy (HT) and concurrent chemotherapy, and to compare the dosimetric results with fixed-field intensity-modulated radiotherapy (IMRT) and 3D conformal radiotherapy (3D-RT). Materials and Methods: Between 2007 and 2008, 19 consecutive pts were treated with HT and concurrent chemotherapy for anal canal cancer. Median age was 59 years (range, 38-83), and the female/male ratio was 14/5. The majority of pts had T2 or T3 tumours (68.4%), and 52.6% had positive lymph nodes. In all 19 pts, irradiation of the pelvic and inguinal nodes and the tumour was given using HT up to a median dose of 36 Gy (1.8 Gy/fraction), followed by a 1-week gap. A boost dose of 23.4 Gy (1.8 Gy/fraction) was delivered to the tumour and involved nodes using 3D-RT (n = 12), HT (n = 6), or IMRT (n = 1). A simultaneous integrated boost was used in none of the pts. All but one patient, with a T1N0 tumour, received concomitant mitomycin/5-fluorouracil (n = 12) or mitomycin/capecitabine (n = 7) chemotherapy. Toxicity was scored according to the Common Terminology Criteria for Adverse Events (NCI-CTCAE v3.0). HT plans and treatments were generated using TomoTherapy, Inc., software and hardware; 3D-RT and IMRT boost plans were generated with the CMS treatment planning system (TPS), using 6-18 MV photons from a Siemens Primus accelerator. For dosimetric comparison, computed tomography data sets of 10 pts were imported into the TPS, and 3D-RT and 5-field step-and-shoot IMRT plans were generated for each case. Plans were optimized with the aim of assessing organ-at-risk (OAR) and healthy-tissue sparing while enforcing highly conformal target coverage, and were evaluated by dose-volume histograms (DVH) of planning target volumes (PTV) and OARs. Results: With a median follow-up of 13 months (range, 3-18), all pts are alive and well, except one patient who developed a local recurrence at 12 months. No patient developed grade 3 or higher acute toxicity. No unplanned treatment interruption was necessary because of toxicity. With 360 degrees of freedom in beam projection, HT showed an advantage over 3D-RT or IMRT plans in terms of dose conformity around the PTV, and dose gradients were steeper outside the PTV, resulting in reduced doses to OARs. Using HT, acute toxicity was acceptable and seemed better than historical standards. Conclusion: We conclude that HT combined with concurrent chemotherapy for anal canal cancer is effective and tolerable. Compared with 3D-RT or 5-field IMRT, it achieves better conformity around the PTV and improved OAR sparing.
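As a side note on the fractionation arithmetic implied above (a minimal sketch using only the doses quoted in the abstract; the variable names are illustrative), the schedule works out to 20 + 13 fractions and 59.4 Gy in total:

```python
# Fractionation arithmetic for the schedule quoted in the abstract.
# Illustrative only; all dose values come from the text above.
dose_per_fraction = 1.8            # Gy per fraction
pelvic_phase_dose = 36.0           # Gy, delivered with HT
boost_dose = 23.4                  # Gy, delivered with 3D-RT, HT, or IMRT

pelvic_fractions = pelvic_phase_dose / dose_per_fraction   # -> 20.0
boost_fractions = boost_dose / dose_per_fraction           # -> 13.0
total_dose = pelvic_phase_dose + boost_dose                # -> 59.4 Gy

print(f"{pelvic_fractions:.0f} + {boost_fractions:.0f} fractions, "
      f"{total_dose:.1f} Gy total")
```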
Abstract:
This study compared bone density and architecture in three groups of women: young women with anorexia nervosa (AN), an age-matched control group of young women, and healthy late postmenopausal women. Three-dimensional high-resolution peripheral quantitative computed tomography (HR-pQCT) at the ultradistal radius, a technology providing measures of cortical and trabecular bone density and microarchitecture, was performed in the three cohorts. Thirty-six women with AN aged 18-30 years (mean duration of AN: 5.8 years), 83 healthy late postmenopausal women aged 70-81 years, and 30 age-matched healthy young women were assessed. The overall cortical and trabecular bone density (D100), the absolute thickness of the cortical bone (CTh), and the absolute number of trabeculae per area (TbN) were significantly lower in AN patients than in healthy young women. TbN was similar in AN and postmenopausal women, but significantly lower in both than in healthy young women. The comparison between AN patients and postmenopausal women is of interest because the latter reached peak bone mass around the middle of the fertile age span, whereas the former usually lose bone before reaching optimal bone density and structure. This study shows that bone mineral density and bone compacta thickness in AN are lower than in controls but still higher than in postmenopause. Bone compacta density in AN is similar to that in controls. However, the inner bone structure in AN is degraded to a similar extent as in postmenopause. This last finding is particularly concerning.
Abstract:
Drosophila melanogaster is a model organism instrumental in numerous biological studies. The compound eye of this insect consists of some eight hundred individual ommatidia or facets, ca. 15 µm in cross-section. Each ommatidium contains eighteen cells, including four cone cells secreting the lens material (cornea). High-resolution imaging of the cornea of different insects has demonstrated that each lens is covered by nipple arrays: small outgrowths of ca. 200 nm in diameter. Here we utilize, for the first time, atomic force microscopy (AFM) to investigate nipple arrays of the Drosophila lens, achieving an unprecedented visualization of the architecture of these nanostructures. We find by Fourier analysis that the nipple arrays of Drosophila are disordered, and that their seemingly ordered appearance is a consequence of dense packing of the nipples. In contrast, Fourier analysis confirms the visibly ordered nature of the eye microstructures, the individual lenses. This is different in the frizzled mutants of Drosophila, where both Fourier analysis and optical imaging detect disorder in lens packing. AFM reveals intercalations of the lens material between individual lenses in frizzled mutants, providing an explanation for this disorder. In contrast, the nanostructures of the mutant lens show the same organization as in wild-type flies. Thus, frizzled mutants display abnormal organization of the corneal micro-, but not nano-structures. At the same time, the nipples of the mutant flies are shorter than those of the wild type. We also analyze the corneal surface of glossy-appearing eyes overexpressing Wingless, the lipoprotein ligand of Frizzled receptors, and find catastrophic aberrations in the nipple arrays, providing experimental evidence in favor of a major anti-reflective function of these insect eye nanostructures. The combination of an easily tractable genetic model organism and robust AFM analysis represents a novel methodology for analyzing the development and architecture of these surface formations.
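As an illustration of the kind of Fourier analysis invoked above (the study's actual image-processing pipeline is not described in the abstract, so everything here is a hedged sketch on synthetic data), a radially averaged 2D power spectrum separates ordered from densely packed but disordered arrays: a lattice yields sharp spectral peaks, while a disordered packing yields only a diffuse ring.

```python
import numpy as np

def radial_power_spectrum(image: np.ndarray) -> np.ndarray:
    """Radially averaged 2D power spectrum: sharp peaks indicate an
    ordered (lattice-like) array, a diffuse ring indicates dense but
    disordered packing."""
    power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    cy, cx = power.shape[0] // 2, power.shape[1] // 2
    y, x = np.indices(power.shape)
    r = np.hypot(y - cy, x - cx).astype(int)
    counts = np.maximum(np.bincount(r.ravel()), 1)
    return np.bincount(r.ravel(), weights=power.ravel()) / counts

# Synthetic test pattern: a 16-pixel lattice of point "nipples" with
# random jitter; remove the jitter to see sharp lattice peaks instead.
rng = np.random.default_rng(0)
image = np.zeros((256, 256))
ys, xs = np.mgrid[8:256:16, 8:256:16]
jitter = rng.integers(-6, 7, size=(2, *ys.shape))
image[(ys + jitter[0]) % 256, (xs + jitter[1]) % 256] = 1.0
profile = radial_power_spectrum(image)
print("dominant spatial-frequency bin:", profile[1:].argmax() + 1)
```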
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications owing to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200'000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, the RISC processor architecture business is a separate industry from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet market was formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.
Industrial automation incumbents continue to supply systems based on vertically integrated stacks of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately, the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
The complex structural organization of the white matter of the brain can be depicted in vivo in great detail with advanced diffusion magnetic resonance (MR) imaging schemes. Diffusion MR imaging techniques are increasingly varied, from the simplest and most commonly used technique, the mapping of apparent diffusion coefficient (ADC) values, to the more complex, such as diffusion tensor imaging, q-ball imaging, diffusion spectrum imaging, and tractography. The type of structural information obtained differs according to the technique used. To fully understand how diffusion MR imaging works, it is helpful to be familiar with the physical principles of water diffusion in the brain and the conceptual basis of each imaging technique. Knowledge of the technique-specific requirements with regard to hardware and acquisition time, as well as the advantages, limitations, and potential interpretation pitfalls of each technique, is especially useful.
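To make the simplest of these techniques concrete, ADC mapping can be sketched from the standard monoexponential signal model S_b = S_0·exp(-b·ADC) using two acquisitions with different b-values; the snippet below is a minimal illustration on synthetic data, not a description of any particular scanner software.

```python
import numpy as np

def adc_map(s0: np.ndarray, sb: np.ndarray, b: float) -> np.ndarray:
    """Voxel-wise ADC from the monoexponential model
    S_b = S_0 * exp(-b * ADC)  =>  ADC = ln(S_0 / S_b) / b."""
    eps = 1e-12                       # guard against log(0) in background
    return np.log(np.maximum(s0, eps) / np.maximum(sb, eps)) / b

# Synthetic example: b = 1000 s/mm^2, true ADC = 0.8e-3 mm^2/s.
b = 1000.0
s0 = np.full((4, 4), 1000.0)          # unweighted (b = 0) image
sb = s0 * np.exp(-b * 0.8e-3)         # diffusion-weighted image
print(adc_map(s0, sb, b))             # recovers ~0.0008 everywhere
```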
Abstract:
In the case of such a very special building project, the crucial stake for sustainable development is the fact that space systems are extreme cases of environmental constraints. Indeed, they constitute an interesting model, as an analogy can be made between the utmost conditions on Mars and some of the possible extreme ones that Earth might soon face. The didactic objective of the project is to use the context of a building on Mars to teach an approach that raises students' awareness of designing and planning all steps of a building in a sustainable way, i.e. to build, with the available resources, living spaces that satisfy human needs and leave the external environment as intact as possible. The paper presents the approach and the feedback from this student project, more specifically the "ENAC Learning Unit", which involved 17 students from the environmental, civil engineering and architecture sections of EPFL. It also involved professors from all three domains, as well as aerospace and Mars specialists, who gave seminars during the course of the semester. The students were separated into groups, and the project consisted of two phases: 1) analysis of the context and resources, and 2) project design and critique. The organisational, technical and pedagogical aspects of the experience are all presented. The outcome was very positive, with students experiencing for the first time multidisciplinary work and the iterative process of design under multiple constraints.
Abstract:
Current limitations of coronary magnetic resonance angiography (MRA) include a suboptimal signal-to-noise ratio (SNR), which limits spatial resolution and the ability to visualize distal and branch vessel coronary segments. Improved SNR is expected at higher field strengths, which may provide improved spatial resolution. However, a number of potential adverse effects on image quality have been reported at higher field strengths. The limited availability of high-field systems equipped with cardiac-specific hardware and software has previously precluded successful in vivo human high-field coronary MRA data acquisition. In the present study we investigated the feasibility of human coronary MRA at 3.0 T in vivo. The first results obtained in nine healthy adult subjects are presented.
Abstract:
Ubiquitous computing is the emerging trend in computing systems. Based on this observation, this thesis proposes an analysis of the hardware and environmental constraints that govern pervasive platforms. These constraints have a strong impact on the programming of such platforms; solutions are therefore proposed to facilitate this programming, both at the platform level and at the level of each node. The first contribution presented in this document combines agent-oriented programming with the principles of bio-inspiration (phylogenesis, ontogenesis and epigenesis) to program pervasive platforms such as the PERvasive computing framework for modeling comPLEX virtually Unbounded Systems platform. The second contribution proposes a method for efficiently programming parallelizable applications on each computing node of this platform.
Abstract:
Following the success of the first round table in 2001, the Swiss Proteomic Society organized two additional special events during its last two meetings: a proteomic application exercise in 2002 and a round table in 2003. The main objective of such events is to bring together, around a challenging topic in mass spectrometry, two groups of specialists: those who develop and commercialize mass spectrometry equipment and software, and expert MS users in peptidomics and proteomics. The first round table (Geneva, 2001), entitled "Challenges in Mass Spectrometry", was supported by brief oral presentations that stressed critical questions in the field of MS development and applications (Stöcklin and Binz, Proteomics 2002, 2, 825-827): (i) direct analysis of complex biological samples; (ii) status and perspectives for MS investigations of noncovalent peptide-ligand interactions; (iii) whether it is more appropriate to have complementary instruments rather than one universal piece of equipment; (iv) standardization and improvement of MS signals for protein identification; (v) what the next generation of equipment will be; and finally (vi) how to keep MS hardware and software up to date and accessible to all. For the SPS'02 meeting (Lausanne, 2002), a full-session alternative event, the "Proteomic Application Exercise", was proposed. Two different samples were prepared and sent to the participants: 100 µg of snake venom (a complex mixture of peptides and proteins) and 10-20 µg of an almost pure recombinant polypeptide derived from the shrimp Penaeus vannamei carrying a heterogeneous post-translational modification (PTM). Among the 15 participants that received the samples blind, eight returned results, and most were asked to present them at the congress, emphasizing the strategy, manpower and instrumentation used (Binz et al., Proteomics 2003, 3, 1562-1566). It appeared that for the snake venom extract the quality of the results was not particularly dependent on the strategy used, as all approaches allowed the identification of a certain number of protein families. The genus of the snake was identified in most cases, but the species remained ambiguous. Surprisingly, the precise identification of the almost pure recombinant polypeptide turned out to be much more complicated than expected, as only one group reported the full sequence. Finally, the SPS'03 meeting reported here included a round table on the difficult and challenging task of "Quantification by Mass Spectrometry", a discussion sustained by four selected oral presentations on the use of stable isotopes, electrospray ionization versus matrix-assisted laser desorption/ionization approaches to quantify peptides and proteins in biological fluids, the handling of differential two-dimensional liquid chromatography tandem mass spectrometry data resulting from high-throughput experiments, and the quantitative analysis of PTMs. During these three events at the SPS meetings, the impressive quality and quantity of exchanges between the developers and providers of mass spectrometry equipment and software, expert users and the audience were a key element of the success of these fruitful events and have definitely paved the way for future round tables and challenging exercises at SPS meetings.
Abstract:
Imaging in neuroscience, clinical research and pharmaceutical trials often employs the 3D magnetisation-prepared rapid gradient-echo (MPRAGE) sequence to obtain structural T1-weighted images of the human brain with high spatial resolution. Typical research and clinical routine MPRAGE protocols with ~1 mm isotropic resolution require data acquisition times in the range of 5-10 min and often use only a moderate two-fold acceleration factor for parallel imaging. Recent advances in MRI hardware and acquisition methodology promise improved leverage of the MR signal and more benign artefact properties, in particular when employing increased acceleration factors in clinical routine and research. In this study, we examined four variants of a four-fold-accelerated MPRAGE protocol (2D-GRAPPA, CAIPIRINHA, CAIPIRINHA elliptical, and segmented MPRAGE) and compared clinical readings, basic image quality metrics (SNR, CNR), and automated brain tissue segmentation for morphological assessments of brain structures. The results were benchmarked against a widely used two-fold-accelerated 3T ADNI MPRAGE protocol that served as the reference in this study. 22 healthy subjects (aged 20-44 years) were imaged with all MPRAGE variants in a single session. An experienced reader rated all images as being of clinically useful quality. CAIPIRINHA MPRAGE scans were perceived, on average, to be of identical value for reading as the reference ADNI-2 protocol. SNR and CNR measurements exhibited the theoretically expected performance at the four-fold acceleration. The results of this study demonstrate that the four-fold-accelerated protocols introduce systematic biases in the segmentation results of some brain structures compared with the reference ADNI-2 protocol. Furthermore, the results suggest that the increased noise levels in the accelerated protocols play an important role in introducing these biases, at least under the present study conditions.
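For context on the "theoretically expected performance" mentioned above (a standard parallel-imaging relation, not a formula given in the abstract): SNR is expected to scale as SNR_R = SNR_full / (g·sqrt(R)) with acceleration factor R and coil geometry factor g ≥ 1, so four-fold acceleration ideally halves SNR. A minimal sketch:

```python
import math

def accelerated_snr(snr_full: float, r: int, g: float = 1.0) -> float:
    """Expected SNR under R-fold parallel-imaging acceleration:
    SNR_R = SNR_full / (g * sqrt(R)), with geometry factor g >= 1."""
    return snr_full / (g * math.sqrt(r))

# Ideal case (g = 1): the four-fold protocols halve SNR, while the
# two-fold reference protocol costs about 29% relative to R = 1.
print(accelerated_snr(100.0, 4))      # 50.0
print(accelerated_snr(100.0, 2))      # ~70.7
```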
Abstract:
Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities.