Abstract:
This thesis is a compilation of projects studying the sediment processes that recharge debris-flow channels. These works, conducted during my stay at the University of Lausanne, focus on the geological and morphological implications of torrent catchments in order to characterize debris supply, a fundamental element for predicting debris flows. Other aspects of sediment dynamics are also considered, e.g. the coupling between headwaters and torrent, as well as the development of modeling software that simulates sediment transfer in torrent systems. Sediment activity at the Manival, an active torrent system of the northern French Alps, was investigated using terrestrial laser scanning, supplemented with geostructural investigations and a survey of sediment transferred in the main torrent. A full year of sediment flux could be observed, which coincided with two debris flows and several bedload transport events. This study revealed that both debris flows were generated in the torrent and were preceded by recharge of material from the headwaters. Debris production occurred mostly during winter and early spring and was caused by large slope failures. Sediment transfers were more puzzling, occurring almost exclusively in early spring, subordinate to runoff conditions, and in autumn during long rainfall events. Intense summer rainstorms did not affect debris storage, which seems to depend on the stability of debris deposits. The morpho-geological implications for debris supply were evaluated using DEMs and field surveys. A slope angle-based classification of topography could characterize the mode of debris production and transfer. A slope stability analysis derived from the structures in the rock mass could assess susceptibility to failure. The modeled rockfall source areas included more than 97% of the recorded events, and the sediment budgets appeared to be correlated with the density of potential slope failures.
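The slope angle-based classification mentioned above can be sketched minimally: slope angle is derived from the DEM's elevation gradients and each cell is binned into a morphometric class. The class boundaries and class names below are hypothetical placeholders, not the thesis's actual classification scheme.

```python
import numpy as np

# Hypothetical slope classes (degrees); the thesis's actual boundaries
# and class definitions are not given in the abstract.
CLASSES = [
    (0, 25, "storage/deposition"),
    (25, 40, "transport/gully"),
    (40, 90, "debris production (rockwall)"),
]

def classify_slope(dem, cell_size=1.0):
    """Return (class index grid, slope angle grid in degrees) for a DEM."""
    gy, gx = np.gradient(dem, cell_size)                 # elevation gradients
    slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))  # steepest-slope angle
    out = np.zeros(dem.shape, dtype=int)
    for i, (lo, hi, _name) in enumerate(CLASSES):
        out[(slope_deg >= lo) & (slope_deg < hi)] = i
    return out, slope_deg
```

A flat surface falls entirely in the first class, while a uniformly steep ramp is classed as debris-producing terrain.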
This work showed that the analysis of process-related terrain morphology and of susceptibility to slope failure documents the sediment dynamics, allowing a quantitative assessment of the erosion zones leading to debris-flow activity. The development of erosional landforms was evaluated by comparing their geometry with the orientations of potential rock slope failures and with the direction of the maximum joint frequency. Structures in the rock mass, in particular wedge failures and the dominant discontinuities, appear to exert a first-order control on the erosional mechanisms affecting bedrock-dominated catchments. They represent weaknesses that are exploited primarily by mass-wasting processes and erosion, promoting not only the initiation of rock couloirs and gullies but also their propagation. Incorporating the geological control on geomorphic processes contributes to a better understanding of the landscape evolution of active catchments. A sediment flux algorithm was implemented in a sediment cascade model that discretizes the torrent catchment into channel reaches and individual process-response systems. Each conceptual element includes, in a simple manner, geomorphological and sediment flux information derived from GIS and complemented with field mapping. This tool enables the simulation of sediment transfers in channels under evolving debris supply and conveyance, and helps reduce the uncertainty inherent to sediment budget prediction in torrent systems. This work modestly aims to shed light on some aspects of sediment dynamics in torrent systems.
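The sediment cascade idea described above (channel reaches that store sediment, receive upstream flux and hillslope recharge, and pass downstream at most their transport capacity per step) can be sketched as follows. The routing rule and all quantities are illustrative simplifications, not the thesis's model.

```python
# Minimal sketch of a sediment cascade. Each reach holds a sediment store;
# per time step it receives the upstream outflow plus hillslope recharge,
# then exports at most its transport capacity to the next reach downstream.
# All numbers used with this function are invented for illustration.

def cascade_step(storages, recharge, capacity):
    """Advance the cascade one step; return (new storages, catchment output)."""
    inflow = 0.0  # flux entering the uppermost reach
    new = []
    for store, r, cap in zip(storages, recharge, capacity):
        store += inflow + r          # upstream flux + hillslope recharge
        outflow = min(store, cap)    # transport-limited export
        new.append(store - outflow)
        inflow = outflow             # routed to the next reach downstream
    return new, inflow
```

Running repeated steps shows how a debris pulse stored in a headwater reach is metered downstream at the capacity of the intervening reaches.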
Abstract:
Production flow analysis (PFA) is a well-established methodology for transforming a traditional functional layout into a product-oriented layout. The method uses part routings to find natural clusters of workstations, forming production cells able to complete parts and components swiftly with simplified material flow. Once implemented, the scheduling system is based on period batch control, aiming to establish fixed planning, production, and delivery cycles for the whole production unit. PFA is traditionally applied to job shops with functional layouts; after reorganization into groups, lead times fall, quality improves, and personnel motivation rises. Several papers have documented this, yet no research has studied its application to service operations management. This paper aims to show, with real cases, that PFA can be applied not only to job-shop and assembly operations but also to back-office and service processes. The cases clearly show that PFA reduces non-value-adding operations, introduces flow by evening out bottlenecks, and diminishes process variability, all of which contribute to efficient operations management.
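The core clustering step of PFA, grouping workstations that appear together in part routings into candidate cells, can be sketched as follows. This toy version simply takes connected components of the machine-sharing graph; practical PFA uses more refined methods (e.g. rank order clustering on the machine-part incidence matrix), and the routings here are invented.

```python
from collections import defaultdict

# Toy PFA clustering: machines that share a part routing are pulled into
# the same candidate production cell, using union-find to build the
# connected components of the machine-sharing graph.

def find_cells(routings):
    """routings: {part: [machine, ...]} -> list of machine cells (sets)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for machines in routings.values():
        for m in machines:
            find(m)                        # register every machine
        for m in machines[1:]:
            union(machines[0], m)          # same routing -> same cell
    cells = defaultdict(set)
    for m in parent:
        cells[find(m)].add(m)
    return list(cells.values())
```

Two parts routed through an overlapping machine chain end up in one cell, while a machine used by no shared routing forms its own cell.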
Abstract:
The research reported in this series of articles aimed (1) to automate the search of questioned ink specimens in ink reference collections and (2) to evaluate the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way; the latter requirement is due to the large number of comparisons necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of these three stages. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model.
It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinions and is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach, both for the search of ink specimens in ink databases and for the interpretation of their evidential value.
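As a rough illustration of what an automated, objective comparison of ink profiles can look like (this is not the actual algorithm of the cited Part II paper), two HPTLC intensity profiles can be scored with a Pearson correlation and thresholded. The profiles and the threshold value are invented.

```python
import math

# Generic profile comparison: score two intensity profiles by Pearson
# correlation, then apply a crude same-source decision threshold. The
# real comparison algorithms of Neumann & Margot are more elaborate.

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def same_source_score(a, b, threshold=0.95):
    """Crude decision rule: correlation above threshold suggests same ink."""
    return pearson(a, b) >= threshold
```

In a library search, every reference profile would be scored this way and the top-ranked candidates returned; an evidential-value assessment would instead feed such scores into a probabilistic (likelihood-ratio) model.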
Abstract:
Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
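As a toy illustration of the container-format idea, an XML manifest declaring multi-modal data objects can be read with Python's standard library. The element and attribute names below are invented for illustration; they are not the real Connectome File Format schema (see cmtk.org for the actual specification).

```python
import xml.etree.ElementTree as ET

# Hypothetical container manifest: this XML is invented to illustrate the
# idea of declaring multi-modal data objects with structured metadata in
# one container; the real Connectome File Format defines its own schema.
META_XML = """
<connectome version="2.0">
  <metadata species="Homo sapiens" modality="dMRI"/>
  <network name="structural" src="networks/structural.gexf"/>
  <volume name="T1" src="volumes/t1.nii.gz"/>
</connectome>
"""

def list_objects(xml_text):
    """Return (tag, name) pairs for each named data object in the manifest."""
    root = ET.fromstring(xml_text)
    return [(child.tag, child.get("name")) for child in root if child.get("name")]
```

A viewer plugin could dispatch on the tag to choose a loader per object type (network, volume, track file, and so on).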
Abstract:
BACKGROUND: Fever upon return from tropical or subtropical regions can be caused by diseases that are rapidly fatal if left untreated. The differential diagnosis is wide, and physicians often lack the knowledge needed to care appropriately for such patients. OBJECTIVE: To develop practice guidelines for the initial evaluation of patients presenting with fever upon return from a tropical or subtropical country, in order to reduce delays and potentially fatal outcomes and to improve physicians' knowledge. TARGET AUDIENCE: Medical personnel, usually physicians, who see returning patients, primarily in an ambulatory setting or in a hospital emergency department, and specialists in internal medicine, infectious diseases, and travel medicine. METHOD: A systematic review of the literature--mainly extracted from the National Library of Medicine database--was performed between May 2000 and April 2001, using the keywords fever and/or travel and/or migrant and/or guidelines. Eventually, 250 articles were reviewed. The relevant elements of evidence were combined with expert knowledge to construct a branching algorithm flagging the level of specialization required to deal with each situation. The proposed diagnoses and treatment plans are restricted to tropical or subtropical (nonautochthonous) diseases. The decision chart is accompanied by a detailed document that provides, for each level of the tree, the degree of evidence and grade of recommendation as well as the key points of debate. PARTICIPANTS AND CONSENSUS PROCESS: Besides the 4 authors (2 specialists in travel/tropical medicine, 1 clinical epidemiologist, and 1 resident physician), a panel of 11 European physicians with different levels of expertise in travel medicine reviewed the guidelines. Thereafter, each point of the proposed recommendations was discussed with 15 experts in travel/tropical medicine from various continents. A final version was produced and submitted for evaluation to all participants. CONCLUSION: Although the quality of evidence was limited by the paucity of clinical studies, these guidelines, established with the support of a large and highly experienced panel, should help physicians deal with patients returning from the tropics with fever.
Abstract:
The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications owing to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control, and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA, and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike CISC processors, the RISC processor architecture business is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface, and application market, which gives customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based, or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability on a very narrow customer base, thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication, and automation (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, firstly through research on cost-competitiveness efforts in captive outsourcing of engineering, research, and development, and secondly by researching process re-engineering in the case of global software support for complex systems. Thirdly, we investigate the views of industry actors, namely customers, incumbents, and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, taking into account the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux, and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung, and others, has created new markets for personal computers, smartphones, and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to commoditized industrial automation will not happen in the near future.
Abstract:
Correlative fluorescence and electron microscopy has become an indispensable tool for research in cell biology. The integrated laser and electron microscope (iLEM) combines a fluorescence microscope (FM) and a transmission electron microscope (TEM) in one set-up. This unique imaging tool allows rapid identification of a region of interest with the FM and subsequent high-resolution TEM imaging of the same area. Sample preparation is one of the major challenges in correlative microscopy of a single specimen: it needs to be suitable for both FM and TEM imaging. For iLEM, the performance of the fluorescent probe must not be impaired by the vacuum of the TEM. In this technical note, we compare the fluorescence intensity of six fluorescent probes in a dry, oxygen-free environment relative to their performance in water. We demonstrate that the intensity of some fluorophores is strongly influenced by their surroundings, which should be taken into account in the design of the experiment. Furthermore, a freeze-substitution and Lowicryl resin embedding protocol is described that yields excellent membrane contrast in the TEM while preventing quenching of the fluorescent immuno-labeling. The embedding protocol results in a single specimen preparation procedure that performs well in both FM and TEM. Such procedures are not only essential for the iLEM but also of great value to other correlative microscopy approaches.
Abstract:
Activation dynamics of hippocampal subregions during spatial learning, and their interplay with neocortical regions, are an important dimension in the understanding of hippocampal function. Using the (14C)-2-deoxyglucose autoradiographic method, we characterized the metabolic changes occurring in hippocampal subregions in mice learning an eight-arm radial maze task. Autoradiogram densitometry revealed a heterogeneous and evolving pattern of enhanced metabolic activity throughout the hippocampus during the training period and on recall. In the early stages of training, activity was enhanced in the CA1 area from the intermediate portion to the posterior end, as well as in the CA3 area within the intermediate portion of the hippocampus. At later stages, CA1 and CA3 activations spread over the entire longitudinal axis, while dentate gyrus (DG) activation occurred from the anterior to the intermediate zone. Activation of the retrosplenial cortex, but not the amygdala, was also observed during the learning process. On recall, only DG activation was observed, in the same anterior part of the hippocampus. These results suggest a functional segmentation of the hippocampus, each subregion being dynamically but also differentially recruited across acquisition, consolidation, and retrieval, in parallel with some neocortical sites.
Abstract:
This paper explores the role of international standards in the much-debated globalisation of the service economy. Various strands of economic analysis consider that core attributes of services affect their ability to be reliably delocalised, industrialised, and standardised. In contrast, international political economy approaches draw attention to power configurations supporting conflicting uses of standards across industries and nations. The paper examines the case of the rising Indian service industry in customer centres and business process outsourcing to probe these opposing views.
Our findings suggest that standards matter in types of services that conventional economic analyses identify as unlikely to be standardised, and that the standards used in the Indian BPO industry are widely accepted. Despite little conflict in actual definitions of market requirements, an international political economy perspective on service standardisation highlights the importance of potential power issues related to workers', consumers', and environmental concerns likely to be included in more progressive forms of standardisation.
Abstract:
The implementation of new imaging techniques in the daily practice of the radiation oncologist has been a major advance of the last 10 years, allowing optimization of the therapeutic intervals and locoregional control of the disease while limiting side effects. Among these techniques, positron emission tomography (PET) offers the clinician data on tumoral biological mechanisms while benefiting from the morphological images of the computed tomography (CT) scan. Hybrid PET/CT has recently been developed, and numerous studies have aimed at optimizing its use in treatment planning, evaluation of treatment response, and prognostication. The choice of radiotracer (according to the type of cancer and the biological mechanism studied) and the various methods of tumoral delineation require regular updating to optimize practice. We propose in this article an exhaustive review of research published (and in press) up to December 2011, as a user guide to PET/CT in all aspects of modern radiotherapy (from diagnosis to follow-up): biopsy guidance, optimization of treatment planning and dosimetry, evaluation of tumor response and prognostic value, and follow-up with early discrimination of recurrence from tumoral necrosis. For didactic purposes, each of these aspects is approached by primary tumor location and illustrated with representative examples. The current contribution of PET/CT and its development perspectives are described to offer the radiation oncologist a clear and up-to-date overview of this expanding domain.
Abstract:
The linking of North and South America by the Isthmus of Panama had major impacts on global climate, oceanic and atmospheric currents, and biodiversity, yet the timing of this critical event remains contentious. The Isthmus is traditionally understood to have fully closed by ca. 3.5 million years ago (Ma), and this date has been used as a benchmark for oceanographic, climatic, and evolutionary research, but recent evidence suggests a more complex geological formation. Here, we analyze both molecular and fossil data to evaluate the tempo of biotic exchange across the Americas in light of geological evidence. We demonstrate significant waves of dispersal of terrestrial organisms at ca. 20 and 6 Ma and corresponding events separating marine organisms in the Atlantic and Pacific oceans at ca. 23 and 7 Ma. Dispersal directions and rates were symmetrical until the last ca. 6 Ma, when northward migration of South American lineages increased significantly. Variability among taxa in their timing of dispersal or vicariance across the Isthmus is not explained by the ecological factors tested in these analyses, including biome type, dispersal ability, and elevation preference. Migration was therefore not generally regulated by intrinsic traits but more likely reflects the presence of emergent terrain several million years earlier than commonly assumed. These results indicate that the dramatic biotic turnover associated with the Great American Biotic Interchange was a long and complex process that began as early as the Oligocene-Miocene transition.
Abstract:
Successful generation of high-producing cell lines requires generating cell clones that express the recombinant protein at high levels and characterizing the clones' ability to maintain stable expression levels. The use of cis-acting epigenetic regulatory elements that improve this otherwise long and uncertain process has revolutionized recombinant protein production. Here we review and discuss new insights into the molecular mode of action of matrix attachment regions (MARs) and ubiquitously acting chromatin opening elements (UCOEs), i.e. cis-acting elements, and how these elements are being used to improve recombinant protein production. These elements can help maintain the chromatin environment of the transgene's genomic integration locus in a transcriptionally favorable state, which increases both the number of positive clones and the transgene expression levels. Moreover, high-producing clones tend to be more stable in long-term culture, even in the absence of selection pressure. Therefore, by increasing the probability of isolating a high-producing clone, as well as by increasing transcription efficiency and stability, these elements can significantly reduce the time and cost required to produce large quantities of recombinant proteins.