8 results for Submarine Pipelines
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
This paper is the maritime and sub-Antarctic contribution to the Scientific Committee on Antarctic Research (SCAR) Past Antarctic Ice Sheet Dynamics (PAIS) community Antarctic Ice Sheet reconstruction. The overarching aim for all sectors of Antarctica was to reconstruct the Last Glacial Maximum (LGM) ice sheet extent and thickness, and to map the subsequent deglaciation in a series of 5000-year time slices. However, our review of the literature found surprisingly few high-quality chronological constraints on changing glacier extents on these timescales in the maritime and sub-Antarctic sector. Therefore, in this paper we focus on an assessment of the terrestrial and offshore evidence for the LGM ice extent, establishing minimum ages for the onset of deglaciation, and separating evidence of deglaciation from LGM limits from that associated with later Holocene glacier fluctuations. Evidence included geomorphological descriptions of glacial landscapes, radiocarbon-dated basal peat and lake sediment deposits, cosmogenic isotope ages of glacial features, and molecular biological data. We propose a classification of the glacial history of the maritime and sub-Antarctic islands based on this assembled evidence: (Type I) islands which accumulated little or no LGM ice; (Type II) islands with a limited LGM ice extent but evidence of extensive earlier continental shelf glaciations; (Type III) seamounts and volcanoes unlikely to have accumulated significant LGM ice cover; (Type IV) islands on shallow shelves with both terrestrial and submarine evidence of LGM (and/or earlier) ice expansion; (Type V) islands north of the Antarctic Polar Front with terrestrial evidence of LGM ice expansion; and (Type VI) islands with no data. Finally, we review the climatological and geomorphological settings that separate the glaciological history of the islands within this classification scheme.
Abstract:
Starting from Avner Offer’s observation that the First World War was not only a war of steel and gold, but also of bread and potatoes (1989: 1), and drawing on my own research on British and Australian preparations for economic warfare, based on sources from the Entente and the Central Powers as well as from the United States, Canada and Australia, my presentation will focus on the interdependence of the measures taken by Entente and Central Power authorities in the second half of 1916. Already a year earlier, both sides had become aware that this war would not be decided on the battlefield alone, and that access to primary as well as secondary resources would be decisive. Accordingly, measures that could strike the enemy in this field were increasingly discussed and put into place, and this at a time when weather conditions reduced harvests across Europe, North America and Argentina.
Abstract:
An unusual case is presented of a tourist who developed fatal cerebral air embolism, pneumomediastinum and pneumopericardium while ascending from low altitude to Europe's highest railway station. Presumably the air embolism originated from the rupture of an unsuspected bronchogenic cyst as a result of pressure changes during the ascent. Cerebral air embolism has been observed during surgery, in scuba diving accidents, during submarine escape and, less frequently, during exposure to very high altitude. People with known bronchogenic cysts should be informed about the risk of cerebral air embolism, and surgical removal should be considered. Cerebral air embolism is a rare cause of coma and stroke in all activities involving rapid air pressure changes, including alpine tourism, as our unfortunate tourist illustrates.
Abstract:
With its smaller size, well-known boundary conditions, and the availability of detailed bathymetric data, Lake Geneva’s subaquatic canyon in the Rhone Delta is an excellent analogue to understand sedimentary processes in deep-water submarine channels. A multidisciplinary research effort was undertaken to unravel the sediment dynamics in the active canyon. This approach included innovative coring using the Russian MIR submersibles, in situ geotechnical tests, and geophysical, sedimentological, geochemical and radiometric analysis techniques. The canyon floor/levee complex is characterized by a classic turbiditic system with frequent spillover events. Sedimentary evolution in the active canyon is controlled by a complex interplay between erosion and sedimentation processes. In situ profiling of sediment strength in the upper layer was tested using a dynamic penetrometer and suggests that erosion is the governing mechanism in the proximal canyon floor while sedimentation dominates in the levee structure. Sedimentation rates progressively decrease down-channel along the levee structure, with accumulation exceeding 2.6 cm/year in the proximal levee. A decrease in the frequency of turbidites upwards along the canyon wall suggests a progressive confinement of the flow through time. The multi-proxy methodology has also enabled a qualitative slope-stability assessment in the levee structure. The rapid sediment loading, slope undercutting and over-steepening, and increased pore pressure due to high methane concentrations hint at a potential instability of the proximal levees. Furthermore, discrete sandy intervals show very high methane concentrations and low shear strength and thus could correspond to potentially weak layers prone to scarp failures.
Abstract:
Code clone detection helps connect developers across projects, if we do it on a large scale. The cornerstones that allow clone detection to work on a large scale are: (1) bad hashing, (2) lightweight parsing using regular expressions, and (3) MapReduce pipelines. Bad hashing means determining whether or not two artifacts are similar by checking whether their hashes are identical. We show a bad hashing scheme that works well on source code. Lightweight parsing using regular expressions is our technique for obtaining entire parse trees from regular expressions, robustly and efficiently. We detail the algorithm and implementation of one such regular expression engine. MapReduce pipelines are a way of expressing a computation such that it can be parallelized automatically and simply. We detail the design and implementation of one such MapReduce pipeline that is efficient and debuggable. We show a clone detector that combines these cornerstones to detect code clones across all projects and across all versions of each project.
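The abstract leaves the actual hashing scheme to the full text; purely as an illustration, the following minimal Python sketch shows the general idea behind "bad hashing" for clone detection: normalize source fragments aggressively so that near-identical fragments collapse to the same string, then declare two fragments similar exactly when their hashes match. The normalization rules and function names here are hypothetical, not the paper's.

import hashlib
import re

def normalize(source: str) -> str:
    """Aggressively normalize a source fragment so that near-clones
    collapse to the same string (hypothetical rules, for illustration)."""
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.DOTALL)  # strip block comments
    source = re.sub(r"//[^\n]*", "", source)                    # strip line comments
    source = re.sub(r"\b[A-Za-z_]\w*\b", "ID", source)          # erase identifiers/keywords
    return re.sub(r"\s+", "", source)                           # erase all whitespace

def bad_hash(source: str) -> str:
    # Two fragments count as similar iff their hashes are identical.
    return hashlib.sha1(normalize(source).encode()).hexdigest()

a = "int add(int x, int y) { return x + y; }  // sum"
b = "int plus(int a, int b) { return a + b; }"
print(bad_hash(a) == bad_hash(b))  # True: clones under this normalization

Because similarity reduces to hash equality, grouping candidate clones becomes a key-grouping step, which maps directly onto the shuffle phase of a MapReduce pipeline.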
Abstract:
Complete transcriptomic data at high resolution are available only for a few model organisms of medical importance. The gene structures of non-model organisms are mostly computationally predicted based on comparative genomics with other species. As a result, more than half of the horse gene models are known only by projection, and experimental data supporting these gene models are scarce. Moreover, most of the annotated equine genes are single-transcript genes. With RNA sequencing (RNA-seq), experimental validation of predicted transcriptomes has become accessible at reasonable cost. To improve the horse genome annotation, we performed RNA-seq on 561 samples of peripheral blood mononuclear cells (PBMCs) derived from 85 Warmblood horses. The mapped sequencing reads were used to build a new transcriptome assembly. The new assembly revealed many alternative isoforms associated with known genes or with those predicted by the Ensembl and/or Gnomon pipelines. We also identified 7,531 transcripts not associated with any horse gene annotated in public databases. Of these, 3,280 transcripts had no homologous match to any sequence deposited in the NCBI EST database, suggesting horse specificity. The unknown transcripts were categorized as coding or noncoding based on predicted coding potential scores. Among them, 230 transcripts had a high coding potential score, at least 2 exons, and an open reading frame of at least 300 nt. We experimentally validated 9 new equine coding transcripts using RT-PCR and Sanger sequencing. Our results provide valuable detailed information on many transcripts yet to be annotated in the horse genome.
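As a purely illustrative aid, the coding/noncoding triage described above (high coding potential score, at least 2 exons, ORF of at least 300 nt) can be sketched as a simple filter. The Transcript record, its field names, and the 0.5 score cutoff below are assumptions: the abstract names neither the scoring tool nor its threshold.

from dataclasses import dataclass
from typing import List

@dataclass
class Transcript:
    transcript_id: str
    exon_count: int
    longest_orf_nt: int       # length of the longest open reading frame, in nt
    coding_potential: float   # hypothetical score; the abstract does not name the tool

def putative_coding(transcripts: List[Transcript],
                    min_score: float = 0.5) -> List[Transcript]:
    """Apply the abstract's coding criteria: high coding potential score,
    at least 2 exons, and an ORF of at least 300 nt. The min_score cutoff
    is an assumption of this sketch."""
    return [t for t in transcripts
            if t.coding_potential >= min_score
            and t.exon_count >= 2
            and t.longest_orf_nt >= 300]

novel = [Transcript("TX0001", 3, 450, 0.91),
         Transcript("TX0002", 1, 600, 0.88),   # fails the exon criterion
         Transcript("TX0003", 4, 150, 0.95)]   # fails the ORF criterion
print([t.transcript_id for t in putative_coding(novel)])  # ['TX0001']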
Abstract:
With the ongoing shift in the computer graphics industry toward Monte Carlo rendering, there is a need for effective, practical noise-reduction techniques that are applicable to a wide range of rendering effects and easily integrated into existing production pipelines. This course surveys recent advances in image-space adaptive sampling and reconstruction algorithms for noise reduction, which have proven very effective at reducing the computational cost of Monte Carlo techniques in practice. These approaches combine advanced image-filtering techniques with statistical methods for error estimation. They are attractive because they can be integrated easily into conventional Monte Carlo rendering frameworks, they are applicable to most rendering effects, and their computational overhead is modest.
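The course surveys a family of published filters rather than a single algorithm. Purely as a toy sketch of the shared idea (use a per-pixel statistical error estimate to decide how aggressively to filter), consider the following; the function name, the box kernel, and the variance-based blend are all assumptions of this illustration, not any specific published method.

import numpy as np

def variance_weighted_filter(color, variance, radius=2, k=1.0):
    """Smooth a Monte Carlo render with a box kernel whose per-pixel
    blend weight grows with the estimated sample variance: noisy pixels
    lean on their neighborhood, converged pixels stay sharp."""
    h, w, _ = color.shape
    padded = np.pad(color, ((radius, radius), (radius, radius), (0, 0)), mode="edge")
    # Box-filtered (smoothed) image, accumulated over the kernel window.
    smooth = np.zeros_like(color)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            smooth += padded[radius + dy:radius + dy + h,
                             radius + dx:radius + dx + w]
    smooth /= (2 * radius + 1) ** 2
    # Per-pixel blend: alpha -> 1 where variance is high (trust the filter),
    # alpha -> 0 where variance is low (trust the raw samples).
    alpha = variance / (variance + k)
    return (1.0 - alpha[..., None]) * color + alpha[..., None] * smooth

# Hypothetical inputs: a noisy 64x64 RGB render and its per-pixel sample variance.
rng = np.random.default_rng(0)
color = rng.random((64, 64, 3))
variance = rng.random((64, 64))
print(variance_weighted_filter(color, variance).shape)  # (64, 64, 3)

The surveyed production methods replace the box kernel with edge-aware filters and replace the crude variance blend with principled error estimators, but the image-space, post-process structure is the same, which is what makes them easy to bolt onto an existing renderer.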