948 results for tribunals (platforms)


Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Currently, two main technologies are used for screening of DNA copy number: BAC (Bacterial Artificial Chromosome) arrays and the recently developed oligonucleotide-based CGH (Comparative Genomic Hybridization) arrays, which are capable of detecting small genomic regions with amplification or deletion. The correlation and the discriminative power of these platforms have never been compared statistically on a significant set of human patient samples.

RESULTS: In this paper, we present an exhaustive comparison between the two CGH platforms, undertaken at two independent sites using the same batch of DNA from 19 advanced prostate cancers. The comparison was performed directly on the raw data, and a significant correlation was found between the two platforms. The correlation was greatly improved when the data were averaged over large chromosomal regions using a segmentation algorithm. In addition, this analysis enabled the development of a statistical model to discriminate BAC outliers that might indicate microevents; these microevents were validated by the oligo platform results.

CONCLUSION: This article presents a genome-wide statistical validation of the oligo array platform on a large set of patient samples and statistically demonstrates its superiority over the BAC platform for the identification of chromosomal events. Taking advantage of a large set of human samples analysed with the two technologies, a statistical model was developed to show that the BAC platform can also detect microevents.
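A minimal sketch (not taken from the paper) of the two analysis steps described above, assuming aligned per-probe log2 ratios from both platforms and a segmentation already computed; it shows the raw probe-level correlation and the improved correlation after averaging within segments:

```python
import numpy as np

def segment_means(values, boundaries):
    """Average probe-level log2 ratios within each segment.

    `boundaries` is a list of (start, end) probe-index pairs, e.g. as
    produced by a segmentation algorithm (hypothetical here).
    """
    return np.array([values[s:e].mean() for s, e in boundaries])

rng = np.random.default_rng(0)
truth = np.repeat(rng.normal(0, 0.5, 20), 50)          # shared copy-number signal
bac   = truth + rng.normal(0, 0.3, truth.size)          # BAC array, with noise
oligo = truth + rng.normal(0, 0.3, truth.size)          # oligo array, with noise

segs = [(i, i + 50) for i in range(0, truth.size, 50)]  # toy segmentation
print("raw probe correlation:     ", np.corrcoef(bac, oligo)[0, 1])
print("segment-level correlation: ", np.corrcoef(segment_means(bac, segs),
                                                 segment_means(oligo, segs))[0, 1])
```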

Relevance:

10.00%

Publisher:

Abstract:

WHIRLBOB, also known as STRIBOBr2, is an AEAD (Authenticated Encryption with Associated Data) algorithm derived from STRIBOBr1 and the Whirlpool hash algorithm. WHIRLBOB/STRIBOBr2 is a second-round candidate in the CAESAR competition. As with STRIBOBr1, the reduced-size Sponge design has a strong provable security link with a standardized hash algorithm. The new design utilizes only the LPS or ρ component of Whirlpool in a flexibly domain-separated BLNK Sponge mode. The number of rounds is increased from 10 to 12 as a countermeasure against Rebound Distinguishing attacks. The 8×8-bit S-Box used by Whirlpool and WHIRLBOB is constructed from 4×4-bit “MiniBoxes”. We report on fast constant-time Intel SSSE3 and ARM NEON SIMD WHIRLBOB implementations that keep full miniboxes in registers and access them via SIMD shuffles. This is an efficient countermeasure against AES-style cache-timing side-channel attacks. Another main advantage of WHIRLBOB over STRIBOBr1 (and most other AEADs) is its greatly reduced implementation footprint on lightweight platforms. On many lower-end microcontrollers the total software footprint of π+BLNK = WHIRLBOB AEAD is less than half a kilobyte. We also report an FPGA implementation that requires 4,946 logic units for a single round of WHIRLBOB, which compares favorably to the 7,972 required for Keccak/Keyak on the same target platform. The relatively small S-Box gate count also enables efficient 64-bit bitsliced straight-line implementations. We finally present some discussion and analysis of the relationships between WHIRLBOB, Whirlpool, the Russian GOST Streebog hash, and the recent draft Russian Encryption Standard Kuznyechik.
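The minibox construction is what makes the register-resident SIMD lookups possible: a 16-entry table fits in a single SIMD register and can be indexed with shuffles instead of secret-dependent memory loads. The sketch below composes three hypothetical 4-bit miniboxes into an 8-bit S-box in a Lai-Massey-style arrangement; the tables and wiring are invented for illustration and are not Whirlpool's actual E, E⁻¹ and R.

```python
# Hypothetical 4-bit miniboxes; the real Whirlpool tables differ.
MINI_A = [0x1, 0xB, 0x9, 0xC, 0xD, 0x6, 0xF, 0x3,
          0xE, 0x8, 0x7, 0x4, 0xA, 0x2, 0x5, 0x0]
MINI_B = [MINI_A.index(i) for i in range(16)]           # inverse of MINI_A
MINI_R = [0x7, 0xC, 0xB, 0xD, 0xE, 0x4, 0x9, 0xF,
          0x6, 0x3, 0x8, 0xA, 0x2, 0x5, 0x1, 0x0]

def sbox(x):
    """Compose 4-bit miniboxes into one 8-bit S-box lookup.

    Illustrates the construction style only; Whirlpool's exact wiring
    is documented in its specification.
    """
    hi, lo = x >> 4, x & 0xF
    hi, lo = MINI_A[hi], MINI_B[lo]
    r = MINI_R[hi ^ lo]                 # mixing step between the halves
    return (MINI_A[hi ^ r] << 4) | MINI_B[lo ^ r]

table = [sbox(x) for x in range(256)]
assert len(set(table)) == 256           # the composition is a permutation
```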

Relevance:

10.00%

Publisher:

Abstract:

This document describes the cryptographic hash function BLAKE2 and makes the algorithm specification and C source code conveniently available to the Internet community. BLAKE2 comes in two main flavors: BLAKE2b is optimized for 64-bit platforms and BLAKE2s for smaller architectures. BLAKE2 can be directly keyed, making it functionally equivalent to a Message Authentication Code (MAC).
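Both flavors, including the keyed MAC mode, are available in Python's standard hashlib, which makes the behaviour easy to demonstrate:

```python
import hashlib

# Unkeyed hash: BLAKE2b with the default 64-byte digest.
h = hashlib.blake2b(b"hello world")
print(h.hexdigest())

# Keyed mode: the key is mixed into the initialization, so the result
# can serve directly as a MAC without an HMAC construction.
mac = hashlib.blake2b(b"hello world", key=b"secret key", digest_size=32)
print(mac.hexdigest())

# BLAKE2s is the variant for smaller architectures (32-byte max digest).
print(hashlib.blake2s(b"hello world").hexdigest())
```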

Relevance:

10.00%

Publisher:

Abstract:

The growing accessibility of genomic resources through next-generation sequencing (NGS) technologies has revolutionized the application of molecular genetic tools to ecology and evolutionary studies in non-model organisms. Here we present the case study of the European hake (Merluccius merluccius), one of the most important demersal resources of European fisheries. Two sequencing platforms, the Roche 454 FLX (454) and the Illumina Genome Analyzer (GAII), were used for Single Nucleotide Polymorphism (SNP) discovery in the hake muscle transcriptome. De novo transcriptome assembly into unique contigs, annotation, and in silico SNP detection were carried out in parallel for the 454 and GAII sequence data. High-throughput genotyping using the Illumina GoldenGate assay was performed to validate 1,536 putative SNPs. Validation results were analysed to compare the performance of the 454 and GAII methods and to evaluate the role of several variables (e.g. sequencing depth, intron-exon structure, sequence quality and annotation). Despite well-known differences in sequence length and throughput, the two approaches showed similar assay conversion rates (approximately 43%) and percentages of polymorphic loci (67.5% and 63.3% for GAII and 454, respectively). Both NGS platforms therefore proved suitable for large-scale identification of SNPs in transcribed regions of non-model species, although the lack of a reference genome profoundly affects the genotyping success rate. The overall efficiency, however, can be improved by using strict quality and filtering criteria for SNP selection (sequence quality, intron-exon structure, target region score).
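A minimal sketch of the kind of strict SNP selection the authors describe; the record fields and thresholds here are hypothetical, not those of the study:

```python
# Hypothetical record fields and thresholds, for illustration only.
putative_snps = [
    {"id": "snp_001", "quality": 38, "target_score": 0.91, "near_intron_boundary": False},
    {"id": "snp_002", "quality": 22, "target_score": 0.85, "near_intron_boundary": False},
    {"id": "snp_003", "quality": 40, "target_score": 0.55, "near_intron_boundary": True},
]

def passes_filters(snp, min_quality=30, min_target_score=0.6):
    """Apply the selection criteria named in the abstract: sequence
    quality, target-region score and intron-exon structure."""
    return (snp["quality"] >= min_quality
            and snp["target_score"] >= min_target_score
            and not snp["near_intron_boundary"])

selected = [s["id"] for s in putative_snps if passes_filters(s)]
print(selected)   # -> ['snp_001']
```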

Relevance:

10.00%

Publisher:

Abstract:

In the reinsurance market, the risks that natural catastrophes pose to portfolios of properties must be quantified so that they can be priced and insurance offered. The analysis of such risks at a portfolio level requires a simulation of up to 800,000 trials with an average of 1,000 catastrophic events per trial. This is sufficient to capture risk for a global multi-peril reinsurance portfolio covering a range of perils, including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge, riverine flooding and wildfire. Such simulations are both computation and data intensive, making the application of high-performance computing techniques desirable.

In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand-tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units (GPUs) and Intel Xeon Phi many-core accelerators.
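A toy numpy sketch of this simulation structure, with invented frequencies and loss distributions: primary uncertainty appears as Poisson event occurrence per simulated year, secondary uncertainty as a random factor around each event's modelled mean loss, and probable maximum loss is read off as a high quantile of the annual losses:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000           # simulated years (the paper cites up to 800,000)
mean_events = 1_000         # average catastrophic events per trial

annual_loss = np.zeros(n_trials)
for i in range(n_trials):
    n_events = rng.poisson(mean_events)                  # primary uncertainty
    mean_losses = rng.lognormal(mean=10.0, sigma=2.0, size=n_events)
    # Secondary uncertainty: the realised loss varies around each mean.
    annual_loss[i] = np.sum(mean_losses * rng.lognormal(0.0, 0.5, n_events))

# Probable maximum loss at the 1-in-250-year return period.
print("PML(250y):", np.quantile(annual_loss, 1 - 1 / 250))
```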

Relevance:

10.00%

Publisher:

Abstract:

The risks associated with zoonotic infections transmitted by companion animals are a serious public health concern: controlling the incidence of zoonoses in domestic dogs, both owned and stray, is hence important to protect human health. Integrated dog population management (DPM) programs, based on the availability of information systems providing reliable data on the structure and composition of the existing dog population in a given area, are fundamental for making realistic plans for any disease surveillance and action system. Traceability systems, based on the compulsory electronic identification of dogs and their registration in a computerised database, are one of the most effective ways to ensure the usefulness of DPM programs. Although this approach provides many advantages, several areas for improvement have emerged in countries where it has been applied. In Italy, every region hosts its own dog register, but these are not compatible with one another. This paper shows the advantages of a web-based application for improving the data management of regional dog registers. The approach used for building this system was inspired by farm animal traceability schemes, and it relies on a network of services that allows multi-channel access by different devices and data exchange via the web with other existing applications, without changing the pre-existing platforms. Today the system manages a database of over 300,000 dogs registered in three different Italian regions. By integrating multiple Web Services, this approach could be the solution to gathering data at national and international levels at reasonable cost, creating a large-scale, cross-border traceability system that can be used for disease surveillance and the development of population management plans.
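As a rough illustration of the service-oriented idea (not the system described in the paper), the sketch below exposes a hypothetical in-memory register as a small JSON web service that other applications could query without touching the underlying platform:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory stand-in for a regional dog register.
REGISTER = {
    "380260000123456": {"breed": "mixed", "region": "Abruzzo", "owned": True},
}

class RegisterHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Illustrative path format: /dogs/<microchip-code>
        code = self.path.rsplit("/", 1)[-1]
        record = REGISTER.get(code)
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(record or {"error": "not found"}).encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), RegisterHandler).serve_forever()
```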

Relevance:

10.00%

Publisher:

Abstract:

The recent explosion of genetic and clinical data generated from tumor genome analysis presents an unparalleled opportunity to enhance our understanding of cancer, but this opportunity is compromised by the reluctance of many in the scientific community to share datasets and by the lack of interoperability between different data platforms. The Global Alliance for Genomics and Health is addressing these barriers and challenges through a cooperative framework that encourages "team science" and responsible data sharing, complemented by the development of a series of application programming interfaces (APIs) that link different data platforms, thus breaking down traditional silos and liberating the data to enable new discoveries and ultimately benefit patients.
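As a purely hypothetical illustration of the idea, the client sketch below queries an invented GA4GH-style variant endpoint; the real Alliance APIs (such as Beacon) differ in URL and schema:

```python
import json
from urllib import request, parse

# Hypothetical endpoint: a uniform query interface over otherwise
# siloed data platforms. The URL and parameters are invented.
BASE = "https://genomics.example.org/api/variants"

def query_variant(chrom, pos, ref, alt):
    """Ask a (hypothetical) federated service whether a variant is known."""
    url = BASE + "?" + parse.urlencode(
        {"chrom": chrom, "pos": pos, "ref": ref, "alt": alt})
    with request.urlopen(url) as resp:
        return json.load(resp)

# Example call (commented out: the endpoint above does not exist):
# print(query_variant("7", 140753336, "A", "T"))
```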

Relevance:

10.00%

Publisher:

Abstract:

Understanding how US imperial strategy is sustained by tourism and militarism requires an account of how American soldiers learn to understand themselves in relation to a variety of marginalized others. This paper explores how the US Army’s ‘Ready and Resilient’ (R2) campaign constructs soldier/other relations by mobilizing off-duty time through the ‘Better Opportunities for Single Soldiers’ (BOSS) program. BOSS’s first two platforms of ‘Well-Being’ and ‘Community Service’ feed into the R2 agenda by producing highly skilled leaders (who govern a disengaged rank and file) and benevolent humanitarians (who provide charity for abject civilians). When these dispositions are transposed into BOSS’s third platform of ‘Recreation and Leisure’, soldiers turn away from the goals of leadership and humanitarianism to reveal the privileged narcissism underscoring the R2 agenda. This self-focus is intensified by familiar power relations in the tourism industry as soldiers pursue self-improvement by commodifying, distancing and effacing local tourist workers. Using the BOSS program as a case study, this paper critically interrogates how the US Army is assimilating off-duty practices of tourism, leisure and recreation into the wider program of resilience training.

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Transdermal drug delivery is the movement of drugs across the skin for absorption into the systemic circulation. Transfer of the drug can occur via passive or active means; passive transdermal products do not disrupt the stratum corneum to facilitate delivery, whereas active technologies do. Due to the very specific physicochemical properties necessary for successful passive transdermal drug delivery, this sector of the pharmaceutical industry is relatively small. There are many well-documented benefits of this delivery route, however, and as a result there is great interest in increasing the number of therapeutic substances that can be delivered transdermally. Areas Covered: This review discusses the various transdermal products that are currently, or have been, marketed, and the paths that led to their success, or lack thereof. Both passive and active transdermal technologies are considered, with the advantages and limitations of each highlighted. In addition to marketed products, technologies under investigation by various pharmaceutical companies are reviewed. Expert Opinion: Passive transdermal drug delivery has made limited progress in recent years; however, with the ongoing intense research into active technologies, there is great potential for growth within the transdermal delivery market. A number of active technologies have already been translated into marketed products, with other platforms, including microneedles, rapidly progressing towards commercialisation.

Relevance:

10.00%

Publisher:

Abstract:

Exascale computation is the next target of high-performance computing. In the push to create exascale computing platforms, simply increasing the number of hardware devices is not an acceptable option, given the limitations of power consumption, heat dissipation, and programming models designed for current hardware platforms. Instead, new hardware technologies, coupled with improved programming abstractions and more autonomous runtime systems, are required to achieve this goal. This position paper presents the design of a new runtime for a new heterogeneous hardware platform being developed to explore energy-efficient, high-performance computing. By combining a number of different technologies, this framework will both simplify the programming of current and future HPC applications and automate the scheduling of data and computation across this new hardware platform. In particular, this work explores the use of FPGAs to achieve both the power and performance goals of exascale, as well as the use of the runtime to automatically effect dynamic configuration and reconfiguration of these platforms.
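A toy sketch of the "autonomous runtime" idea under invented assumptions: two stand-in device back-ends and a scheduler that routes each task to whichever device has the best observed runtime so far:

```python
import time

# Toy stand-ins for device back-ends; a real runtime would dispatch to
# native CPU code or a configured FPGA bitstream (names here are invented).
def run_on_cpu(task):
    time.sleep(0.002)    # pretend CPU execution

def run_on_fpga(task):
    time.sleep(0.001)    # pretend FPGA execution after reconfiguration

class Runtime:
    """Schedule each task on the device with the best observed mean
    runtime so far, probing every device at least once first."""
    def __init__(self):
        self.backends = {"cpu": run_on_cpu, "fpga": run_on_fpga}
        self.history = {name: [] for name in self.backends}

    def submit(self, task):
        unprobed = [d for d, h in self.history.items() if not h]
        device = unprobed[0] if unprobed else min(
            self.history,
            key=lambda d: sum(self.history[d]) / len(self.history[d]))
        start = time.perf_counter()
        self.backends[device](task)
        self.history[device].append(time.perf_counter() - start)
        return device

rt = Runtime()
print([rt.submit(i) for i in range(5)])   # settles on the faster device
```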

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an approach to COLREGs-compliant ship navigation. A system architecture is proposed, which will be implemented and tested on two platforms: in networked bridge simulators and in sea trials using an autonomous unmanned surface vessel. Attention is paid to the collision avoidance software and its risk mitigation.
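The abstract does not detail the collision avoidance algorithm itself; a standard building block in this domain (assumed here, not taken from the paper) is the distance and time of closest point of approach (DCPA/TCPA) between own ship and a target:

```python
import numpy as np

def cpa_tcpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Closest point of approach (distance) and time to it, for two
    vessels moving with constant velocity in a 2-D plane."""
    rel_pos = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    rel_vel = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    speed2 = rel_vel @ rel_vel
    tcpa = 0.0 if speed2 == 0 else max(0.0, -(rel_pos @ rel_vel) / speed2)
    dcpa = np.linalg.norm(rel_pos + tcpa * rel_vel)
    return dcpa, tcpa

# Own ship heading east at 10 kn; target 2 nm north, heading south at 8 kn.
dcpa, tcpa = cpa_tcpa((0, 0), (10, 0), (0, 2), (0, -8))
print(f"DCPA = {dcpa:.2f} nm at TCPA = {tcpa:.2f} h")
```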

Relevance:

10.00%

Publisher:

Abstract:

Modern approaches to biomedical research and diagnostics targeted towards precision medicine are generating ‘big data’ across a range of high-throughput experimental and analytical platforms. Integrative analysis of this rich clinical, pathological, molecular and imaging data represents one of the greatest bottlenecks in biomarker discovery research in cancer and other diseases. Following on from the publication of our successful framework for multimodal data amalgamation and integrative analysis, Pathology Integromics in Cancer (PICan), this article will explore the essential elements of assembling an integromics framework from a more detailed perspective. PICan, built around a relational database storing curated multimodal data, is the research tool sitting at the heart of our interdisciplinary efforts to streamline biomarker discovery and validation. While recognizing that every institution has a unique set of priorities and challenges, we will use our experiences with PICan as a case study and starting point, rationalizing the design choices we made within the context of our local infrastructure and specific needs, but also highlighting alternative approaches that may better suit other programmes of research and discovery. Along the way, we stress that integromics is not just a set of tools, but rather a cohesive paradigm for how modern bioinformatics can be enhanced. Successful implementation of an integromics framework is a collaborative team effort that is built with an eye to the future and greatly accelerates the processes of biomarker discovery, validation and translation into clinical practice.
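As a heavily simplified, hypothetical illustration of the relational multimodal-linking idea (the table and column names are invented, not PICan's), consider:

```python
import sqlite3

# One patient row linked to records from several modalities.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patient   (patient_id INTEGER PRIMARY KEY, diagnosis TEXT);
    CREATE TABLE molecular (patient_id INTEGER REFERENCES patient,
                            gene TEXT, mutation TEXT);
    CREATE TABLE imaging   (patient_id INTEGER REFERENCES patient,
                            slide_path TEXT, biomarker_score REAL);
""")
conn.execute("INSERT INTO patient VALUES (1, 'colorectal adenocarcinoma')")
conn.execute("INSERT INTO molecular VALUES (1, 'KRAS', 'G12D')")
conn.execute("INSERT INTO imaging VALUES (1, '/slides/1.svs', 0.82)")

# Integrative query: join modalities per patient for downstream analysis.
for row in conn.execute("""
        SELECT p.diagnosis, m.gene, m.mutation, i.biomarker_score
        FROM patient p JOIN molecular m USING (patient_id)
                       JOIN imaging   i USING (patient_id)"""):
    print(row)
```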

Relevance:

10.00%

Publisher:

Abstract:

Digital image analysis is at a crossroads. While the technology has made great strides over the past few decades, there is an urgent need for image analysis to inform the next wave of large-scale tissue biomarker discovery studies in cancer. Drawing parallels from the growth of next-generation sequencing, this presentation will consider the case for a common language or standard format for storing and communicating digital image analysis data. In this context, image analysis data comprises more than simply an image with markups and attached key-value pair metrics. The desire to objectively benchmark competing platforms, or a push for data to be deposited in public repositories much as genomics data are, may drive the need for a standard that also encompasses granular, cell-by-cell data.
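No such standard yet exists, which is the abstract's point; the snippet below is a purely hypothetical sketch of what granular, cell-by-cell records alongside image-level metadata might look like:

```python
import json

# Purely hypothetical format: field names are illustrative, not a standard.
analysis_result = {
    "image": "slide_042_region_3.tiff",
    "platform": "vendor-X v2.1",
    "summary": {"cell_count": 2, "positive_fraction": 0.5},
    "cells": [
        {"id": 1, "centroid_um": [102.4, 88.1], "class": "tumour",
         "markers": {"Ki67": 0.91}},
        {"id": 2, "centroid_um": [110.0, 92.7], "class": "stroma",
         "markers": {"Ki67": 0.08}},
    ],
}
print(json.dumps(analysis_result, indent=2))
```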

Relevance:

10.00%

Publisher:

Abstract:

Power capping is a fundamental method for reducing the energy consumption of a wide range of modern computing environments, ranging from mobile embedded systems to datacentres. Unfortunately, maximising performance and system efficiency under static power caps remains challenging, while maximising performance under dynamic power caps has been largely unexplored. We present an adaptive power capping method that reduces the power consumption and maximises the performance of heterogeneous SoCs for mobile and server platforms. Our technique combines power capping with coordinated DVFS, data partitioning and core allocation on a heterogeneous SoC with ARM processors and FPGA resources. We design our framework as a run-time system based on OpenMP and OpenCL to utilise the heterogeneous resources. We evaluate it through five data-parallel benchmarks on a Xilinx SoC that allows full voltage and frequency control. Our experiments show a significant performance boost of 30% under dynamic power caps with concurrent execution on ARM and FPGA, compared to a naive separate approach.
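A toy sketch of the feedback idea behind power capping with DVFS (not the authors' OpenMP/OpenCL runtime; the frequency ladder and power model are invented): step the frequency level down when measured power exceeds the current cap, and back up when there is headroom:

```python
# Hypothetical frequency ladder and power model, for illustration only.
FREQS_MHZ = [400, 600, 800, 1000, 1200]

def measured_power_w(freq_mhz, load):
    # Stand-in for a real power sensor: roughly cubic in frequency.
    return load * (freq_mhz / 1000) ** 3 * 4.0

def adjust_frequency(level, power_cap_w, load):
    """One step of a capping loop: drop a DVFS level if over the cap,
    raise one if the next level up would still fit under it."""
    if measured_power_w(FREQS_MHZ[level], load) > power_cap_w and level > 0:
        return level - 1
    if (level + 1 < len(FREQS_MHZ)
            and measured_power_w(FREQS_MHZ[level + 1], load) <= power_cap_w):
        return level + 1
    return level

level = len(FREQS_MHZ) - 1
for cap in [6.0, 3.0, 1.5, 6.0]:         # a dynamic power cap over time
    level = adjust_frequency(level, cap, load=0.9)
    print(f"cap={cap:4.1f} W -> {FREQS_MHZ[level]} MHz")
```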

Relevance:

10.00%

Publisher:

Abstract:

This work presents a theoretical reference framework for the evaluation of online teaching, intended to situate the activities defined and carried out by teachers with respect to the dimensions of Learning, Interaction and Technology. The theoretical framework underpinning this study was developed taking into account national and European guidelines and orientations regarding Higher Education (HE), especially in relation to the quality of teaching and the integration of information and communication technologies into the curriculum. In this way, one of the objectives of this research was pursued, namely to contribute to the development of a conceptual framework for thinking about the evaluation of eLearning in HE and, in particular, of online teaching activities. Based on the theoretical assumptions defined, two evaluation instruments were developed to capture the perceptions of teachers and students regarding the teaching activities carried out online. In order to define a target audience for testing these instruments, an empirical study was conducted in three phases. Phase I was eminently exploratory, since all Portuguese Higher Education Institutions (HEIs) were surveyed in order to characterise the national panorama regarding the use of eLearning platforms by Portuguese HEIs, another objective of this work. Phase II was the first application of the evaluation instrument, addressed only to teachers, since the universe of respondents would have been difficult to manage had students also been included. In Phase III a smaller universe was selected (the four third-cycle programmes taught in bLearning in Portugal in the 2009/10 academic year), which made it possible to evaluate these programmes from the perspective of the teachers and also of the students (second evaluation instrument). Having analysed the results obtained in these three phases, the work concludes with the conviction that this is a model for the evaluation of online teaching that can be applied to HE, because it makes it possible not only to characterise the various online teaching scenarios through the established evaluation dimensions, but also to gain a fairly clear perception of how national and European guidelines for HE are being implemented. Finally, proposals are made for future research, in particular regarding the online availability of the evaluation instruments created and the development of action-research studies that may lead to adjustments of the instruments and identify the effects of their use on the improvement of online teaching practices.