956 results for platforms


Relevance: 10.00%

Publisher:

Abstract:

We consider an economy in which agents are embedded in a network of potential value-generating relationships. Agents are assumed to be able to participate in three types of economic interactions: autarkic self-provision; bilateral interaction; and multilateral collaboration through endogenously provided platforms.
We introduce two stability concepts and provide necessary and sufficient conditions on the network structure that guarantee existence, for the cases of no externalities, link-based externalities, and crowding externalities. We show that institutional arrangements based on socioeconomic roles and leadership guarantee stability. In particular, the stability of more complex economic outcomes requires stricter and more complex institutional rules to govern economic interactions. We investigate strict social hierarchies, tiered leadership structures and global marketplaces.

Relevance: 10.00%

Publisher:

Abstract:

The increasing adoption of cloud computing, social networking, mobile and big data technologies provides challenges and opportunities for both research and practice. Researchers face a deluge of data generated by social network platforms, which is further exacerbated by the commingling of social network platforms and the emerging Internet of Everything. While the topicality of big data and social media increases, the literature lacks conceptual tools to help researchers approach, structure and codify knowledge from social media big data in diverse subject matter domains, many of which are nontechnical disciplines. Researchers do not have a general-purpose scaffold to make sense of the data and the complex web of relationships between entities, social networks, social platforms and other third-party databases, systems and objects. This is further complicated when spatio-temporal data is introduced. Based on practical experience of working with social media datasets and on the existing literature, we propose a general research framework for social media research using big data. Such a framework assists researchers in placing their contributions in an overall context, focusing their research efforts and building the body of knowledge in a given discipline area using social media data in a consistent and coherent manner.
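As a concrete illustration of the kind of scaffold the framework calls for, the sketch below models the web of relationships between users, posts, platforms and third-party records as a property graph using networkx. The node and edge types are our own illustrative assumptions, not a taxonomy taken from the framework itself.

```python
import networkx as nx

# Illustrative entity graph: users, posts, platforms and third-party
# records, with spatio-temporal attributes attached to nodes.
G = nx.MultiDiGraph()
G.add_node("user:alice", kind="user")
G.add_node("post:42", kind="post", lat=53.34, lon=-6.26,
           ts="2015-06-01T12:00:00Z")
G.add_node("platform:twitter", kind="platform")
G.add_node("db:census", kind="third_party")

G.add_edge("user:alice", "post:42", rel="authored")
G.add_edge("post:42", "platform:twitter", rel="published_on")
G.add_edge("user:alice", "db:census", rel="linked_record")

# Walk outwards from one entity to inspect its web of relationships.
for _, target, data in G.out_edges("user:alice", data=True):
    print(target, data["rel"])
```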

Relevance: 10.00%

Publisher:

Abstract:

The endosomal system provides a route whereby nutrients, viruses, and receptors are internalized. During the course of endocytosis, activated receptors can accumulate within endosomal structures, and certain signal-transducing molecules can be recruited to endosomal membranes. In the context of signaling and cancer, endosomes provide platforms within the cell from which signals can be potentiated or attenuated. Regulation of the duration of receptor signaling is a pivotal means of refining growth responses in cells. In cancers, this is often considered in terms of mutations that affect receptor tyrosine kinases and maintain them in hyperactivated states of dimerization and/or phosphorylation. However, disruption of the regulatory control exerted by the assembly of protein complexes within the endosomal network can also contribute to disease, including oncogenesis, which is characterized in part by dysregulated growth, enhanced cell survival, and changes in the expression of markers of differentiation. In this chapter, we discuss the role of proteins that regulate endocytosis as tumor suppressors or oncogenes, and how changing the fate of internalized receptors and the concomitant endosomal signaling can contribute to cancer.

Relevance: 10.00%

Publisher:

Abstract:

Chromatin immunoprecipitation (ChIP) allows enrichment of genomic regions that are associated with specific transcription factors, histone modifications, and indeed any other epitopes present on chromatin. The original ChIP methods used site-specific PCR and Southern blotting to confirm, on a candidate basis, which regions of the genome were enriched. The combination of ChIP with genomic tiling arrays (ChIP-chip) allowed a more unbiased approach to mapping ChIP-enriched sites. However, limitations of microarray probe design and probe number have a detrimental impact on the coverage, resolution, sensitivity, and cost of whole-genome tiling microarray sets for higher eukaryotes with large genomes. The combination of ChIP with high-throughput sequencing technology has allowed more comprehensive surveys of genome occupancy, greater resolution, and lower cost for whole-genome coverage. Herein, we provide a comparison of high-throughput sequencing platforms and a survey of ChIP-seq analysis tools, discuss experimental design, and describe a detailed ChIP-seq method.
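For readers unfamiliar with the workflow, a typical ChIP-seq analysis step after alignment is peak calling against an input control. The sketch below drives MACS2, one widely used peak caller, from Python; the file names are hypothetical and the tool choice is ours, not necessarily the one used in the chapter's method.

```python
import subprocess

# Hypothetical inputs: aligned ChIP reads and an input-control library.
chip_bam = "chip.bam"
control_bam = "input.bam"

# Call ChIP-enriched regions with MACS2; -g hs sets the effective
# genome size for human, and -n names the output files.
subprocess.run(
    ["macs2", "callpeak",
     "-t", chip_bam,
     "-c", control_bam,
     "-f", "BAM", "-g", "hs",
     "-n", "sample", "--outdir", "peaks"],
    check=True,
)
```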

Relevance: 10.00%

Publisher:

Abstract:

BACKGROUND: Currently, two main technologies are used for screening DNA copy number: BAC (Bacterial Artificial Chromosome) arrays and the recently developed oligonucleotide-based CGH (Comparative Genomic Hybridization) arrays, which are capable of detecting small genomic regions with amplification or deletion. The correlation and the discriminative power of these platforms have never been compared statistically on a significant set of human patient samples.

RESULTS: In this paper, we present an exhaustive comparison between the two CGH platforms, undertaken at two independent sites using the same batch of DNA from 19 advanced prostate cancers. The comparison was performed directly on the raw data, and a significant correlation was found between the two platforms. The correlation improved greatly when the data were averaged over large chromosomal regions using a segmentation algorithm. In addition, this analysis enabled the development of a statistical model to discriminate BAC outliers that might indicate microevents. These microevents were validated by the oligo platform results.
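The effect of region-level averaging on the inter-platform correlation can be sketched with simulated data, as below; fixed-size probe windows stand in for the paper's segmentation algorithm, and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical probe-level log2 ratios for one sample on each platform.
# The true copy-number signal is constant within each 50-probe region;
# each platform adds independent probe-level noise.
segments = rng.normal(0.0, 0.4, 20)
signal = np.repeat(segments, 50)             # 1000 probes in common order
bac = signal + rng.normal(0.0, 0.3, 1000)
oligo = signal + rng.normal(0.0, 0.3, 1000)

raw_r = np.corrcoef(bac, oligo)[0, 1]

# Averaging over larger regions (50-probe windows here, in place of a
# real segmentation algorithm) suppresses probe noise and raises r.
seg_bac = bac.reshape(-1, 50).mean(axis=1)
seg_oligo = oligo.reshape(-1, 50).mean(axis=1)
seg_r = np.corrcoef(seg_bac, seg_oligo)[0, 1]

print(f"raw r = {raw_r:.2f}, segment-averaged r = {seg_r:.2f}")
```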

CONCLUSION: This article presents a genome-wide statistical validation of the oligo array platform on a large set of patient samples and statistically demonstrates its superiority over the BAC platform for the identification of chromosomal events. Taking advantage of a large set of human samples analysed with both technologies, a statistical model was developed to show that the BAC platform can also detect microevents.

Relevance: 10.00%

Publisher:

Abstract:

WHIRLBOB, also known as STRIBOBr2, is an AEAD (Authenticated Encryption with Associated Data) algorithm derived from STRIBOBr1 and the Whirlpool hash algorithm. WHIRLBOB/STRIBOBr2 is a second-round candidate in the CAESAR competition. As with STRIBOBr1, the reduced-size Sponge design has a strong provable security link with a standardized hash algorithm. The new design utilizes only the LPS or ρ component of Whirlpool in a flexibly domain-separated BLNK Sponge mode. The number of rounds is increased from 10 to 12 as a countermeasure against rebound distinguishing attacks. The 8×8-bit S-box used by Whirlpool and WHIRLBOB is constructed from 4×4-bit "MiniBoxes". We report on fast constant-time Intel SSSE3 and ARM NEON SIMD WHIRLBOB implementations that keep the full miniboxes in registers and access them via SIMD shuffles; this is an efficient countermeasure against AES-style cache-timing side-channel attacks. Another main advantage of WHIRLBOB over STRIBOBr1 (and most other AEADs) is its greatly reduced implementation footprint on lightweight platforms: on many lower-end microcontrollers, the total software footprint of π+BLNK = WHIRLBOB AEAD is less than half a kilobyte. We also report an FPGA implementation that requires 4,946 logic units for a single round of WHIRLBOB, which compares favorably to the 7,972 required for Keccak/Keyak on the same target platform. The relatively small S-box gate count also enables efficient 64-bit bitsliced straight-line implementations. We finally present some discussion and analysis of the relationships between WHIRLBOB, Whirlpool, the Russian GOST Streebog hash, and the recent draft Russian encryption standard Kuznyechik.
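To make the minibox construction concrete, the sketch below composes an 8×8-bit S-box from three layers of 4×4-bit lookup tables. The wiring follows the general Whirlpool-style pattern, but the tables and layout here are placeholders that we have not verified against the Whirlpool specification, so treat this as an illustration of the technique only.

```python
# Placeholder 4x4-bit miniboxes (16-entry permutations); NOT verified
# against the actual Whirlpool/WHIRLBOB tables.
E = [0x1, 0xB, 0x9, 0xC, 0xD, 0x6, 0xF, 0x3,
     0xE, 0x8, 0x7, 0x4, 0xA, 0x2, 0x5, 0x0]
R = [0x7, 0xC, 0xB, 0xD, 0xE, 0x4, 0x9, 0xF,
     0x6, 0x3, 0x8, 0xA, 0x2, 0x5, 0x1, 0x0]
E_inv = [E.index(i) for i in range(16)]

def sbox(x: int) -> int:
    """Send one byte through a three-layer network of 4-bit miniboxes."""
    hi, lo = x >> 4, x & 0xF
    hi, lo = E[hi], E_inv[lo]            # layer 1: parallel miniboxes
    r = R[hi ^ lo]                       # layer 2: mixing minibox
    hi, lo = E[hi ^ r], E_inv[lo ^ r]    # layer 3: parallel miniboxes
    return (hi << 4) | lo

# Expand to a full 256-entry table; tables as small as the miniboxes fit
# in SIMD registers, which is what enables constant-time shuffle lookups.
SBOX = [sbox(x) for x in range(256)]
assert sorted(SBOX) == list(range(256))  # the composition is a bijection
```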

Relevance: 10.00%

Publisher:

Abstract:

This document describes the cryptographic hash function BLAKE2 and makes the algorithm specification and C source code conveniently available to the Internet community. BLAKE2 comes in two main flavors: BLAKE2b is optimized for 64-bit platforms and BLAKE2s for smaller architectures. BLAKE2 can be directly keyed, making it functionally equivalent to a Message Authentication Code (MAC).
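For reference, this keyed mode is directly usable from Python's standard library, whose hashlib module implements both flavors. A minimal sketch of BLAKE2b as a MAC, with a made-up key and message:

```python
import hashlib
import hmac  # used here only for constant-time digest comparison

key = b"shared-secret-key"            # BLAKE2b accepts keys up to 64 bytes
msg = b"message to authenticate"

# Keyed BLAKE2b is functionally a MAC on its own; no HMAC wrapper needed.
tag = hashlib.blake2b(msg, key=key, digest_size=32).hexdigest()

def verify(msg: bytes, tag: str) -> bool:
    expected = hashlib.blake2b(msg, key=key, digest_size=32).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(msg, tag))  # True
```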

Relevance: 10.00%

Publisher:

Abstract:

The growing accessibility of genomic resources using next-generation sequencing (NGS) technologies has revolutionized the application of molecular genetic tools to ecology and evolutionary studies in non-model organisms. Here we present the case study of the European hake (Merluccius merluccius), one of the most important demersal resources of European fisheries. Two sequencing platforms, the Roche 454 FLX (454) and the Illumina Genome Analyzer (GAII), were used for Single Nucleotide Polymorphism (SNP) discovery in the hake muscle transcriptome. De novo transcriptome assembly into unique contigs, annotation, and in silico SNP detection were carried out in parallel for the 454 and GAII sequence data. High-throughput genotyping using the Illumina GoldenGate assay was performed to validate 1,536 putative SNPs. The validation results were analysed to compare the performance of the 454 and GAII methods and to evaluate the role of several variables (e.g. sequencing depth, intron-exon structure, sequence quality and annotation). Despite well-known differences in sequence length and throughput, the two approaches showed similar assay conversion rates (approximately 43%) and percentages of polymorphic loci (67.5% and 63.3% for GAII and 454, respectively). Both NGS platforms therefore proved suitable for large-scale identification of SNPs in transcribed regions of non-model species, although the lack of a reference genome profoundly affects the genotyping success rate. The overall efficiency, however, can be improved by applying strict quality and filtering criteria for SNP selection (sequence quality, intron-exon structure, target region score).
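A minimal sketch of the kind of post-hoc filtering the conclusion refers to is shown below; the field names and thresholds are our own assumptions, not the study's actual criteria.

```python
# Hypothetical putative SNPs with the three quality signals named above:
# sequence quality, proximity to a putative intron-exon boundary, and a
# target region score.
putative_snps = [
    {"id": "snp001", "qual": 38, "near_exon_boundary": False, "score": 0.91},
    {"id": "snp002", "qual": 17, "near_exon_boundary": False, "score": 0.88},
    {"id": "snp003", "qual": 35, "near_exon_boundary": True,  "score": 0.95},
]

def passes(snp, min_qual=30, min_score=0.8):
    # Without a reference genome, flanking sequence that spans an
    # unrecognized intron-exon boundary is a common cause of assay failure.
    return (snp["qual"] >= min_qual
            and not snp["near_exon_boundary"]
            and snp["score"] >= min_score)

selected = [s for s in putative_snps if passes(s)]
print([s["id"] for s in selected])  # ['snp001']
```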

Relevance: 10.00%

Publisher:

Abstract:

In the reinsurance market, the risks that natural catastrophes pose to portfolios of properties must be quantified so that they can be priced and insurance offered. The analysis of such risks at a portfolio level requires a simulation of up to 800,000 trials with an average of 1,000 catastrophic events per trial. This is sufficient to capture the risk of a global multi-peril reinsurance portfolio covering a range of perils, including earthquake, hurricane, tornado, hail, severe thunderstorm, windstorm, storm surge, riverine flooding, and wildfire. Such simulations are both computation and data intensive, making the application of high-performance computing techniques desirable.

In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainty into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units and Intel Xeon Phi many-core accelerators.
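The structure of the trial loop can be conveyed with a short vectorised sketch; the trial count is scaled down, and the event probability and loss distribution are placeholder assumptions, not the vendor catastrophe model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials, n_events = 10_000, 1_000   # scaled down from 800,000 trials

# Primary uncertainty: whether each event occurs in a simulated year.
occurs = rng.random((n_trials, n_events)) < 0.01

# Secondary uncertainty: given occurrence, the loss level is itself
# random (a lognormal here, standing in for the real loss model).
loss = rng.lognormal(mean=12.0, sigma=1.5, size=(n_trials, n_events))

annual_loss = (occurs * loss).sum(axis=1)

# Probable maximum loss at a chosen return period: the 1-in-250-year
# level is the 99.6th percentile of the annual loss distribution.
pml_250 = np.percentile(annual_loss, 99.6)
print(f"PML (250-year): {pml_250:,.0f}")
```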

Relevance: 10.00%

Publisher:

Abstract:

The risks associated with zoonotic infections transmitted by companion animals are a serious public health concern: controlling the incidence of zoonoses in domestic dogs, both owned and stray, is hence important to protect human health. Integrated dog population management (DPM) programs, based on the availability of information systems providing reliable data on the structure and composition of the existing dog population in a given area, are fundamental for making realistic plans for any disease surveillance and action system. Traceability systems, based on the compulsory electronic identification of dogs and their registration in a computerised database, are one of the most effective ways to ensure the usefulness of DPM programs. Even though this approach provides many advantages, several areas for improvement have emerged in countries where it has been applied. In Italy, every region hosts its own dog register, but these registers are not compatible with one another. This paper shows the advantages of a web-based application for improving the data management of regional dog registers. The approach used to build this system was inspired by farm animal traceability schemes, and it relies on a network of services that allows multi-channel access by different devices and data exchange via the web with other existing applications, without changing the pre-existing platforms. Today the system manages a database of over 300,000 dogs registered in three different Italian regions. By integrating multiple web services, this approach could be the solution to gathering data at national and international levels at reasonable cost, creating a traceability system on a large scale and across borders that can be used for disease surveillance and the development of population management plans.
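A minimal sketch of one such service, assuming a Flask application with a hypothetical lookup route and record fields:

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for a regional register; in the real system each region keeps
# its own database and exposes it to the network as a web service.
REGISTER = {
    "380260000000001": {"region": "Lazio", "breed": "mixed",
                        "owner_id": "A123"},
}

@app.route("/dogs/<microchip>")
def get_dog(microchip):
    """Resolve a dog's record from its electronic identification number."""
    record = REGISTER.get(microchip)
    if record is None:
        abort(404)
    return jsonify(record)

if __name__ == "__main__":
    app.run()
```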

Relevance: 10.00%

Publisher:

Abstract:

The recent explosion of genetic and clinical data generated from tumor genome analysis presents an unparalleled opportunity to enhance our understanding of cancer, but this opportunity is compromised by the reluctance of many in the scientific community to share datasets and by the lack of interoperability between different data platforms. The Global Alliance for Genomics and Health is addressing these barriers and challenges through a cooperative framework that encourages "team science" and responsible data sharing, complemented by the development of a series of application programming interfaces that link different data platforms, thus breaking down traditional silos and liberating the data to enable new discoveries and ultimately benefit patients.
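One of the Alliance's interfaces, the htsget API for streaming read data, gives a flavor of how such links work. The sketch below issues an htsget-style request for a genomic slice; the server URL and read-set ID are hypothetical.

```python
import requests

base = "https://htsget.example.org"   # hypothetical htsget server

# Ask for reads overlapping a region; the ticket response decouples the
# query API from wherever the underlying data physically lives.
resp = requests.get(
    f"{base}/reads/sample123",
    params={"referenceName": "chr7",
            "start": 55_000_000, "end": 55_200_000},
)
resp.raise_for_status()

ticket = resp.json()["htsget"]
for block in ticket["urls"]:          # URLs for the data blocks to fetch
    print(block["url"])
```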

Relevance: 10.00%

Publisher:

Abstract:

Understanding how US imperial strategy is sustained by tourism and militarism requires an account of how American soldiers learn to understand themselves in relation to a variety of marginalized others. This paper explores how the US Army's 'Ready and Resilient' (R2) campaign constructs soldier/other relations by mobilizing off-duty time through the 'Better Opportunities for Single Soldiers' (BOSS) program. BOSS's first two platforms, 'Well-Being' and 'Community Service', feed into the R2 agenda by producing highly skilled leaders (who govern a disengaged rank and file) and benevolent humanitarians (who provide charity for abject civilians). When these dispositions are transposed into BOSS's third platform, 'Recreation and Leisure', soldiers turn away from the goals of leadership and humanitarianism to reveal the privileged narcissism underscoring the R2 agenda. This self-focus is intensified by familiar power relations in the tourism industry as soldiers pursue self-improvement by commodifying, distancing and effacing local tourist workers. Using the BOSS program as a case study, this paper critically interrogates how the US Army is assimilating off-duty practices of tourism, leisure and recreation into its wider program of resilience training.

Relevance: 10.00%

Publisher:

Abstract:

Introduction: Transdermal drug delivery is the movement of drugs across the skin for absorption into the systemic circulation. Transfer of the drug can occur via passive or active means; passive transdermal products do not disrupt the stratum corneum to facilitate delivery, whereas active technologies do. Due to the very specific physicochemical properties necessary for successful passive transdermal drug delivery, this sector of the pharmaceutical industry is relatively small. There are many well-documented benefits of this delivery route, however, and as a result there is great interest in increasing the number of therapeutic substances that can be delivered transdermally. Areas Covered: This review discusses the various transdermal products that are currently, or have previously been, marketed, and the paths that led to their success or lack thereof. Both passive and active transdermal technologies are considered, with the advantages and limitations of each highlighted. In addition to marketed products, technologies at the investigative stage in various pharmaceutical companies are reviewed. Expert Opinion: Passive transdermal drug delivery has made limited progress in recent years; however, with the ongoing intense research into active technologies, there is great potential for growth within the transdermal delivery market. A number of active technologies have already been translated into marketed products, with other platforms, including microneedles, progressing rapidly towards commercialisation.

Relevance: 10.00%

Publisher:

Abstract:

Exascale computation is the next target of high-performance computing. In the push to create exascale computing platforms, simply increasing the number of hardware devices is not an acceptable option, given the limitations of power consumption, heat dissipation, and programming models designed for current hardware platforms. Instead, new hardware technologies, coupled with improved programming abstractions and more autonomous runtime systems, are required to achieve this goal. This position paper presents the design of a new runtime for a new heterogeneous hardware platform being developed to explore energy-efficient, high-performance computing. By combining a number of different technologies, this framework will both simplify the programming of current and future HPC applications and automate the scheduling of data and computation across this new hardware platform. In particular, this work explores the use of FPGAs to achieve both the power and performance goals of exascale, as well as the use of the runtime to automatically effect dynamic configuration and reconfiguration of these platforms.
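As a toy illustration of the kind of placement decision such a runtime automates, the sketch below picks a device using a simple energy (time × power) cost model; the devices and figures are invented for illustration, not measurements from the platform described here.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    time_s: float    # predicted execution time for the task
    power_w: float   # predicted power draw while executing

def place(task_name: str, devices: list[Device]) -> Device:
    """Choose the device minimising energy (time x power) for a task."""
    best = min(devices, key=lambda d: d.time_s * d.power_w)
    print(f"{task_name} -> {best.name}")
    return best

place("stencil_kernel", [
    Device("cpu",  time_s=4.0, power_w=95.0),   # 380 J
    Device("gpu",  time_s=0.8, power_w=250.0),  # 200 J
    Device("fpga", time_s=1.5, power_w=25.0),   # 37.5 J -> chosen
])
```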

Relevance: 10.00%

Publisher:

Abstract:

This paper presents an approach to COLREGs-compliant ship navigation. A system architecture is proposed, which will be implemented and tested on two platforms: networked bridge simulators and sea trials using an autonomous unmanned surface vessel. Particular attention is paid to the collision avoidance software and its risk mitigation.
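A core computation in any such collision avoidance module is the closest point of approach (CPA) between own ship and a target; a minimal planar sketch, with invented positions and velocities:

```python
import math

def cpa(own_pos, own_vel, tgt_pos, tgt_vel):
    """Time to (TCPA) and distance at (DCPA) the closest point of approach.

    Positions in metres, velocities in m/s, flat-earth approximation.
    """
    rx, ry = tgt_pos[0] - own_pos[0], tgt_pos[1] - own_pos[1]
    vx, vy = tgt_vel[0] - own_vel[0], tgt_vel[1] - own_vel[1]
    v2 = vx * vx + vy * vy
    tcpa = 0.0 if v2 == 0 else max(0.0, -(rx * vx + ry * vy) / v2)
    dcpa = math.hypot(rx + vx * tcpa, ry + vy * tcpa)
    return tcpa, dcpa

# Target 2 km ahead and 100 m off track, closing at a combined 10 m/s.
tcpa, dcpa = cpa((0, 0), (5, 0), (2000, 100), (-5, 0))
print(f"TCPA = {tcpa:.0f} s, DCPA = {dcpa:.0f} m")  # 200 s, 100 m
```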