995 results for Parallel key-insulation
Abstract:
A parallel authentication and public-key encryption scheme is introduced and exemplified on joint encryption and signing, which compares favorably with sequential Encrypt-then-Sign (EtS) or Sign-then-Encrypt (StE) schemes in terms of both efficiency and security. A security model for signcryption, and thus joint encryption and signing, has recently been defined which considers possible attacks and security goals. Such a scheme is considered secure if the encryption part guarantees indistinguishability and the signature part prevents existential forgeries, for outsider as well as insider adversaries. We propose two parallel signcryption schemes, which are efficient alternatives to Commit-then-Encrypt-and-Sign (CtE&S). Both are provably secure in the random oracle model. The first, called generic parallel encrypt and sign, is secure if the encryption scheme is semantically secure against chosen-ciphertext attacks and the signature scheme prevents existential forgeries against random-message attacks. The second, called optimal parallel encrypt and sign, applies random oracles, similarly to the OAEP technique, to achieve security using encryption and signature components with very weak security requirements: encryption need only be one-way under chosen-plaintext attacks, while the signature need only resist universal forgeries under random-plaintext attacks, which is the case for both plain-RSA encryption and plain-RSA signatures under the usual RSA assumption. Both proposals are generic in the sense that any suitable encryption and signature schemes (i.e. ones that achieve the required security) can be used. Furthermore, they allow parallel encryption and signing, as well as parallel decryption and verification. Properties of parallel encrypt-and-sign schemes are considered and a new security standard for parallel signcryption is proposed.
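The "commit, then encrypt and sign in parallel" construction described above can be sketched as follows. The hash-based commitment and the toy XOR "encryption" and hash "signature" below are placeholder primitives of my own (not secure, and not from the paper); in the actual construction, any IND-CCA-secure encryption scheme and an appropriately secure signature scheme would be plugged in.

```python
import hashlib
import os
from concurrent.futures import ThreadPoolExecutor

# Placeholder primitives -- NOT secure, purely illustrative.
def toy_encrypt(pk: bytes, data: bytes) -> bytes:
    stream = hashlib.sha256(pk).digest() * (len(data) // 32 + 1)
    return bytes(b ^ k for b, k in zip(data, stream))  # XOR "cipher"

def toy_sign(sk: bytes, data: bytes) -> bytes:
    return hashlib.sha256(sk + data).digest()          # hash "signature"

def commit(message: bytes):
    # Hash-based commitment: c = H(m || r); decommitment d = m || r.
    r = os.urandom(16)
    return hashlib.sha256(message + r).digest(), message + r

def parallel_encrypt_and_sign(pk: bytes, sk: bytes, message: bytes):
    c, d = commit(message)                 # cheap sequential commit step
    with ThreadPoolExecutor() as pool:     # the two heavy operations run in parallel:
        fut_enc = pool.submit(toy_encrypt, pk, d)  # encrypt the decommitment
        fut_sig = pool.submit(toy_sign, sk, c)     # sign the commitment
        return c, fut_enc.result(), fut_sig.result()
```

The receiver can likewise work in parallel: the signature is verified against c while the ciphertext is decrypted to recover d, and finally c is checked to equal H(d).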
Abstract:
Specialization to nectarivory is associated with radiations within different bird groups, including parrots. One such group, the Australasian lories, has been shown to be unexpectedly species-rich. Their shift to nectarivory may have created an ecological opportunity promoting species proliferation. Several morphological specializations of the feeding tract to nectarivory have been described for parrots, but they have never been assessed in a quantitative framework that accounts for phylogenetic non-independence. Using a phylogenetic comparative approach with broad taxon sampling and 15 continuous characters of the digestive tract, we demonstrate that nectarivorous parrots differ in several traits from the remaining parrots. These trait changes indicate phenotype–environment correlations and parallel evolution, and may reflect adaptations for feeding effectively on nectar. Moreover, the diet shift was associated with significant trait shifts at the base of the lory radiation, as shown by an alternative statistical approach. Their diet shift might be considered an evolutionary key innovation that promoted significant non-adaptive lineage diversification through allopatric partitioning of the same new niche. The lack of increased rates of cladogenesis in other nectarivorous parrots indicates that evolutionary innovations need not be associated one-to-one with diversification events.
Massively parallel sequencing and analysis of expressed sequence tags in a successful invasive plant
Abstract:
Background: Invasive species pose a significant threat to global economies, agriculture and biodiversity. Despite progress towards understanding the ecological factors associated with plant invasions, limited genomic resources have made it difficult to elucidate the evolutionary and genetic factors responsible for invasiveness. This study presents the first expressed sequence tag (EST) collection for Senecio madagascariensis, a globally invasive plant species. Methods: We used pyrosequencing of one normalized and two subtractive libraries, derived from one native and one invasive population, to generate an EST collection. ESTs were assembled into contigs, annotated by BLAST comparison with the NCBI non-redundant protein database and assigned gene ontology (GO) terms from the Plant GO Slim ontologies. Key Results: Assembly of the 221,746 sequence reads resulted in 12,442 contigs. Over 50% (6,183) of these contigs showed significant homology to proteins in the NCBI database, representing approximately 4,800 independent transcripts. The molecular transducer GO term was significantly over-represented in the native (South African) subtractive library compared with the invasive (Australian) library. Based on NCBI BLAST hits and literature searches, 40% of the molecular transducer genes identified in the South African subtractive library are likely to be involved in responses to biotic stimuli, such as fungal, bacterial and viral pathogens. Conclusions: This EST collection is the first representation of the S. madagascariensis transcriptome and provides an important resource for the discovery of candidate genes associated with plant invasiveness. The over-representation of molecular transducer genes associated with defence responses in the native subtractive library provides preliminary support for aspects of the enemy release and evolution of increased competitive ability hypotheses in this successful invader. This study highlights the contribution of next-generation sequencing to better understanding the molecular mechanisms underlying ecological hypotheses that are important in successful plant invasions.
Abstract:
Experimental and theoretical studies have shown the importance of stochastic processes in genetic regulatory networks and cellular processes. Cellular networks and genetic circuits often involve small numbers of key proteins, such as transcription factors and signaling proteins. In recent years stochastic models have been used successfully for studying noise in biological pathways, and stochastic modelling of biological systems has become a very important research field in computational biology. One of the challenging problems in this field is reducing the huge computing time of stochastic simulations. Based on a model of the mitogen-activated protein kinase cascade activated by epidermal growth factor, this work gives a parallel implementation using OpenMP with parallelism across the simulation. Special attention is paid to the independence of the random numbers generated in parallel, which is a key criterion for the success of stochastic simulations. Numerical results indicate that parallel computers can be used as an efficient tool for simulating the dynamics of large-scale genetic regulatory networks and cellular processes.
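The "parallelism across the simulation" strategy (independent replicate runs, each with its own random stream) can be sketched in Python. The birth-death model, rate constants and simple seed offsets below are illustrative stand-ins for the paper's OpenMP implementation of the EGF-activated MAPK cascade; production code would use properly decorrelated streams (e.g. counter-based generators or NumPy's SeedSequence).

```python
import random
from concurrent.futures import ThreadPoolExecutor

def ssa_birth_death(seed, x0=10, k_birth=1.0, k_death=0.1, t_end=50.0):
    """Gillespie SSA for a toy birth-death process (a stand-in for the
    MAPK-cascade model discussed in the text)."""
    rng = random.Random(seed)  # private RNG stream for this replicate
    t, x = 0.0, x0
    while t < t_end:
        a_birth, a_death = k_birth, k_death * x
        a_total = a_birth + a_death
        t += rng.expovariate(a_total)           # waiting time to next event
        x += 1 if rng.random() * a_total < a_birth else -1
    return x

def run_replicates(n, base_seed=12345):
    # One distinct seed per replicate -- stream independence is the key
    # requirement the abstract highlights for parallel stochastic simulation.
    seeds = [base_seed + i for i in range(n)]
    # Threads stand in for OpenMP threads here; CPU-bound Python code would
    # use processes instead, but the structure is the same.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(ssa_birth_death, seeds))
```

At steady state the mean copy number approaches k_birth/k_death = 10, which gives a quick sanity check on the pooled replicate results.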
Abstract:
This paper offers a definition of elite media, arguing that their content focus will sufficiently meet the social responsibility needs of democracy. Its assumptions come from the Finkelstein and Leveson Inquiries and the regulatory British Royal Charter (2013). These provide guidelines on how media outlets meet ‘social responsibility’ standards, e.g. the press has a ‘responsibility to be fair and accurate’ (Finkelstein); an ethical press will feel a responsibility to ‘hold power to account’ (Leveson); news media ‘will be held strictly accountable’ (RC). The paper invokes the British principle of media opting in to observe standards, and so serve democracy. It gives examples from existing media and considers the social responsibility of media more generally. Obvious cases of ‘quality’ media are public broadcasters, e.g. the BBC and Al Jazeera, and the ‘quality’ press, e.g. the NYT and the Süddeutsche Zeitung, but also community broadcasters, specialised magazines, news agencies, distinctive web logs, and others. Where providing commentary, these abjure gratuitous opinion, meeting a standard of reasoned, informational and fair coverage. Funding is almost a defining feature, with many such services supported by the state, private trusts, public institutions or volunteering by staff. Literature supporting the discussion of elite media includes their identity as primarily committed to a public good, e.g. the ‘Public Value Test’ (Moe and Donders, 2011), with reference also to recent literature on developing public service media. Within its limits the paper treats social media as participants among all media, including elite media, and as a parallel dimension of mass communication founded on interactivity. Elite media will fulfil the need for social responsibility, first by providing one space, a ‘plenary’, for debate; second, by building public recognition of elite media as trustworthy; and third, because elite media together form a large sector with the resources to sustain social cohesion and debate, notwithstanding pressure on funds and the impacts of digital transformation, which undermine employment in media more than in most industries.
Abstract:
Biological systems are typically complex and adaptive, involving large numbers of entities, or organisms, and many-layered interactions between these. System behaviour evolves over time, and typically benefits from previous experience by retaining memory of previous events. Given the dynamic nature of these phenomena, it is non-trivial to provide a comprehensive description of complex adaptive systems and, in particular, to define the importance and contribution of low-level unsupervised interactions to the overall evolution process. In this chapter, the authors focus on the application of the agent-based paradigm in the context of the immune response to HIV. Explicit implementation of lymph nodes and the associated lymph network, including lymphatic chain structure, is a key objective, and requires parallelisation of the model. Steps taken towards an optimal communication strategy are detailed.
Abstract:
To gain insight into the mechanisms by which the Myb transcription factor controls normal hematopoiesis and, particularly, how it contributes to leukemogenesis, we mapped the genome-wide occupancy of Myb by chromatin immunoprecipitation followed by massively parallel sequencing (ChIP-Seq) in ERMYB myeloid progenitor cells. By integrating the genome occupancy data with whole-genome expression profiling data, we identified a Myb-regulated transcriptional program. Gene signatures for leukemia stem cells, normal hematopoietic stem/progenitor cells and myeloid development were overrepresented in 2368 Myb-regulated genes. Of these, Myb bound directly near or within 793 genes. Myb directly activates some genes known to be critical for maintaining hematopoietic stem cells, such as Gfi1 and Cited2. Importantly, we also show that, despite usually being considered a transactivator, Myb also functions to repress approximately half of its direct targets, including several key regulators of myeloid differentiation, such as Sfpi1 (also known as Pu.1), Runx1, Junb and Cebpb. Furthermore, our results demonstrate that interaction with p300, an established coactivator for Myb, is unexpectedly required for Myb-mediated transcriptional repression. We propose that the repression of the above-mentioned key pro-differentiation factors may contribute essentially to Myb's ability to suppress differentiation and promote self-renewal, thus maintaining progenitor cells in an undifferentiated state and promoting leukemic transformation. © 2011 The Author(s).
Abstract:
We consider single-source, single-sink multi-hop relay networks with slow Rayleigh-fading links and single-antenna relay nodes operating under the half-duplex constraint. While two-hop relay networks have been studied in great detail in terms of the diversity-multiplexing tradeoff (DMT), few results are available for more general networks. In this two-part paper, we identify two families of networks that are multi-hop generalizations of the two-hop network: K-Parallel-Path (KPP) networks and layered networks. In the first part, we initially consider KPP networks, which can be viewed as the union of K node-disjoint parallel paths, each of length greater than 1. The results are then generalized to KPP(I) networks, which permit interference between paths, and to KPP(D) networks, which possess a direct link from source to sink. We characterize the optimal DMT of KPP(D) networks with K >= 4 and of KPP(I) networks with K >= 3. Along the way, we derive lower bounds for the DMT of triangular channel matrices, which are useful in the DMT computation of various protocols. As a special case, the DMT of the two-hop relay network without a direct link is obtained. Two key implications of the results in the two-part paper are that the half-duplex constraint does not necessarily entail a rate loss by a factor of two, as previously believed, and that simple amplify-and-forward (AF) protocols are often sufficient to attain the best possible DMT.
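As a reminder, the DMT referenced above is the standard Zheng–Tse notion (a textbook definition, not restated in the abstract): for a scheme operating at multiplexing gain r, i.e. rate R = r log SNR, the diversity gain is

```latex
d(r) \;=\; -\lim_{\mathsf{SNR}\to\infty} \frac{\log P_e(r \log \mathsf{SNR})}{\log \mathsf{SNR}},
```

where P_e is the error (or outage) probability; the paper characterizes the optimal d(r) achievable over the relay networks it defines.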
Abstract:
Computational docking of ligands to protein structures is a key step in structure-based drug design. Currently, the time required for each docking run is high, which limits the use of docking in a high-throughput manner and warrants parallelization of docking algorithms. AutoDock, a widely used tool, was chosen for parallelization. Near-linear increases in speed were observed with 96 processors: as an example, the time required for docking ligands to HIV protease was reduced from 81 min on a single IBM Power-5 processor (1.65 GHz) to about 1 min on an IBM cluster with 96 such processors. This implementation would make it feasible to perform virtual ligand screening using AutoDock.
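As a quick check on the "near-linear" claim, the reported timings imply roughly an 81x speedup on 96 processors, i.e. about 84% parallel efficiency (only indicative, since the serial and parallel runs used different machines):

```python
# Parallel speedup and efficiency from the reported AutoDock timings.
serial_min, parallel_min, n_procs = 81.0, 1.0, 96

speedup = serial_min / parallel_min   # 81x
efficiency = speedup / n_procs        # ~0.84, consistent with "near-linear"
```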
Abstract:
As companies become more efficient with respect to their internal processes, they begin to shift their focus beyond their corporate boundaries. Thus, recent years have witnessed increased interest by practitioners and researchers in interorganizational collaboration, which promises better firm performance through more effective supply chain management. It is no coincidence that this interest comes in parallel with recent advancements in information and communication technologies, which offer many new collaboration possibilities for companies. However, collaboration, or any other type of supply chain integration effort, relies heavily on information sharing. Hence, this study focuses on information sharing, in particular on the factors that determine it and on its value. The empirical evidence from Finnish and Swedish companies suggests that uncertainty (both demand and environmental) and dependency, in terms of switching costs and asset-specific investments, are significant determinants of information sharing. Results also indicate that information sharing improves company performance regarding resource usage, output, and flexibility. However, companies share information more intensely at the operational rather than the strategic level. The use of supply chain practices and technologies is substantial but varies across the two countries. This study sheds light on a common trend in supply chains today. Whereas the results confirm the value of information sharing, the contingent factors help to explain why the intensity of information shared across companies differs. In the future, competitive pressures and uncertainty are likely to intensify. Therefore, companies may want to continue with their integration efforts by focusing on the determinants discussed in this study. At the same time, however, the possibility of opportunistic behavior by the exchange partner cannot be disregarded.
Abstract:
The Pennekamp Coral Reef State Park was established in 1960 and the Key Largo National Marine Sanctuary in 1975. Field studies, funded by NOAA, were conducted in 1980–1981 to determine the state of the coral reefs and surrounding areas in relation to the changing environmental conditions and resource management that had occurred over the intervening years. Ten reef sites within the Sanctuary and seven shallow grass and hardbottom sites within the Park were chosen for qualitative and quantitative studies. At each site, three parallel transects not less than 400 m long were run perpendicular to the reef or shore, 300 m apart. Observations, data collection and sampling were done by two teams of divers. Approximately 75 percent of the bottom within the 18-m isobath was covered by marine grasses, predominantly turtle grass. The general health of the seagrasses appeared good, but a few areas showed signs of stress. The inner hardbottom of the Park was studied at the two entrances to Largo Sound. Though at the time of the study the North Channel hardbottom was subjected to only moderate boat traffic, marked changes had taken place over the past years, the most obvious of which was the loss of the extensive beds of Sargassum weed, once one of the most extensive beds of this alga in the Keys. Only at this site was the green alga Enteromorpha encountered. This alga, often considered a pollution indicator, may denote the effects of shore runoff. The hardbottom at South Channel and the surrounding grass beds showed signs of stress. This area bears the heaviest boat traffic within the Park waters, causing continuous turbidity from boat wakes with resulting siltation. The offshore hardbottom and rubble areas in the Sanctuary appeared to be in good health and showed no visible indications of deterioration. Damage by boat groundings and anchors was negligible in the areas surveyed. The outer reefs in general appear to be healthy. Corals have a surprising resiliency to detrimental factors and, when conditions again become favorable, recover quickly from even severe damage. It is, therefore, a cause for concern that Grecian Rocks, which sits somewhat inshore of the outer reef line, has yet to recover from a die-off in 1978. The slow recovery, if occurring, may be due to the lower quality of the inshore waters. The patch reefs, more adapted to inshore waters, do not show obvious signs of stress, at least those surveyed in this study. It is apparent that water quality has been changing in the Keys. Water clarity over much of the reef tract was observed to be much reduced from former years and undoubtedly plays an important part in the stresses seen today across the Sanctuary and Park. (PDF contains 119 pages)
Abstract:
The key issues in the engineering application of the dual-grating parallel matched interrogation method are expanding the measurable range, improving usability, and lowering cost by adopting a compact and simple setup based on existing conditions and by improving the precision of the data-processing scheme. A credible and effective data-processing scheme based on a novel divisional look-up table is proposed, building on the advantages of other schemes. Any measured value belongs to a certain section, which is identified first; the value is then looked up in that section's table to obtain the corresponding microstrain. This not only solves the inherent problems of the traditional scheme (double-valued readings and a small measurable range) but also enhances precision, improving the performance of the system. From the experimental results, the measurable range of the system is 525 με and the precision is +/-1 με with normal matched gratings. The system works in real time, which is sufficient for most engineering measurement requirements. (C) 2007 Elsevier GmbH. All rights reserved.
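The divisional look-up idea can be sketched as follows. The section boundaries, table values and linear interpolation below are illustrative assumptions of mine, not taken from the paper: the calibration curve power(strain) is double-valued over the full range, so a reading is first assigned to a section in which the curve is monotonic, then inverted by look-up within that section's table.

```python
import bisect

def make_sections():
    # Hypothetical calibration tables of (power, microstrain) pairs.
    # Section 0: power rises with strain; section 1: power falls again,
    # so the same power value maps to different strains in each section.
    sec0 = [(0.10, 0), (0.40, 100), (0.80, 250)]
    sec1 = [(0.80, 250), (0.45, 400), (0.15, 525)]
    return [sec0, sec1]

def lut_invert(power, section):
    """Look up a power reading in one section's table, interpolating
    linearly between the two bracketing entries."""
    pts = sorted(section)                 # ascending in power
    powers = [p for p, _ in pts]
    i = bisect.bisect_left(powers, power)
    if i == 0:                            # clamp below the table
        return pts[0][1]
    if i == len(pts):                     # clamp above the table
        return pts[-1][1]
    (p0, s0), (p1, s1) = pts[i - 1], pts[i]
    return s0 + (s1 - s0) * (power - p0) / (p1 - p0)
```

Confirming the section first is what removes the double-value ambiguity; the look-up itself is then single-valued.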
Abstract:
A parallel optical communication subsystem based on a 12-channel parallel optical transmitter module and a 12-channel parallel optical receiver module can be used as a 10 Gbps STM-64 or OC-192 optical transponder. No bit errors were observed in this parallel optical communication subsystem during a test of three hours and eighteen minutes with an SDH optical transport tester.
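The "about 0" error rate can be turned into a rough statistical bound. Using the common rule of thumb that zero errors over N transmitted bits bounds the BER by about 3/N at 95% confidence (my assumption, not a claim from the text):

```python
# Rough BER upper bound implied by an error-free test interval.
rate_bps = 10e9                          # 10 Gbps line rate
test_seconds = 3 * 3600 + 18 * 60        # 3 h 18 min = 11,880 s
bits_transmitted = rate_bps * test_seconds   # ~1.19e14 bits

# Zero observed errors in N bits -> BER < ~3/N at 95% confidence
# (Poisson rule of thumb; an assumption, not from the abstract).
ber_bound_95 = 3.0 / bits_transmitted        # ~2.5e-14
```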
Abstract:
A new 12-channel parallel optical transmitter module, in which a vertical-cavity surface-emitting laser (VCSEL) has been selected as the optical source, is capable of transmitting 37.5 Gbps of data over hundreds of meters. A new 12-channel parallel optical receiver module, in which a GaAs PIN (p-intrinsic-n) array has been selected as the optical receiving unit, is capable of responding to 30 Gbps of data. A transmission system based on these 12-channel parallel optical transmitter and receiver modules can be used as a 10 Gbps STM-64 or OC-192 optical transponder. The parallel optical modules and the parallel optical transmission system have passed laboratory testing.