950 results for compliant cryptologic protocols
Abstract:
Transmitting sensitive data over non-secret channels has always required encryption technologies to ensure that the data arrives without exposure to eavesdroppers. The Internet has made it possible to transmit vast volumes of data more rapidly and cheaply, and to a wider audience, than ever before. At the same time, strong encryption makes it possible to send data securely, to digitally sign it, to prove it was sent or received, and to guarantee its integrity. Together, the Internet and encryption make bulk transmission of data a commercially viable proposition. However, there are implementation challenges to solve before commercial bulk transmission becomes mainstream. Powerful encryption technologies have a performance cost and may affect quality of service. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Performance degradation and the potential for commercial loss discourage the bulk transmission of data over the Internet in any commercial application. This paper outlines technical solutions to these problems. We develop new technologies and combine existing ones in new and powerful ways to minimise commercial loss without compromising performance or inflating overheads.
Abstract:
Secure transmission of bulk data is of interest to many content providers. A commercially viable distribution of content requires technology to prevent unauthorised access. Encryption tools are powerful, but have a performance cost. Without encryption, intercepted data may be illicitly duplicated and re-sold, or its commercial value diminished because its secrecy is lost. Two technical solutions make it possible to perform bulk transmissions while retaining security without too high a performance overhead. These are: (a) hierarchical encryption - the stronger the encryption, the harder it is to break, but also the more computationally expensive it is. A hierarchical approach to key exchange means that simple and relatively weak encryption and keys are used to encrypt small chunks of data, for example 10 seconds of video. Each chunk has its own key. New keys for this bottom-level encryption are exchanged using slightly stronger encryption; for example, a whole-video key could govern the exchange of the 10-second chunk keys. At a higher level again, there could be daily or weekly keys securing the exchange of whole-video keys, and at a yet higher level a subscriber key could govern the exchange of weekly keys. At higher levels the encryption becomes stronger but is used less frequently, so the overall computational cost is minimal. The main observation is that the value of each encrypted item determines the strength of the key used to secure it. (b) non-symbolic fragmentation with signal diversity - communications are usually assumed to be sent over a single communications medium, with the data encrypted and/or partitioned into whole-symbol packets. Network and path diversity break up a file or data stream into fragments which are then sent over many different channels, either in the same network or in different networks. For example, a message could be transmitted partly over the phone network and partly via satellite. While TCP/IP does a similar thing in sending different packets over different paths, this is done for load-balancing purposes and is invisible to the end application. Network and path diversity deliberately introduce the same principle as a secure communications mechanism: an eavesdropper would need to intercept not just one transmission path but all paths used. Non-symbolic fragmentation of data is also introduced to further confuse any intercepted stream of data. This involves breaking up data into bit strings which are then disordered prior to transmission. Even if all transmissions were intercepted, the cryptanalyst would still need to determine the fragment boundaries and correctly order them. These two solutions depart from the usual idea of data encryption. Hierarchical encryption is an extension of the combined encryption of systems such as PGP, with the distinction that the strength of encryption at each level is determined by the "value" of the data being transmitted. Non-symbolic fragmentation suppresses or destroys bit patterns in the transmitted data in what is essentially a bit-level transposition cipher, but with unpredictable, irregularly-sized fragments. Both technologies have applications outside the commercial sphere and, being functionally orthogonal, can be used in conjunction with other forms of encryption.
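The non-symbolic fragmentation described in (b) can be illustrated with a minimal Python sketch. This is not the authors' implementation: the fragment-length bounds, the use of a CSPRNG to scramble the order, and the assumption that fragment boundaries and ordering are shared with the receiver by some secure out-of-band means are all illustrative choices.

```python
import secrets

def fragment_bits(data: bytes, min_len: int = 3, max_len: int = 11):
    """Split data into irregular-length bit-string fragments and scramble their order.

    Returns (scrambled_fragments, order); the receiver needs both the fragment
    boundaries and the permutation to reassemble the message.
    """
    bits = ''.join(f'{b:08b}' for b in data)
    fragments, i = [], 0
    while i < len(bits):
        # Irregular, unpredictable fragment length (capped by the bits remaining).
        n = min(min_len + secrets.randbelow(max_len - min_len + 1), len(bits) - i)
        fragments.append(bits[i:i + n])
        i += n
    order = list(range(len(fragments)))
    for j in range(len(order) - 1, 0, -1):   # Fisher-Yates shuffle
        k = secrets.randbelow(j + 1)
        order[j], order[k] = order[k], order[j]
    return [fragments[j] for j in order], order

def reassemble(scrambled, order) -> bytes:
    fragments = [None] * len(order)
    for pos, j in enumerate(order):          # undo the permutation
        fragments[j] = scrambled[pos]
    bits = ''.join(fragments)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

scrambled, order = fragment_bits(b"bulk payload")
assert reassemble(scrambled, order) == b"bulk payload"
```

In the scheme described in the abstract, the scrambled fragments would additionally be spread over several transmission paths, so an eavesdropper who captures only some channels, or who lacks the boundary and ordering information, recovers no whole-symbol bit patterns to attack.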
Abstract:
A rapid rate and high percentage of macadamia nut germination, together with production of vigorous seedlings, are required by nurseries and breeding programs. Germination of nuts is typically protracted, however, and rarely reaches 100%. Many studies have been conducted into macadamia germination, but most have assessed percent germination only. This study investigated the effects of various treatments on percent germination, germination rate, and plant, shoot and root dry weights. The treatments tested were combinations of: (i) soaking or not soaking seeds in a dilute fungicide solution prior to planting; (ii) four different planting media; and (iii) leaving seed trays open or placing them inside clear plastic bags. For freshly harvested nuts, sowing in potting mix under clear plastic and without soaking produced the highest percent germination and germination rate, the largest shoots, and longest lateral roots.
Abstract:
Background: Understanding transcriptional regulation through genome-wide microarray studies can contribute to unravelling the complex relationships between genes. Attempts to standardize the annotation of microarray data include the Minimum Information About a Microarray Experiment (MIAME) recommendations, the MAGE-ML format for data interchange, and the use of controlled vocabularies or ontologies. Existing software systems for microarray data analysis implement these standards only partially and are often hard to use and extend. Integration of genomic annotation data and other sources of external knowledge using open standards is therefore a key requirement for future integrated analysis systems. Results: The EMMA 2 software has been designed to resolve shortcomings with respect to full MAGE-ML and ontology support and makes use of modern data integration techniques. We present a software system that features comprehensive data analysis functions for spotted arrays and for the most common synthesized oligo arrays such as Agilent, Affymetrix and NimbleGen. The system is based on the full MAGE object model. Analysis functionality is based on R and Bioconductor packages and can make use of a compute cluster for distributed services. Conclusion: Our model-driven approach for automatically implementing a full MAGE object model provides high flexibility and compatibility. Data integration via SOAP-based web services is advantageous in a distributed client-server environment, as the collaborative analysis of microarray data is gaining more and more relevance in international research consortia. The adequacy of the EMMA 2 software design and implementation has been proven by its application in many distributed functional genomics projects. Its scalability makes the current architecture suited for extensions towards future transcriptomics methods based on high-throughput sequencing approaches, which have much higher computational requirements than microarrays.
Abstract:
This paper proposes a simple and compact compliant gripper whose gripping stiffness can be thermally controlled to accommodate actuation inaccuracy, avoiding or reducing the risk of breaking objects. The principle behind reducing the jaw stiffness is that a thermal change can induce an initial internal compressive force along each compliant beam. A prototype is fabricated and physically tested to verify the feasibility. It is shown that when a voltage is applied, the gripping stiffness is effectively reduced to accommodate greater actuation inaccuracy, which allows delicate or small-scale objects to be gripped.
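As a rough illustration of the stated principle, a textbook beam-mechanics approximation (not necessarily the model used in this paper) shows how an axial compressive preload softens the transverse stiffness of a compliant beam:

```latex
% Illustrative first-order approximation; the symbols are generic, not taken from the paper.
% k_0    : transverse stiffness of the unloaded compliant beam
% P      : thermally induced axial compressive force (of the order of E A \alpha \Delta T for a constrained beam)
% P_{cr} : buckling load for the beam's boundary conditions, with effective length \mu L
k(P) \approx k_0\left(1 - \frac{P}{P_{cr}}\right), \qquad P_{cr} = \frac{\pi^2 E I}{(\mu L)^2}
```

As P grows towards P_{cr}, the transverse (gripping) stiffness falls, which is consistent with the reported observation that applying a voltage, and hence heating the beams, makes the jaws more compliant.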
Abstract:
Since the precise linear actuators of a compliant parallel manipulator cannot tolerate transverse motion/load in multi-axis motion, actuation isolation should be considered in compliant manipulator design to eliminate the transverse motion at the point of actuation. This paper presents an effective design method for constructing compliant parallel manipulators with actuation isolation, by adding the same number of actuation legs as the number of DOF (degrees of freedom) of the original mechanism. The method is demonstrated by two design case studies, one of which is quantitatively studied by analytical modelling. The modelling results confirm possible inherent issues of the proposed structural design method, such as increased primary stiffness, extra parasitic motions, and cross-axis coupling motions.
Abstract:
The past few decades have seen major impacts of pandemics and mass casualty events on health resource use, in terms of rising health costs and increased mortality.
Abstract:
The conservation and valorisation of cultural heritage is of fundamental importance for our society, since it bears witness to the legacies of human societies. In the case of metallic artefacts, because corrosion is a never-ending problem, the correct strategies for their cleaning and preservation must be chosen. Thus, the aim of this project was the development of protocols for cleaning archaeological copper artefacts by laser and plasma cleaning, since these allow the treatment of artefacts in a controlled and selective manner. Additionally, electrochemical characterisation of the artificial patinas was performed in order to obtain information on the protective properties of the corrosion layers. Reference copper samples with different artificial corrosion layers were used to evaluate the tested parameters. Laser cleaning tests resulted in partial removal of the corrosion products, but the laser-material interactions caused melting of the corrosion layers that should be preserved. The main obstacle for this process is that the materials that must be preserved show lower ablation thresholds than the undesired layers, which makes the proper elimination of dangerous corrosion products very difficult without damaging the artefacts. Different protocols should be developed for different patinas, and real artefacts should be characterised prior to any treatment to determine the best course of action. Low-pressure hydrogen plasma cleaning treatments were performed on two kinds of patinas. In both cases the corrosion layers were partially removed. The total removal of the undesired corrosion products can probably be achieved by increasing the treatment time, the applied power, or the hydrogen pressure. Since the process is non-invasive and does not modify the bulk material, modifying the cleaning parameters is easy. EIS measurements show that, for the artificial patinas, the impedance increases while the patina is growing on the surface and then drops, probably due to diffusion reactions and a slow dissolution of copper. It appears from these results that the dissolution of copper is heavily influenced by diffusion phenomena and the porosity of the corrosion product film. Both techniques show good cleaning results, as long as the proper parameters are used; these depend on the nature of the artefact and the corrosion layers found on its surface.
Abstract:
The objective of this study was to evaluate the effect of the exogenous FSH dose in gonadotrophic treatment on ovulatory follicle dynamics in ewes.
Abstract:
The objective of this thesis is the analysis and study of the various access techniques for vehicular communications, in particular the C-V2X and WAVE protocols. The simulator used to study the performance of the two protocols is called LTEV2Vsim and was developed by CNI IEIIT for the study of V2V (Vehicle-to-Vehicle) communications. The changes I made allowed me to study the I2V (Infrastructure-to-Vehicle) scenario in highway areas and, with the results obtained, to compare the two protocols under high and low vehicular density, relating the PRR (packet reception ratio) to the cell size (RAW, awareness range). The final comparison gives a clear picture of the achievable performance of the two protocols and highlights the need for a protocol that meets the minimum necessary requirements.
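For reference, the PRR metric on which the comparison is based can be computed along the following lines. This is a generic sketch, not LTEV2Vsim code: the function name, the data layout, and the way neighbours within the awareness range are selected are assumptions.

```python
import math

def packet_reception_ratio(tx_pos, rx_outcomes, awareness_range):
    """PRR for one transmission: the fraction of vehicles within the awareness
    range (RAW) of the transmitter that correctly decoded the packet.

    rx_outcomes: list of (position, received) tuples, one per potential receiver,
    where position is an (x, y) pair in metres and received is a bool.
    """
    in_range = [ok for pos, ok in rx_outcomes
                if math.dist(tx_pos, pos) <= awareness_range]
    return sum(in_range) / len(in_range) if in_range else float('nan')

# Toy example: three receivers, two inside a 200 m awareness range, one of which decodes.
prr = packet_reception_ratio(
    (0.0, 0.0),
    [((50.0, 0.0), True), ((150.0, 0.0), False), ((400.0, 0.0), True)],
    awareness_range=200.0,
)
print(prr)  # 0.5
```

Averaging this quantity over all transmissions and plotting it against the awareness range, separately for high and low vehicular density, gives the kind of C-V2X versus WAVE comparison described in the abstract.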
Abstract:
In the central nervous system, iron in several proteins is involved in many important processes: oxygen transport, oxidative phosphorylation, mitochondrial respiration, myelin production, and the synthesis and metabolism of neurotransmitters. Abnormal iron homoeostasis can induce cellular damage through hydroxyl radical production, which can cause the oxidation and modification of lipids, proteins, carbohydrates, and DNA, leading to neurotoxicity. Moreover, increased levels of iron are harmful, and iron accumulation is a typical hallmark of brain ageing and of several neurodegenerative disorders, particularly PD. Numerous studies on post mortem tissue report an increased amount of total iron in the substantia nigra in patients with PD, also supported by a large body of in vivo findings from Magnetic Resonance Imaging (MRI) studies. The importance of, and approaches for, in vivo brain iron assessment using multiparametric MRI have grown over recent years. Quantitative MRI may provide useful biomarkers for brain integrity assessment in iron-related neurodegeneration. In particular, a prominent change in iron-sensitive T2* MRI contrast within the sub-areas of the SN overlapping with nigrosome 1 has been shown to be a hallmark of Parkinson's Disease with high diagnostic accuracy. Moreover, the differential diagnosis between Parkinson's Disease (PD) and atypical parkinsonian syndromes (APS) remains challenging, mainly in the early phases of the disease. Advanced brain MR imaging makes it possible to detect the pathological changes of nigral and extranigral structures at the onset of clinical manifestations and during the course of the disease. The Nigrosome-1 (N1) is a substructure of the healthy substantia nigra pars compacta enriched in dopaminergic neurons; their loss in Parkinson's disease and atypical parkinsonian syndromes is related to iron accumulation. N1 changes are supportive MR biomarkers for the diagnosis of these neurodegenerative disorders, but N1 is hard to detect with conventional sequences, even using a high-field (3T) scanner. Quantitative susceptibility mapping (QSM), an iron-sensitive technique, enables the direct detection of neurodegeneration.
Abstract:
Cleaning is one of the most important and delicate procedures in the restoration process. When developing new cleaning systems, it is fundamental to consider their selectivity towards the layer to be removed, non-invasiveness towards the one to be preserved, sustainability, and non-toxicity. Besides assessing efficacy, it is important to understand the cleaning mechanism through analytical protocols that strike a balance between cost, practicality, and reliable interpretation of results. In this thesis, the development of cleaning systems based on the coupling of electrospun fabrics (ES) with greener organic solvents is proposed. Electrospinning is a versatile technique that allows the production of micro/nanostructured non-woven mats, which have already been used as absorbents in various scientific fields but, to date, not in the restoration field. The systems produced proved effective for the removal of dammar varnish from paintings, where the ES act not only as solvent-binding agents but also as adsorbents of the partially solubilised varnish drawn up by capillary rise, thus enabling a one-step procedure. They have also been successfully applied to the removal of spray varnish from marble substrates and wall paintings. Due to the complexity of the materials, the procedure had to be adapted case by case, and mechanical action was still necessary. Depending on the spinning solution, three types of ES mats were produced: polyamide 6,6, pullulan, and pullulan with melanin nanoparticles. The latter, under irradiation, allows a localised temperature increase, accelerating and facilitating the removal of less soluble layers (e.g. reticulated alkyd-based paints). All the systems produced, and the mock-ups used, were extensively characterised using multi-analytical protocols. Finally, a monitoring protocol and image-treatment workflow based on photoluminescence macro-imaging is proposed. This set-up allowed the removal mechanism of dammar varnish to be studied and its residues to be semi-quantified. These initial results form the basis for optimising the acquisition set-up and data processing.