762 results for "Trusted computing platform"
Abstract:
Ubiquitous computing is the emerging trend in computing systems. Based on this observation, this thesis proposes an analysis of the hardware and environmental constraints that govern pervasive platforms. These constraints have a strong impact on the programming of such platforms, so solutions are proposed to facilitate this programming both at the platform level and at the node level. The first contribution presented in this document combines agent-oriented programming with the principles of bio-inspiration (phylogenesis, ontogenesis and epigenesis) to program pervasive platforms such as the PERvasive computing framework for modeling comPLEX virtually Unbounded Systems platform. The second contribution proposes a method to efficiently program parallelizable applications on each computing node of this platform.
Abstract:
A high-resolution carbon and oxygen isotope analysis of Late Oxfordian-Early Kimmeridgian deep-shelf sediments of southern Germany is combined with investigation of nannofossil assemblage composition and sedimentological interpretations in order to evaluate the impact of regional palaeoenvironmental conditions on the isotopic composition of carbonates. This study suggests that the carbonate mud was essentially derived from the shallow-platform environments of the Jura, and that the isotopic signature of carbonates deposited on the Swabian Alb deep shelf indirectly expresses the palaeoenvironmental evolution of the platform. Short-term fluctuations in δ¹³C and δ¹⁸O are probably controlled by changes in salinity (fresh-water input versus evaporation) in platform environments. Long-term fluctuations in the carbon and oxygen isotope records throughout the Late Oxfordian-Early Kimmeridgian result from the interplay of increasing temperature and decreasing humidity, which both control the trophic level. Changes from mesotrophic to oligotrophic conditions in platform environments and in deep-shelf surface waters are inferred. During the Late Oxfordian (Bimammatum Subzone to Planula Zone), the δ¹³C curve displays a positive shift of about 1‰, comparable in intensity to global perturbations of the carbon cycle. This distinct isotopic shift has not yet been documented in other basinal settings. It can reasonably be explained by local palaeoenvironmental changes on the Jura platform (salinity, temperature and nutrient availability) that controlled platform carbonate production and the geochemistry of the overlying waters. However, the increasing carbonate production on the Jura platform and the related positive δ¹³C shift recorded on the Swabian Alb deep shelf are regional signatures of climatic changes that affected other palaeogeographical domains of Europe, where carbonate production also increased throughout the Late Oxfordian.
Abstract:
This study looks at how increased memory utilisation affects throughput and energy consumption in scientific computing, especially in high-energy physics. Our aim is to minimise the energy consumed by a set of jobs without increasing the processing time. Earlier tests indicated that, especially in data analysis, throughput can increase by over 100% and energy consumption can decrease by 50% when multiple jobs are processed in parallel per CPU core. Since jobs are heterogeneous, no single optimum value exists for the number of parallel jobs. A better solution is based on memory utilisation, but finding an optimum memory threshold is not straightforward. Therefore, a fuzzy-logic-based algorithm was developed that can dynamically adapt the memory threshold based on the overall load. In this way, memory consumption can be kept stable under different workloads while achieving significantly higher throughput and energy efficiency than the traditional approaches of a fixed number of jobs or a fixed memory threshold.
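As a flavour of the approach, a fuzzy controller of this kind fuzzifies the current load, applies a small rule base, and defuzzifies the result into a threshold adjustment. The sketch below is a minimal illustration under assumed membership functions, rule weights and update step; it is not the authors' implementation.

```python
# Minimal sketch of a fuzzy-logic memory-threshold controller, inspired by
# the approach described in the abstract. Membership breakpoints, rule
# weights and the update step are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def adapt_threshold(threshold_gb, mem_load, cpu_load):
    """Nudge the memory threshold (GB per core) based on fuzzy load terms."""
    # Fuzzify overall memory utilisation (fraction of physical RAM in use).
    low = tri(mem_load, -0.4, 0.0, 0.6)
    ok = tri(mem_load, 0.4, 0.7, 0.9)
    high = tri(mem_load, 0.75, 1.0, 1.4)
    # Rules: low load -> raise threshold (admit more parallel jobs);
    # comfortable load -> keep it; high load -> lower it to avoid swapping.
    deltas = {+0.25: low, 0.0: ok, -0.5: high}
    total = sum(deltas.values())
    step = sum(d * w for d, w in deltas.items()) / total if total else 0.0
    # Back off when the CPU is saturated anyway: extra jobs would not help.
    if cpu_load > 0.95:
        step = min(step, 0.0)
    return max(0.5, threshold_gb + step)

# Example: the controller gradually raises the threshold while memory is
# underused, then pulls it back as utilisation approaches saturation.
t = 2.0
for load in (0.3, 0.5, 0.8, 0.95):
    t = adapt_threshold(t, load, cpu_load=0.7)
    print(f"mem_load={load:.2f} -> threshold={t:.2f} GB/core")
```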
Abstract:
Motivation: Genome-wide association studies have become widely used tools to study the effects of genetic variants on complex diseases. While it is of great interest to extend existing analysis methods by considering interaction effects between pairs of loci, the large number of possible tests presents a significant computational challenge. The number of computations is further multiplied in the study of gene expression quantitative trait mapping, in which tests are performed for thousands of gene phenotypes simultaneously. Results: We present FastEpistasis, an efficient parallel solution extending the PLINK epistasis module, designed to test for epistasis effects when analyzing continuous phenotypes. Our results show that the algorithm scales with the number of processors and offers a reduction in computation time when several phenotypes are analyzed simultaneously. FastEpistasis is capable of testing the association of a continuous trait with all single nucleotide polymorphism (SNP) pairs from 500,000 SNPs, totaling 125 billion tests, in a population of 5000 individuals in 29, 4 or 0.5 days using 8, 64 or 512 processors, respectively.
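FastEpistasis distributes these tests across processors; as a rough illustration of the per-pair computation being parallelized (not the FastEpistasis code itself), each SNP pair can be tested by regressing the continuous phenotype on both SNPs plus their interaction term. A minimal sketch using numpy/scipy, assuming 0/1/2 minor-allele-count genotype coding:

```python
import numpy as np
from scipy import stats

def epistasis_test(y, g1, g2):
    """Test the SNP x SNP interaction term in y ~ g1 + g2 + g1*g2.

    y: continuous phenotype; g1, g2: 0/1/2 minor-allele counts.
    Returns (beta_interaction, p_value). Illustrative only; FastEpistasis
    uses its own optimised implementation on top of PLINK.
    """
    n = len(y)
    X = np.column_stack([np.ones(n), g1, g2, g1 * g2])
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = n - X.shape[1]
    sigma2 = resid @ resid / dof          # residual variance estimate
    cov = sigma2 * np.linalg.inv(X.T @ X) # covariance of the estimates
    t = beta[3] / np.sqrt(cov[3, 3])      # t-statistic of the interaction
    p = 2 * stats.t.sf(abs(t), dof)
    return beta[3], p

# Tiny example on simulated data; a real scan would distribute the
# ~1.25e11 SNP pairs across processors, as the abstract describes.
rng = np.random.default_rng(0)
g1 = rng.integers(0, 3, 5000)
g2 = rng.integers(0, 3, 5000)
y = 0.2 * g1 * g2 + rng.normal(size=5000)
print(epistasis_test(y, g1, g2))
```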
Abstract:
A synchronous age (middle Early Aptian) for the drowning of the Helvetic Urgonian platform in relation to oceanic anoxic event 1a (the "Selli event"). The demise of the Urgonian platform, calibrated by whole-rock stable carbon isotope analysis and by ammonite-based biostratigraphy, is dated to the middle of the Early Aptian (near the boundary between the weissi and deshayesi zones). This shutdown, synchronous across representative sections of the Alpine Helvetic domain, is a major environmental event also recorded in France, Spain, Portugal, Oman, Mexico and the Pacific domain. Taking into account the resolution limits of biostratigraphy and of other dating techniques, this episode also appears to be synchronous on a global scale. For many authors, the disappearance of coral and rudist reefs correlated with the end of Urgonian sedimentation corresponds to the onset of anoxic conditions in the Early Aptian, which characterize an event of global importance: oceanic anoxic event OAE 1a.
Abstract:
The motivation for this research stems from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation industry behind PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and telecommunications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry?

Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law, and these developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which give customers more choice through hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet market formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the SoC and software platform disruptions in the ICT industries.

Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based hardware. They enjoy admirable profitability on a very narrow customer base, owing to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels formerly occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development; second through research on process re-engineering in the case of global software support for complex systems; and third by investigating the views of the industry's actors, namely customers, incumbents and newcomers, on the future direction of industrial automation. We conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.
Abstract:
This project addresses the task of bringing existing cloud infrastructure management technologies (Cloud Management Platform, CMP) closer to the business world. Specifically, a solution for operating private cloud infrastructures (IaaS), focused on the management of a virtualized datacenter, was deployed using solutions entirely based on free software, namely OpenNebula.
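By way of illustration, deploying workloads on such an OpenNebula-based private cloud typically revolves around VM templates; the sketch below registers and instantiates a minimal template through OpenNebula's standard onetemplate CLI. The template attributes follow generic OpenNebula syntax, but the names, sizes and image are illustrative assumptions, not details from this project.

```python
# Sketch: register and instantiate a basic VM template on an OpenNebula
# front-end via its standard CLI. Names, sizes and the image are assumptions.
import subprocess
import tempfile

TEMPLATE = """
NAME   = "demo-vm"
CPU    = 1
VCPU   = 2
MEMORY = 2048                      # MB
DISK   = [ IMAGE = "ubuntu-base" ] # image previously registered in a datastore
NIC    = [ NETWORK = "private" ]   # virtual network defined by the admin
"""

# Write the template to a file, as expected by the onetemplate CLI.
with tempfile.NamedTemporaryFile("w", suffix=".tpl", delete=False) as f:
    f.write(TEMPLATE)
    path = f.name

subprocess.run(["onetemplate", "create", path], check=True)
subprocess.run(["onetemplate", "instantiate", "demo-vm"], check=True)
```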
Abstract:
Long synthetic peptides (LSPs) have a variety of important clinical uses as synthetic vaccines and drugs. Techniques for peptide synthesis were revolutionized in the 1960s and 1980s, after which efficient techniques for purification and characterization of the product were developed. These improved techniques allowed the stepwise synthesis of increasingly long products at a faster rate, with greater purity, and at lower cost for clinical use. A synthetic peptide approach, coupled with bioinformatics analysis of genomes, can tremendously expand the search for clinically relevant products. In this Review, we discuss efforts to develop a malaria vaccine from LSPs, among other clinically directed work.
Abstract:
The M-Coffee server is a web server that makes it possible to compute multiple sequence alignments (MSAs) by running several MSA methods and combining their output into a single model. This allows users to run all their methods of choice simultaneously without having to arbitrarily choose one of them. The MSA is delivered along with a local estimation of its consistency with the individual MSAs it was derived from. The computation of the consensus multiple alignment is carried out using a special mode of the T-Coffee package [Notredame, Higgins and Heringa (T-Coffee: a novel method for fast and accurate multiple sequence alignment. J. Mol. Biol. 2000; 302: 205-217); Wallace, O'Sullivan, Higgins and Notredame (M-Coffee: combining multiple sequence alignment methods with T-Coffee. Nucleic Acids Res. 2006; 34: 1692-1699)]. Given a set of sequences (DNA or proteins) in FASTA format, M-Coffee delivers a multiple alignment in the most common formats. M-Coffee is a freeware open-source package distributed under a GPL license, and it is available either as a standalone package or as a web service from www.tcoffee.org.
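For the standalone package, the consensus mode is invoked through the T-Coffee executable itself; below is a minimal usage sketch in which the input file name and the output format list are placeholders.

```python
# Sketch: run the standalone M-Coffee consensus mode via the T-Coffee binary.
# "seqs.fasta" is a placeholder input; "-mode mcoffee" is the T-Coffee switch
# for combining several MSA methods into one consensus alignment.
import subprocess

subprocess.run(
    ["t_coffee", "-seq", "seqs.fasta", "-mode", "mcoffee",
     "-output", "clustalw,fasta_aln"],  # commonly used output formats
    check=True,
)
```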
Abstract:
Remote control systems are a very useful way to control and monitor devices quickly and easily. This paper proposes a new architecture for the remote control of Android mobile devices, analyzing the different alternatives and seeking the optimal solution in each case. Although remote control of mobile devices has been little explored, it can provide important advantages for testing software and hardware developments on several real devices. It can also enable efficient management of various devices of different types, support forensic security tasks, and more. The main idea behind the proposed architecture was to design a system to be used as a platform providing the services needed for remote control of mobile devices. As a result of this research, a proof of concept was implemented: an Android application running a group of server programs on the device, connected through the network or the USB interface depending on availability. These servers can be controlled through a small client written in Java that runs on both desktop and web systems.
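To make the architecture concrete, the sketch below shows the client side of such a design: a command is sent over a socket to a server process running on the device, and a reply is read back. The host, port and line-based protocol are illustrative assumptions (the paper's proof-of-concept client is written in Java).

```python
# Sketch of the client side of such a remote-control architecture: send a
# text command to a server running on the device and read one reply line.
# Host, port and the line-based protocol are illustrative assumptions.
import socket

def send_command(host: str, port: int, command: str) -> str:
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall((command + "\n").encode())
        return sock.makefile().readline().strip()

# e.g. over Wi-Fi, or via "adb forward tcp:5555 tcp:5555" when using USB
print(send_command("192.168.1.20", 5555, "screenshot"))
```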
Abstract:
Organic geochemical and stable isotope investigations were performed to provide insight into the depositional environments, and the origin and maturity of the organic matter, in Jurassic and Cretaceous formations of the External Dinarides. A correlation is made among various parameters acquired from Rock-Eval pyrolysis, gas chromatography-mass spectrometry data and isotope analysis of carbonates and kerogen. Three groups of samples were analysed. The first group includes source rocks derived from Lower Jurassic limestone and the Upper Jurassic "Leme" beds, the second from Upper Cretaceous carbonates, while the third group comprises oil seeps genetically connected with Upper Cretaceous source rocks. The carbon and oxygen isotopic ratios of all the carbonates display a marine isotopic composition. Rock-Eval data and maturity parameters derived from biomarkers define the organic matter of the Upper Cretaceous carbonates as Type I-S and Type II-S kerogen at a low stage of maturity, up to the point of entering the oil-generating window. Lower and Upper Jurassic source rocks contain early-mature Type III organic matter mixed with Type IV. All Jurassic and Cretaceous potential source rock extracts show similarity in triterpane and sterane distributions. The hopane and sterane distribution patterns of the studied oil seeps correspond to those of the Cretaceous source rocks. The difference between the Cretaceous oil seeps and the potential source rock extracts lies in the intensity and distribution of n-alkanes, as well as in the abundance of asphaltenes, which is connected to their stage of biodegradation. In the Jurassic and Cretaceous potential source rock samples, a mixture of aromatic hydrocarbons and their alkyl derivatives was detected, whereas in the oil seep extracts only asphaltenes were observed.