36 results for Key Management Protocol


Relevance:

30.00%

Publisher:

Abstract:

There is no empirical evidence whatsoever to support most of the beliefs on which software construction is based. We do not yet know the adequacy, limits, qualities, costs and risks of the technologies used to develop software. Experimentation helps to check and convert beliefs and opinions into facts. This research is concerned with replication. Replication is a key component for gathering empirical evidence on software development that can be used in industry to build better software more efficiently. Replication has not been easy in software engineering (SE) because the experimental paradigm applied to software development is still immature. Nowadays, replications are mostly executed using traditional replication packages, which do not appear to have been as effective as expected for transferring information among researchers in SE experimentation. The trouble spot appears to be the replication setup, caused by version management problems with materials, instruments, documents, etc. This has proved to be an obstacle to obtaining enough details about the experiment to be able to reproduce it as exactly as possible. We address the problem of information exchange among experimenters by developing a schema to characterize replications. We will adapt configuration management and product line ideas to support the experimentation process. This will enable researchers to make systematic decisions based on explicit knowledge rather than assumptions about replications. This research will output a replication support web environment. This environment will not only archive but also manage experimental materials flexibly enough to allow both similar and differentiated replications, with massive experimental data storage. The platform should be accessible to several research groups working together on the same families of experiments.
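A characterization schema of this kind can be thought of as structured metadata attached to each replication, so that versions of materials and instruments can be tracked. The minimal sketch below is hypothetical (the abstract does not define the schema's fields); it only illustrates what such a record might look like:

```python
from dataclasses import dataclass, field

@dataclass
class ReplicationRecord:
    """Hypothetical characterization of one replication of an experiment."""
    experiment_family: str          # family of experiments the replication belongs to
    replication_id: str             # unique identifier within the family
    site: str                       # research group or institution running it
    changed_variables: list = field(default_factory=list)   # what differs from the baseline
    material_versions: dict = field(default_factory=dict)   # artifact name -> version tag

# Example: a differentiated replication that changes the subject population.
r = ReplicationRecord(
    experiment_family="test-driven-development",
    replication_id="UPM-2012-01",
    site="UPM",
    changed_variables=["subject experience"],
    material_versions={"training slides": "v2.1", "measurement form": "v1.0"},
)
print(r)
```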

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, the processing industry sector is going through a series of changes, including proper management and the reduction of environmental impacts. Any productive process that aims at sustainable management is incomplete if the life-cycle sustainability of mineral resources is not taken into account. Raw materials for manufacturing, such as copper, aluminium, iron, gold, silver, silicon or titanium, are provided by mineral resource extraction processes. These elements are necessary for human development and are obtained from the Earth through extractive operations. Mineral extraction processes must take care of their environmental consequences: the extraction of huge volumes of rock for transformation into raw materials for industry must be optimized to reduce the ecological cost of the final product as much as possible. On a global scale, it makes no sense to design efficient manufacturing in the secondary (transformation) industry if, in the first steps of the supply chain (extraction), the impact exceeds the resource savings of the successive phases. The scale of mining operations suggests an environmentally aggressive activity, but precisely because of its great impact it must be the first element to be considered. This gives rise to a new concept: reducing economic and environmental cost at the same time. This work reflects on the parameters that can be modified to reduce the energy cost of the process without increasing operational costs and while ensuring the same production capacity, that is, minimizing economic and environmental cost simultaneously. An efficient mine design that takes this idea into account does not imply an increase in operating cost. To reach this objective, the operation must be considered as a whole, so that all departments involved share common guidelines oriented towards the optimization of global energy costs; sometimes a single operational cost must be increased to reduce the global cost. The work reviews different design parameters of surface mining and sets out some key performance indicators (KPIs) estimated from an efficiency point of view. Those KPIs can be included in HQE policies as global indicators. The new concept developed is that a new criterion has to be applied in company policies: improve management by improving operational efficiency. In other words, it is better to use current resources properly (machinery, equipment, etc.) than to replace them with new ones that are not used correctly. In conclusion, through the efficient management of current technologies in each extractive operation, an important reduction in energy use can be achieved downstream in the process, which implies a lower energy cost over the whole life cycle of the manufactured product.
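As an illustration of the kind of KPI referred to, a simple energy-related indicator for a surface mining operation could be the specific energy per tonne of material moved. The figures and names below are hypothetical, not taken from the paper:

```python
# Hypothetical specific-energy KPI for a surface mining operation (illustrative values).
diesel_litres = 120_000          # fuel burned by the loading/hauling fleet in a period
electricity_kwh = 450_000        # electricity used by crushing/conveying in the same period
tonnes_moved = 900_000           # total rock moved in the period

DIESEL_KWH_PER_LITRE = 10.0      # approximate energy content of diesel

total_energy_kwh = diesel_litres * DIESEL_KWH_PER_LITRE + electricity_kwh
specific_energy = total_energy_kwh / tonnes_moved   # kWh per tonne moved

print(f"Specific energy: {specific_energy:.2f} kWh/t")
```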

Relevance:

30.00%

Publisher:

Abstract:

With the advent of cloud computing, many applications have embraced the ensuing paradigm shift towards modern distributed key-value data stores, like HBase, in order to benefit from the elastic scalability on offer. However, many applications still hesitate to make the leap from the traditional relational database model simply because they cannot compromise on the standard transactional guarantees of atomicity, isolation, and durability. To get the best of both worlds, one option is to integrate an independent transaction management component with a distributed key-value store. In this paper, we discuss the implications of this approach for durability. In particular, if the transaction manager provides durability (e.g., through logging), then we can relax durability constraints in the key-value store. However, if a component fails (e.g., a client or a key-value server), then we need a coordinated recovery procedure to ensure that commits are persisted correctly. In our research, we integrate an independent transaction manager with HBase. Our main contribution is a failure recovery middleware for the integrated system, which tracks the progress of each commit as it is flushed down by the client and persisted within HBase, so that we can recover reliably from failures. During recovery, commits that were interrupted by the failure are replayed from the transaction management log. Importantly, the recovery process does not interrupt transaction processing on the available servers. Using a benchmark, we evaluate the impact of component failure, and subsequent recovery, on application performance.
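The recovery logic described above can be sketched as follows. This is a minimal, hypothetical model of the idea (an in-memory stand-in for HBase and for the transaction log, not the actual middleware or HBase client API): commits are tracked as they are persisted, and on recovery any commit present in the transaction-management log but not yet persisted is replayed.

```python
# Minimal sketch of failure-recovery-by-replay for a transaction manager layered on a
# key-value store. All classes here are hypothetical stand-ins, not the HBase client API.

class TxnLog:
    """Durable log kept by the transaction manager (here just an in-memory list)."""
    def __init__(self):
        self.entries = []            # list of (txn_id, writes) in commit order
    def append(self, txn_id, writes):
        self.entries.append((txn_id, dict(writes)))

class KeyValueStore:
    """Stand-in for the key-value store; tracks which commits it has persisted."""
    def __init__(self):
        self.data = {}
        self.persisted_txns = set()
    def persist(self, txn_id, writes):
        self.data.update(writes)
        self.persisted_txns.add(txn_id)

def recover(log: TxnLog, store: KeyValueStore):
    """Replay commits that are in the log but were not persisted before the failure."""
    for txn_id, writes in log.entries:
        if txn_id not in store.persisted_txns:
            store.persist(txn_id, writes)    # replay the interrupted commit

# Example: txn 2 was logged but the client failed before flushing it to the store.
log, store = TxnLog(), KeyValueStore()
log.append(1, {"row:a": "1"}); store.persist(1, {"row:a": "1"})
log.append(2, {"row:b": "2"})                 # logged, never persisted
recover(log, store)
print(store.data)                             # {'row:a': '1', 'row:b': '2'}
```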

Relevance:

30.00%

Publisher:

Abstract:

In the SESAR Step 2 concept of operations, a Reference Business Trajectory (RBT) is available and seen by all actors, making it possible to conceive an operating method different from the current ATM system, based on Collaborative Decision Making processes. Currently there is a need to describe in more detail the mechanisms by which actors (ATC, Network Management, Flight Crew, airports and the Airline Operations Centre) will negotiate revisions to the RBT. This paper introduces a negotiation model that uses constraint-based programming applied to a mediator to facilitate the negotiation process in a SWIM-enabled environment. Three processes for modelling the negotiation are explained, and a preliminary reasoning-agent algorithm modelled as a constraint satisfaction problem is presented. The computational capability of the model is evaluated in the conclusion.
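To give a flavour of how a mediator could treat an RBT revision as a constraint satisfaction problem, the toy sketch below enumerates candidate (departure delay, flight level) revisions and keeps those satisfying simple constraints from different actors. The variables, domains and constraints are invented for illustration and are not those of the paper:

```python
from itertools import product

# Hypothetical toy CSP: choose a departure delay (min) and a flight level for a revised RBT.
delays = [0, 5, 10, 15, 20]          # minutes of delay acceptable in principle
levels = [330, 350, 370]             # candidate flight levels (FL)

def network_ok(delay, level):        # Network Management: congestion above FL350 unless delayed
    return level <= 350 or delay >= 10

def airline_ok(delay, level):        # Airline Operations Centre: limit the cost of delay
    return delay <= 15

def atc_ok(delay, level):            # ATC: FL330 is closed in this toy scenario
    return level != 330

solutions = [(d, fl) for d, fl in product(delays, levels)
             if network_ok(d, fl) and airline_ok(d, fl) and atc_ok(d, fl)]

# The mediator could then rank the feasible revisions, e.g. by least delay.
print(sorted(solutions)[:3])
```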

Relevance:

30.00%

Publisher:

Abstract:

A participatory modelling process has been conducted in two areas of the Guadiana river basin (the upper and middle sub-basins), in Spain, with the aim of providing support for decision making in water management. The area has a semi-arid climate where irrigated agriculture plays a key role in the economic development of the region and accounts for around 90% of water use. Following the guidelines of the European Water Framework Directive, we promote stakeholder involvement in water management with the aim of achieving an improved understanding of the water system and encouraging the exchange of knowledge and views between stakeholders, in order to help build a shared vision of the system. At the same time, the resulting models, which integrate the different sectors and views, provide some insight into the impacts that different management options and possible future scenarios could have. The methodology is based on a Bayesian network combined with an economic model and, in the middle Guadiana sub-basin, with a crop model. The resulting integrated modelling framework is used to simulate possible water policy, market and climate scenarios and to find out the impacts of those scenarios on farm income and on the environment. At the end of the modelling process, an evaluation questionnaire was completed by participants in both sub-basins. Results show that this type of process is found very helpful by stakeholders to improve their understanding of the system, to understand each other's views and to reduce conflict where it exists. In addition, they found the model an extremely useful tool to support management. The graphical interface, the quantitative output and the explicit representation of uncertainty helped stakeholders to better understand the implications of the scenarios tested. Finally, the combination of different types of models was also found very useful, as it allowed specific aspects of the water management problems to be explored in detail.
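As a flavour of the modelling approach, the fragment below hand-codes a tiny discrete Bayesian network (scenario → water availability → farm income) and computes the income distribution under each scenario by enumeration. The variables and probabilities are invented for illustration; the actual networks in the study integrate many more sectors and, in the middle sub-basin, a crop model:

```python
# Tiny hypothetical Bayesian network: policy scenario -> water availability -> farm income.
p_avail = {            # P(water availability | scenario)
    "current policy": {"high": 0.6, "low": 0.4},
    "quota cut":      {"high": 0.3, "low": 0.7},
}
p_income = {            # P(farm income | water availability)
    "high": {"good": 0.7, "poor": 0.3},
    "low":  {"good": 0.2, "poor": 0.8},
}

def income_distribution(scenario):
    """P(income | scenario), summing over water availability."""
    dist = {"good": 0.0, "poor": 0.0}
    for avail, p_a in p_avail[scenario].items():
        for income, p_i in p_income[avail].items():
            dist[income] += p_a * p_i
    return dist

for s in p_avail:
    print(s, income_distribution(s))
```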

Relevance:

30.00%

Publisher:

Abstract:

This paper provides a meta-analysis of long/short distance passenger interconnectivity within the European context. The analysis is based on the results of the European project HERMES of the 7th EU R&D Programme. The study collected stakeholders' and travellers' valuations and preferences at 5 interchanges in 3 EU countries. To that end, a common survey was conducted at the following sites: Gothenburg Central Station (Sweden), the Avenida de America interchange in Madrid and the Lleida and Zaragoza railway stations (Spain), and the Part Dieu intermodal station in Lyon (France). The first survey addresses the different stakeholders' opinions on interchange management and characteristics. The second survey gives an insight into the key requirements of long/short distance intermodal passengers in the selected case studies. It covered, on the one hand, trip origin and destination, connecting transport services and modes, trip characteristics, type of ticket, trip motive and socioeconomic characteristics of the traveller; on the other hand, it asked passengers to rate the importance of, and their satisfaction with, a series of common quality and functional aspects such as information, accessibility, transfer times and service supply. In conclusion, the paper highlights which elements of the interchange are considered relevant and how different groups of stakeholders value them, both theoretically and in the selected case studies. The surveys also identified some key barriers, such as the lack of internal coordination among operators, managers and decision makers, as well as poor signage, particularly among connecting services. Travellers seem to have different priorities depending on their age, trip purpose and mode chosen: in some cases time appears as the most relevant factor, whilst price is decisive in others.
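The importance/satisfaction ratings lend themselves to a simple gap analysis per attribute and traveller group. The snippet below shows the kind of aggregation involved, with invented ratings rather than HERMES data:

```python
from statistics import mean

# Hypothetical 1-5 ratings: (group, attribute, importance, satisfaction).
responses = [
    ("commuter", "information",   5, 3),
    ("commuter", "transfer time", 5, 4),
    ("tourist",  "information",   4, 2),
    ("tourist",  "transfer time", 3, 4),
    ("commuter", "information",   4, 3),
]

def gap_by(group):
    """Mean importance minus mean satisfaction per attribute for one traveller group."""
    attrs = {a for g, a, _, _ in responses if g == group}
    out = {}
    for a in attrs:
        imp = mean(i for g, at, i, _ in responses if g == group and at == a)
        sat = mean(s for g, at, _, s in responses if g == group and at == a)
        out[a] = round(imp - sat, 2)
    return out

print(gap_by("commuter"))   # large positive gaps flag priorities for improvement
```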

Relevance:

30.00%

Publisher:

Abstract:

In hostile environments at CERN and other similar scientific facilities, having a reliable mobile robot system is essential for successful execution of robotic missions and to avoid situations of manual recovery of the robots in the event that the robot runs out of energy. Because of environmental constraints, such mobile robots are usually battery-powered and hence energy management and optimization is one of the key challenges in this field. The ability to know beforehand the energy consumed by various elements of the robot (such as locomotion, sensors, controllers, computers and communication) will allow flexibility in planning or managing the tasks to be performed by the robot.
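A first-order way to "know beforehand" the energy a mission will consume is to sum power × time over the robot's subsystems and compare the result with the battery budget. The component powers and mission profile below are hypothetical:

```python
# Hypothetical first-order mission energy budget for a battery-powered inspection robot.
BATTERY_WH = 500.0                       # usable battery capacity in watt-hours

subsystem_power_w = {                    # average power draw per subsystem (assumed values)
    "locomotion": 120.0,
    "sensors": 15.0,
    "controller": 10.0,
    "computer": 35.0,
    "communication": 8.0,
}

def mission_energy_wh(duration_h, duty_cycle):
    """Energy used in a mission given per-subsystem duty cycles (fraction of time active)."""
    return sum(p * duty_cycle.get(name, 1.0) * duration_h
               for name, p in subsystem_power_w.items())

duty = {"locomotion": 0.5, "communication": 0.2}      # the robot drives half the time, etc.
needed = mission_energy_wh(duration_h=2.0, duty_cycle=duty)
print(f"Mission needs {needed:.0f} Wh of {BATTERY_WH:.0f} Wh "
      f"({needed / BATTERY_WH:.0%} of the battery)")
```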

Relevance:

30.00%

Publisher:

Abstract:

This paper analyses the relationship between productive efficiency and online social networks (OSN) in Spanish telecommunications firms. A data envelopment analysis (DEA) is used, and several indicators of the firms' social media activities are incorporated. A super-efficiency analysis and bootstrapping techniques are performed to increase the model's robustness and accuracy. Then, a logistic regression model is applied to characterise the factors and drivers of good performance in OSN. Results reveal the company's ability to absorb and utilise OSN as a key factor in improving productive efficiency. This paper presents a model for assessing the strategic performance of presence and activity in OSN.
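For readers unfamiliar with DEA, the efficiency score of each firm is obtained by solving a small linear program per firm. A minimal input-oriented CCR formulation using scipy is sketched below with made-up data (two inputs, one output); the actual study additionally applies super-efficiency, bootstrapping and a logistic regression, which are not reproduced here:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: rows = firms (DMUs), two inputs (staff, ad spend), one output (sales).
X = np.array([[5.0, 14.0], [8.0, 15.0], [7.0, 12.0], [4.0, 10.0]])   # inputs
Y = np.array([[9.0], [5.0], [4.0], [16.0]])                           # outputs

def ccr_efficiency(o, X, Y):
    """Input-oriented CCR efficiency of DMU o: min theta s.t. a frontier point dominates it."""
    n, m = X.shape          # n DMUs, m inputs
    s = Y.shape[1]          # s outputs
    c = np.r_[1.0, np.zeros(n)]                     # variables: [theta, lambda_1..lambda_n]
    A_in = np.c_[-X[o], X.T]                        # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.c_[np.zeros(s), -Y.T]                # -sum_j lam_j y_rj <= -y_ro
    A_ub = np.r_[A_in, A_out]
    b_ub = np.r_[np.zeros(m), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

for o in range(len(X)):
    print(f"firm {o}: efficiency = {ccr_efficiency(o, X, Y):.3f}")
```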

Relevance:

30.00%

Publisher:

Abstract:

Cascade is an information reconciliation protocol proposed in the context of secret key agreement in quantum cryptography. The protocol removes the discrepancies between two partially correlated sequences held by distant parties connected through a public noiseless channel. It is highly interactive, requiring a large number of channel communications between the parties, and although its efficiency is not optimal it has become the de facto standard for practical implementations of information reconciliation in quantum key distribution. The aim of this work is to analyze the performance of Cascade and to discuss its strengths, weaknesses and optimization possibilities, comparing it with some of the modified versions proposed in the literature. When all design trade-offs are considered, a new view emerges that allows us to put forward a number of guidelines and propose near-optimal parameters for practical implementations of Cascade, improving performance significantly with respect to all previous proposals.
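To make the interactivity concrete, the sketch below implements the core of a single Cascade pass: the sequences are split into blocks, block parities are exchanged, and a binary search locates one error in each block whose parities disagree. The block size is illustrative; a full Cascade implementation also runs several passes with shuffling and backtracks to blocks of earlier passes:

```python
import random

def parity(bits):
    return sum(bits) % 2

def binary_locate(alice, bob, lo, hi):
    """Locate one differing position in [lo, hi) by exchanging half-block parities."""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if parity(alice[lo:mid]) != parity(bob[lo:mid]):
            hi = mid
        else:
            lo = mid
    return lo

def cascade_pass(alice, bob, block_size):
    """One Cascade pass: correct one error in every block whose parities differ."""
    for start in range(0, len(bob), block_size):
        end = min(start + block_size, len(bob))
        if parity(alice[start:end]) != parity(bob[start:end]):
            pos = binary_locate(alice, bob, start, end)
            bob[pos] ^= 1                      # flip the located bit

# Demo: Bob's string is Alice's with a few random flips (discrepancy rate ~3%).
random.seed(1)
alice = [random.randint(0, 1) for _ in range(256)]
bob = [b ^ (random.random() < 0.03) for b in alice]
print("errors before:", sum(a != b for a, b in zip(alice, bob)))
cascade_pass(alice, bob, block_size=16)
print("errors after one pass:", sum(a != b for a, b in zip(alice, bob)))
```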

Relevance:

30.00%

Publisher:

Abstract:

Secret-key agreement, a well-known problem in cryptography, allows two parties holding correlated sequences to agree on a secret key by communicating over a public channel. It is usually divided into three procedures: advantage distillation, information reconciliation and privacy amplification. Each of these procedures must be efficient if a positive key rate is to be attained from the legitimate parties' correlated sequences. Quantum key distribution (QKD) allows the two parties to obtain correlated sequences, provided that they have access to an authenticated channel. The new generation of QKD devices is able to work at higher speeds and in noisier or more absorbing environments, which exposes the weaknesses of current information reconciliation protocols, a key component of their performance. Here we present a new protocol based on low-density parity-check (LDPC) codes that offers the advantages of low interactivity, rate adaptability and high efficiency, characteristics that make it highly suitable for next-generation QKD devices.
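The low interactivity comes from the one-way, syndrome-based nature of the scheme: one party sends the syndrome of her string under a parity-check matrix H, and the other decodes his string toward that syndrome. The toy sketch below uses the (7,4) Hamming parity-check matrix as a tiny stand-in, just to show the information flow; a real implementation uses long LDPC codes with belief-propagation decoding and achieves rate adaptability by puncturing and shortening:

```python
import numpy as np

# Toy stand-in for an LDPC code: the (7,4) Hamming parity-check matrix. Its columns are the
# binary representations of 1..7, so any single-bit discrepancy is located exactly.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def syndrome(x):
    return H.dot(x) % 2

rng = np.random.default_rng(0)
alice = rng.integers(0, 2, size=7)
bob = alice.copy()
bob[4] ^= 1                                   # one discrepancy between the correlated strings

s_alice = syndrome(alice)                     # the only message sent (one way: Alice -> Bob)

# Bob flips the single bit whose column of H matches the syndrome difference.
diff = (syndrome(bob) + s_alice) % 2
if diff.any():
    pos = next(i for i in range(H.shape[1]) if np.array_equal(H[:, i], diff))
    bob[pos] ^= 1

print("reconciled:", np.array_equal(bob, alice))   # True
```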

Relevance:

30.00%

Publisher:

Abstract:

Telecommunications infrastructures form the physical layer over which information is transmitted. According to the OSI model, the physical layer converts the frame it receives from the data link layer into a stream of bits sent through the corresponding transmission medium towards the destination system, freeing the upper layer from the functions imposed by the particular nature of the transmission medium in use. To do so, it defines the mechanical, electrical and functional characteristics of the interconnection to the physical medium and establishes an interface with its upper layer (the data link layer). Depending on the medium and transmission mode, the network topology, the type of coding and line configuration, and the type of communication desired, different equipment is required, and so the communications infrastructure changes. The complexity of communications networks (a multitude of services to a multitude of destinations) makes managing the physical (or infrastructure) layer a difficult challenge for telecommunications managers in companies and public bodies, since properly administering the telecommunications infrastructure is a key factor in guaranteeing quality of service, optimizing provisioning times for customers and minimizing network unavailability in the face of incidents. Although different tools exist for managing telecommunications, most of these solutions cover the physical layer only in a limited way, leaving managers with a multitude of more or less manual approaches to understand what is happening in their network at the physical level and, perhaps more seriously, without the ability to react quickly when an incident appears. Solving this problem requires end-to-end management of circuits and of all their intermediate connections; that is, a methodology must be implemented that models the communications network so that it can be represented in an information system on which the management of physical circuits and their associated infrastructure is facilitated. Accordingly, the first part of the project describes the type of telecommunications infrastructure to be managed, studies current network management solutions and analyses the strategies being considered to enable management of the physical layer. The second part is dedicated to defining a methodology for representing the physical layer in an information system, so as to provide organizations with a complete solution for the effective management of their telecommunications infrastructure. The third part focuses on a real pilot deployment of this methodology in a specific communications network project, in order to show the capabilities of the proposed solution.
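A minimal flavour of such a model: the physical layer can be represented as nodes, ports and cable segments, with a circuit defined end to end as an ordered list of segments. The entity names below are a hypothetical sketch, not the methodology defined in the project:

```python
from dataclasses import dataclass

# Hypothetical minimal data model for end-to-end management of physical circuits.
@dataclass(frozen=True)
class Port:
    node: str          # equipment or patch panel hosting the port
    name: str          # port label, e.g. "ODF-3/12"

@dataclass(frozen=True)
class Segment:
    a: Port            # one end of a cable or patch cord
    b: Port            # the other end

@dataclass
class Circuit:
    circuit_id: str
    path: list         # ordered list of Segment, from the A end to the B end

    def endpoints(self):
        return self.path[0].a, self.path[-1].b

    def traverses(self, node):
        """True if the circuit passes through the given node (useful for incident impact)."""
        return any(node in (s.a.node, s.b.node) for s in self.path)

c = Circuit("CIRC-001", [
    Segment(Port("CPE-Madrid", "eth0"), Port("PatchPanel-M1", "01")),
    Segment(Port("PatchPanel-M1", "01"), Port("Router-M", "ge-0/0/1")),
])
print(c.endpoints(), c.traverses("Router-M"))   # impact analysis: which circuits use Router-M
```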

Relevance:

30.00%

Publisher:

Abstract:

The advancement of science and engineering projects is driving major changes in the various phases of a project. These changes have made project management more rigorous, and tracking the research fronts of engineering and project management has become key. However, research in engineering and project management published in Spanish is hindered by limited access to information on the most recent and current research, which restricts the exchange of information and the strengthening of research networks in a field with great implications for business, industry and science. The article therefore aims to present the state of the art of engineering and project management research in Spanish, using the analysis of scientific domains and network analysis of the research literature to identify and analyse the relationships between authors and documents that establish the knowledge base and research fronts of the topic under study. The results also provide statistics on the international contribution to research in Spanish and on scientific collaboration networks.
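The network analysis mentioned amounts to building a co-authorship (or co-citation) graph and computing indicators on it. A minimal sketch using the networkx library and invented author names:

```python
import networkx as nx

# Hypothetical co-authorship records: each paper is the list of its authors.
papers = [
    ["Garcia", "Lopez"],
    ["Garcia", "Martinez", "Lopez"],
    ["Martinez", "Silva"],
    ["Silva", "Rojas"],
]

G = nx.Graph()
for authors in papers:
    for i, a in enumerate(authors):
        for b in authors[i + 1:]:
            # add or strengthen the co-authorship tie between a and b
            w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
            G.add_edge(a, b, weight=w + 1)

# Simple indicators of the collaboration network.
print("degree centrality:", nx.degree_centrality(G))
print("connected components:", list(nx.connected_components(G)))
```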

Relevance:

30.00%

Publisher:

Abstract:

Personal data about users (customers) is a key component for enterprises and large organizations. Its correct analysis and processing can produce relevant knowledge to achieve different business goals. For example, the monetisation of this data has become a valuable asset for many companies, such as Google, Facebook or Twitter, that obtain huge profits mainly from targeted advertising.

Relevance:

30.00%

Publisher:

Abstract:

Emergency management is one of the key aspects of day-to-day operation procedures on a highway. Efficiency in the overall response to an incident is paramount in reducing its consequences. However, the approach of highway operators to incident management is usually still far from systematic and standardized. This paper attempts to address the issue, providing several hints on why this happens and a proposal on how the situation could be overcome. A performance-based approach to general system specification is introduced and then applied to a particular road emergency management task. A real testbed has been implemented to show the validity of the proposed approach. Ad-hoc sensors (one camera and one laser scanner) were efficiently deployed to acquire data, and advanced fusion techniques were applied at the processing stage to meet the specific user requirements in terms of functionality, flexibility and accuracy.
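As a minimal illustration of fusion at the processing stage, two sensors observing the same quantity (say, the distance to a stopped vehicle seen by the camera and by the laser scanner) can be combined by inverse-variance weighting, which yields an estimate with lower variance than either sensor alone. The numbers are hypothetical and the real system uses more advanced techniques:

```python
# Hypothetical inverse-variance fusion of two range measurements of the same obstacle.
camera_range, camera_var = 42.8, 4.0      # metres, variance (camera is the noisier sensor)
laser_range, laser_var = 41.9, 0.25       # metres, variance (laser scanner is more precise)

w_cam, w_las = 1.0 / camera_var, 1.0 / laser_var
fused_range = (w_cam * camera_range + w_las * laser_range) / (w_cam + w_las)
fused_var = 1.0 / (w_cam + w_las)          # smaller than either individual variance

print(f"fused range = {fused_range:.2f} m, variance = {fused_var:.3f} m^2")
```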

Relevance:

30.00%

Publisher:

Abstract:

We consider in this thesis the problem of information reconciliation in the context of secret key distillation between two legitimate parties. In some scenarios of interest this problem can be advantageously solved with low-density parity-check (LDPC) codes optimized for the binary symmetric channel. In particular, we demonstrate that our method leads to a significant efficiency improvement with respect to earlier interactive reconciliation methods. We propose a protocol based on LDPC codes that can be adapted to changes in the communication channel by extending the original source. The efficiency of our protocol is limited only by the quality of the code and, while it transmits more information than strictly needed to reconcile Alice's and Bob's sequences, it does not reveal any more information about the original source than an ad-hoc code with adapted rate would have revealed.
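The efficiency referred to here is conventionally measured against the Shannon limit: for a discrepancy rate p, at least n·h(p) bits must be disclosed to reconcile n-bit sequences, so a protocol that discloses m bits has efficiency f = m / (n·h(p)), with f = 1 being optimal. A small numeric illustration (the values are hypothetical, not results from the thesis):

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

def reconciliation_efficiency(disclosed_bits, n, p):
    """f = m / (n * h(p)); f = 1 is the Shannon limit, larger values mean more leakage."""
    return disclosed_bits / (n * h(p))

# Hypothetical example: 100,000-bit sequences, 2% discrepancy rate, 15,500 bits disclosed.
n, p, m = 100_000, 0.02, 15_500
print(f"h(p) = {h(p):.4f} bits, efficiency f = {reconciliation_efficiency(m, n, p):.3f}")
```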