976 results for Secure Multi-Party Computation


Relevance:

100.00%

Publisher:

Abstract:

From the Introduction. In 2012, China approached the countries of Central-Eastern Europe (CEE) with a proposal for regional cooperation in the ‘16+1’ formula. According to Chinese analysts, the rationale behind this breakthrough decision was Beijing’s acknowledgment of the growing importance of the region’s states within the European Union, as well as a partial elimination of the ideological differences that had hamstrung cooperation in previous years. The eurozone crisis, in turn, may be seen as the reason for the CEE states’ increased interest in developing their cooperation with China. These circumstances opened a ‘window of opportunity’, which Beijing decided to exploit to create a kind of bridgehead in the region that it could later use in its further economic expansion in Europe. Apart from opening the CEE region up to investments, the ‘16+1’ format was intended to facilitate the shaping of relations between China and the EU and to become a tool for building a positive image of China. Chinese experts agree that after three years of functioning, the ‘16+1’ regional cooperation format has helped Beijing achieve its goals only to a limited extent. The major obstacles have included the immense diversity of the region, barriers related to EU law, insufficient expertise on the part of Chinese companies, the asymmetry of economic needs on the two sides, and a lack of willingness within the region itself to develop cooperation. Regardless of the limited effectiveness of its activities so far, China has continued the ‘16+1’ initiative. This continuation, and the progressive institutionalisation of cooperation in the ‘16+1’ format, have often seemed superficial. China has been using this multi-party formula to improve its long-term bilateral relations with selected states in the region and thereby to create a basis for Beijing’s political and economic presence in Central-Eastern Europe.

Relevance:

100.00%

Publisher:

Abstract:

After the electoral reform in 1994, Japan saw a gradual evolution from a multi-party system toward a two-party system over the course of five House of Representatives election cycles. In contrast, after Taiwan’s constitutional amendment in 2005, a two-party system emerged in the first post-reform legislative election in 2008. Critically, however, Taiwan’s president is directly elected while Japan’s prime minister is indirectly elected. The contributors conclude that the higher the payoffs of holding the executive office and the greater the degree of cross-district coordination required to win it, the stronger the incentives for elites to form and stay in the major parties. In such a context, a country will move rapidly toward a two-party system. In Part II, the contributors apply this theoretical logic to other countries with mixed-member systems to demonstrate its generality. They find the effect of executive competition on legislative electoral rules in countries as disparate as Thailand, the Philippines, New Zealand, Bolivia, and Russia. The findings presented in this book have important implications for political reform. Often, reformers are motivated by high hopes of solving some political problems and enhancing the quality of democracy. But, as this group of scholars demonstrates, electoral reform alone is not a panacea. Whether and to what extent it achieves the advocated goals depends not only on the specification of the new electoral rules per se but also on the political context, especially the constitutional framework, within which such rules are embedded.

Relevance:

100.00%

Publisher:

Abstract:

Distributed and/or composite web applications are driven by intercommunication via web services, which employ application-level protocols such as SOAP. These protocols, however, usually rely on classic HTTP for transport. HTTP is efficient at what it was designed for, delivering web page content, but it was never intended to carry complex web-service-oriented communication. Modern protocols exist today that are a much better fit for the job. One such candidate is XMPP: an XML-based, asynchronous, open protocol with built-in security and authentication mechanisms that utilizes a network of federated servers. Sophisticated asynchronous multi-party communication patterns can be established, effectively aiding web service developers. This paper’s purpose is to show, through facts, comparisons, and practical examples, that XMPP is not only better suited than HTTP to serve as middleware for web service protocols, but can also contribute to the overall state of web service development.
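A minimal sketch of the idea, assuming the slixmpp Python library; the JIDs and the SOAP payload are hypothetical. Instead of POSTing the envelope over HTTP, the client sends it as the body of an asynchronous XMPP message and handles the reply whenever it arrives:

import slixmpp

# A hypothetical SOAP request; in an HTTP binding this would be a POST body.
SOAP_ENVELOPE = (
    '<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope">'
    '<soap:Body><getQuote xmlns="urn:example">MSFT</getQuote></soap:Body>'
    '</soap:Envelope>'
)

class SoapOverXmpp(slixmpp.ClientXMPP):
    def __init__(self, jid, password, service_jid):
        super().__init__(jid, password)
        self.service_jid = service_jid
        self.add_event_handler("session_start", self.start)
        self.add_event_handler("message", self.on_reply)

    async def start(self, event):
        self.send_presence()
        await self.get_roster()
        # One asynchronous request over the already-open XMPP stream;
        # no per-call HTTP connection setup.
        self.send_message(mto=self.service_jid, mbody=SOAP_ENVELOPE, mtype="chat")

    def on_reply(self, msg):
        print("SOAP response:", msg["body"])
        self.disconnect()

client = SoapOverXmpp("client@example.org", "secret", "soap-service@example.org")
client.connect()
client.process(forever=False)

Because the XMPP stream is persistent and messages are asynchronous, the same pattern extends naturally to the multi-party communication patterns mentioned above.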

Relevance:

100.00%

Publisher:

Abstract:

Nearly one million Asian Indians have immigrated to the United States, where they are scattered across wide geographic areas. While some have chosen transnationalism, most are taking the traditional route of building ethnic communities. Using what the West has to offer in communication and transportation technologies, they are constructing communities without geographic boundaries, through extensive use of automobiles and air transportation, cellular phones, fax machines, commercial delivery services, and multi-party conference calls. They are also incorporating the American tradition of immigrant associations into an Asian Indian syncretism of community. In the process, a distinctly Asian Indian identity and concept of kinship and community emerge.

Relevance:

100.00%

Publisher:

Abstract:

Today, the development of domain-specific communication applications is both time-consuming and error-prone, because the low-level communication services provided by existing systems and networks are primitive and often heterogeneous. Multimedia communication applications are typically built on top of low-level network abstractions such as TCP/UDP sockets and the SIP (Session Initiation Protocol) and RTP (Real-time Transport Protocol) APIs. The User-centric Communication Middleware (UCM) is proposed to encapsulate the networking complexity and heterogeneity of basic multimedia and multi-party communication for upper-layer communication applications. UCM provides a unified user-centric communication service to diverse communication applications, ranging from simple phone calls and video conferencing to specialized applications such as disaster management and telemedicine, which simplifies the development of domain-specific communication applications. The UCM abstraction and API are proposed to achieve these goals. The dissertation also integrates formal methods into the UCM development process. A formal model of UCM is created using the SAM methodology; several design errors were found during model creation, because the formal method forces a precise description of UCM. Using the SAM tool, the formal UCM model is translated into a Promela model. In the dissertation, system properties are defined as temporal logic formulas, which are manually translated into Promela, integrated individually with the Promela model of UCM, and verified using the SPIN tool. This formal analysis helps verify system properties (for example, of the multi-party multimedia protocol) and uncover system bugs.
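For illustration, a liveness property of the kind SPIN can verify, stated in LaTeX as a temporal logic formula (the proposition names are hypothetical, not taken from the dissertation): every invitation to a multi-party session is eventually answered or times out,

\square \big( \mathit{inviteSent} \rightarrow \lozenge (\mathit{inviteAccepted} \lor \mathit{inviteTimeout}) \big)

which in SPIN's LTL syntax reads [] (inviteSent -> <> (inviteAccepted || inviteTimeout)). SPIN checks such a formula by searching the product of the Promela model with an automaton for the formula's negation.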

Relevance:

100.00%

Publisher:

Abstract:

The present thesis explores how interaction is initiated in multi-party meetings in Adobe Connect 7.0, with a particular focus on how co-presence and mutual availability are established in the preambles of 18 meetings held in Spanish without a moderator. Taking Conversation Analysis (CA) as its methodological point of departure, the thesis comprises four studies, each analyzing a particular phenomenon within the interaction of the preambles in a multimodal environment that allows simultaneous interaction through video, voice, and text chat. The first study (Artículo I) shows how participants jointly solve the issue of availability in a technological environment where being online is not necessarily understood as being available for communication. The second study (Artículo II) focuses on the beginning of the audiovisual interaction, in particular on how participants check that the audiovisual mode is functioning correctly. The third study (Artículo III) explores silences within the interaction of the preamble, showing that the length of gaps and lapses is a significant aspect of the preambles and how it is connected to the issue of availability. Finally, the fourth study introduces the notion of modal alignment, an interactional phenomenon that systematically appears at the beginnings of the encounters and seems to be used and understood as a strategy for establishing mutual availability and negotiating the participation framework. As a whole, this research shows how participants, in order to establish mutual co-presence and availability, adapt to a particular technology in terms of participation management, deploying strategies and successive actions which, as is the case with the activation of their respective webcams, seem to be understood as predictable within the intricate process of establishing mutual availability before the meeting starts.

Relevance:

100.00%

Publisher:

Abstract:

Electoral researchers are so accustomed to analyzing the choice of the single most preferred party as the left-hand-side variable of their models of electoral behavior that they often ignore richer revealed preference data. Drawing on random utility theory, their models predict electoral behavior at the extensive margin of choice. Since the seminal work of Luce and others on individual choice behavior, however, many social science disciplines (consumer research, labor market research, travel demand, etc.) have extended their inventory of observed preference data with, for instance, multiple paired comparisons, complete or incomplete rankings, and multiple ratings. Eliciting voter preferences with these procedures and applying appropriate choice models is known to considerably increase the efficiency of estimates of causal factors in models of electoral behavior. In this paper, we demonstrate the efficiency gain from adding preference information beyond first preferences, up to full ranking data. We do so for multi-party systems of different sizes, using simulation studies as well as empirical data from the 1972 German election study. Comparing the practical considerations of using ranking versus single-preference data yields suggestions for the choice of measurement instruments in different multi-candidate and multi-party settings.
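Full rankings of this kind are commonly modeled (not necessarily the paper's exact specification) with the rank-ordered, or "exploded", logit implied by Luce's choice axiom: a ranking is treated as a sequence of first choices from a shrinking choice set. With systematic utilities V_j for the J parties and an observed ranking r_1, ..., r_J, in LaTeX notation:

P(r_1, \dots, r_J) \;=\; \prod_{j=1}^{J-1} \frac{\exp(V_{r_j})}{\sum_{k=j}^{J} \exp(V_{r_k})}

Each additional rank contributes a further pseudo-observation to the likelihood, which is the source of the efficiency gain over first preferences alone.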

Relevance:

40.00%

Publisher:

Abstract:

A new high-performance architecture for computing all the DCT operations adopted in the H.264/AVC and HEVC standards is proposed in this paper. In contrast to other dedicated transform cores, the presented multi-standard transform architecture is built on a completely configurable, scalable, and unified structure, able to compute not only the forward and inverse 8×8 and 4×4 integer DCTs and the 4×4 and 2×2 Hadamard transforms defined in the H.264/AVC standard, but also the 4×4, 8×8, 16×16, and 32×32 integer transforms adopted in HEVC. Experimental results obtained using a Xilinx Virtex-7 FPGA demonstrate the superior performance and hardware efficiency of the proposed structure, which outperforms its most prominent related designs by at least 1.8 times. When integrated in a multi-core embedded system, the architecture allows real-time computation of all the transforms mentioned above for resolutions as high as 8k Ultra High Definition Television (UHDTV) (7680×4320 @ 30 fps).
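For reference, a minimal NumPy sketch of the smallest of these kernels, the H.264/AVC 4×4 forward integer core transform Y = C·X·Cᵀ (the per-coefficient scaling that H.264 folds into quantization is omitted):

import numpy as np

# The 4x4 integer core transform matrix defined by H.264/AVC.
C = np.array([[1,  1,  1,  1],
              [2,  1, -1, -2],
              [1, -1, -1,  1],
              [1, -2,  2, -1]])

def forward_dct4x4(block):
    # Y = C @ X @ C.T, exact integer arithmetic: no floating point,
    # which keeps hardware implementations small and cheap.
    return C @ block @ C.T

X = np.arange(16).reshape(4, 4)   # a toy 4x4 residual block
print(forward_dct4x4(X))

The larger H.264/HEVC kernels follow the same matrix-product structure with bigger constant matrices, which is what a unified, scalable datapath can share across standards.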

Relevance:

40.00%

Publisher:

Abstract:

Earthworks tasks aim at levelling the ground surface of a target construction area and precede any kind of structural construction (e.g., road and railway construction). They comprise sequential tasks, such as excavation, transportation, spreading, and compaction, and rely heavily on mechanical equipment and repetitive processes. In this context, it is essential to optimize the usage of all available resources under two key criteria: the cost and duration of earthwork projects. In this paper, we present an integrated system that uses two artificial intelligence techniques: data mining and evolutionary multi-objective optimization. The former is used to build data-driven models capable of providing realistic estimates of resource productivity, while the latter is used to optimize resource allocation with respect to the two main earthwork objectives (duration and cost). Experiments using real-world data from a construction site show that the proposed system is competitive with current manual earthwork design.
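A minimal sketch of the bi-objective selection step such a system performs, assuming (hypothetically) that each candidate resource allocation has already been scored with an estimated (duration, cost) pair by the data-driven productivity models:

def dominates(a, b):
    # a dominates b if it is no worse on both objectives (minimized:
    # duration, cost) and strictly better on at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    # Keep only the allocations not dominated by any other candidate.
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

# Hypothetical (duration in days, cost in k-EUR) per candidate allocation.
candidates = [(120, 950), (100, 1100), (100, 990), (140, 900), (95, 1300)]
print(pareto_front(candidates))   # the duration/cost trade-off set

An evolutionary multi-objective optimizer such as NSGA-II iterates this dominance test while varying the allocations, converging on the trade-off front offered to the planner.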

Relevance:

40.00%

Publisher:

Abstract:

Master's Dissertation in Informatics Engineering

Relevance:

40.00%

Publisher:

Abstract:

Error-correcting codes and matroids have been widely used in the study of ordinary secret sharing schemes. In this paper, the connections between codes, matroids, and a special class of secret sharing schemes, namely multiplicative linear secret sharing schemes (LSSSs), are studied. Such schemes are known to enable multiparty computation protocols secure against general (nonthreshold) adversaries. Two open problems related to the complexity of multiplicative LSSSs are considered. The first concerns strongly multiplicative LSSSs. As opposed to the multiplicative case, it is not known whether there is an efficient method to transform an LSSS into a strongly multiplicative LSSS for the same access structure with only a polynomial increase in complexity. A property of strongly multiplicative LSSSs that could be useful in solving this problem is proved: using a suitable generalization of the well-known Berlekamp–Welch decoder, it is shown that all strongly multiplicative LSSSs enable efficient reconstruction of a shared secret in the presence of malicious faults. The second problem is to characterize the access structures of ideal multiplicative LSSSs; specifically, to determine whether all self-dual vector space access structures are of this kind. By the aforementioned connection, this is in fact an open problem in matroid theory, since it can be restated in terms of the representability of identically self-dual matroids by self-dual codes. A new concept, the flat-partition, is introduced, which provides a useful classification of identically self-dual matroids. Uniform identically self-dual matroids, which are known to be representable by self-dual codes, form one of the classes. It is proved that this property also holds for the family of matroids that, in a natural way, forms the next class in this classification: the identically self-dual bipartite matroids.
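As a concrete toy example of the multiplicative property, here is Shamir's scheme (the best-known multiplicative LSSS) over a prime field in Python: pairwise products of shares are shares of the product secret on a polynomial of degree 2t, so any 2t+1 parties can reconstruct it by Lagrange interpolation. This is a sketch of the general notion, not of the constructions studied in the paper:

import random

P = 2**61 - 1                    # a Mersenne prime field
T, N = 1, 5                      # threshold t and party count (N >= 2t+1)

def share(secret, deg):
    # Random polynomial of degree `deg` with constant term `secret`;
    # party i holds the evaluation at x = i.
    coeffs = [secret] + [random.randrange(P) for _ in range(deg)]
    return [sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
            for x in range(1, N + 1)]

def reconstruct(shares, xs):
    # Lagrange interpolation at x = 0 recovers the constant term.
    total = 0
    for xi, yi in zip(xs, shares):
        num = den = 1
        for xj in xs:
            if xj != xi:
                num = num * -xj % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

a, b = 1234, 5678
# Each party multiplies its two shares locally: shares of a*b on a
# degree-2t polynomial.
prod_shares = [x * y % P for x, y in zip(share(a, T), share(b, T))]
xs = list(range(1, N + 1))       # any 2t+1 = 3 of these would suffice
assert reconstruct(prod_shares, xs) == a * b % P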

Relevance:

40.00%

Publisher:

Abstract:

The security of the two-party Diffie-Hellman key exchange protocol is currently based on the discrete logarithm problem (DLP). However, it can also be built upon the elliptic curve discrete logarithm problem (ECDLP). Most proposed secure group communication schemes employ the DLP-based Diffie-Hellman protocol. This paper proposes ECDLP-based Diffie-Hellman protocols for secure group communication and evaluates their performance on wireless ad hoc networks. The proposed schemes are compared, at the same security level, with DLP-based group protocols under different channel conditions. Our experiments and analysis show that the Tree-based Group Elliptic Curve Diffie-Hellman (TGECDH) protocol has the best overall performance for secure group communication among the four schemes discussed in the paper. Low communication overhead, relatively low computation load, and short packets are the main reasons for its good performance.
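A minimal two-party exchange with the elliptic-curve primitive, sketched with X25519 from the Python cryptography package; tree-based schemes such as TGECDH compose exactly this step up a key tree (the tree part below is an assumption mirroring TGDH-style constructions, not the paper's protocol):

from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Each party generates an ephemeral elliptic-curve key pair.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# They exchange public keys and derive the same 32-byte shared secret.
k_ab = alice.exchange(bob.public_key())
assert k_ab == bob.exchange(alice.public_key())

# Tree-based sketch (assumed): the pairwise secret seeds the private key
# of the parent node, and the process repeats up to the root group key.
parent = X25519PrivateKey.from_private_bytes(k_ab)
carol = X25519PrivateKey.generate()
group_key = parent.exchange(carol.public_key())

The much shorter keys at an equivalent security level are what produce the low communication overhead and short packets noted above.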

Relevance:

40.00%

Publisher:

Abstract:

The responses of carbon dioxide (CO2) and other climate variables to an emission pulse of CO2 into the atmosphere are often used to compute the Global Warming Potential (GWP) and Global Temperature change Potential (GTP), to characterize the response timescales of Earth System models, and to build reduced-form models. In this carbon cycle-climate model intercomparison project, which spans the full model hierarchy, we quantify responses to emission pulses of different magnitudes injected under different conditions. The CO2 response shows the known rapid decline in the first few decades followed by a millennium-scale tail. For a 100 Gt-C emission pulse added to a constant CO2 concentration of 389 ppm, 25 ± 9% is still found in the atmosphere after 1000 yr; the ocean has absorbed 59 ± 12% and the land the remainder (16 ± 14%). The response in global mean surface air temperature is an increase of 0.20 ± 0.12 °C within the first twenty years; thereafter and until year 1000, temperature decreases only slightly, whereas ocean heat content and sea level continue to rise. Our best estimate for the Absolute Global Warming Potential (AGWP), given by the time-integrated response in CO2 at year 100 multiplied by its radiative efficiency, is 92.5 × 10⁻¹⁵ yr W m⁻² per kg-CO2. This value very likely (5 to 95% confidence) lies within the range of (68 to 117) × 10⁻¹⁵ yr W m⁻² per kg-CO2. Estimates of the time-integrated response in CO2 published in the IPCC First, Second, and Fourth Assessment Reports and our multi-model best estimate all agree within 15% during the first 100 yr. The integrated CO2 response, normalized by the pulse size, is lower for pre-industrial conditions than for present day, and lower for smaller pulses than for larger ones. In contrast, the responses in temperature, sea level, and ocean heat content are less sensitive to these choices. Although choices of pulse size, background concentration, and model lead to uncertainties, the most important and subjective choice in determining the AGWP of CO2, and hence the GWP, is the time horizon.
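In LaTeX notation, the quantity being estimated is the time-integrated radiative forcing of a unit emission, with RE the (constant) radiative efficiency and IRF_{CO2} the impulse response function of atmospheric CO2:

\mathrm{AGWP}_{\mathrm{CO_2}}(H) = \mathrm{RE} \int_0^H \mathrm{IRF}_{\mathrm{CO_2}}(t) \, \mathrm{d}t

evaluated here at the time horizon H = 100 yr; the GWP of another gas is then its AGWP normalized by that of CO2.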

Relevance:

40.00%

Publisher:

Abstract:

Tool path generation is one of the most complex problems in Computer Aided Manufacturing. Although some efficient strategies have been developed, most of them are only useful for standard machining. However, the algorithms used for tool path computation demand high computational performance, which makes their implementation on many existing systems very slow or even impractical. Hardware acceleration is an incremental solution that can be cleanly added to these systems while keeping everything else intact: it is completely transparent to the user, its cost is much lower, and its development time is much shorter than replacing the computers with faster ones. This paper presents an optimisation that exploits the power of multi-core Graphics Processing Units (GPUs) to improve tool path computation. The improvement is applied to a highly accurate and robust tool path generation algorithm. As a case study, the paper presents a fully implemented algorithm used for turning-lathe machining of shoe lasts. A comparative study shows the gain achieved in total computing time: execution is almost two orders of magnitude faster than on modern PCs.
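A sketch of the kind of data-parallel kernel that benefits from a GPU here, assuming (hypothetically) that for each tool position the path must clear the highest surface point under the tool's footprint; written with NumPy, it runs unchanged on a GPU by importing CuPy in its place:

import numpy as np   # swap for `import cupy as np` to execute on the GPU

def tool_clearance(surface_z, footprint):
    # surface_z: heights sampled along the machining direction of the last.
    # For every tool position, take the maximum height under the tool's
    # footprint: a sliding-window maximum, expressed as one vectorized
    # reduction instead of a per-position Python loop.
    n = surface_z.size - footprint + 1
    idx = np.arange(footprint)[None, :] + np.arange(n)[:, None]
    return surface_z[idx].max(axis=1)

surface = 5.0 * np.sin(np.linspace(0.0, 6.28, 10_000))   # toy profile
path = tool_clearance(surface, footprint=32)
print(path.shape, float(path.max()))

Each of the thousands of positions is independent of the others, so the reduction maps directly onto the GPU's cores; that independence is what makes speedups of this scale plausible.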

Relevance:

40.00%

Publisher:

Abstract:

Today's wireless networks rely mostly on infrastructural support for their operation. As the concept of ubiquitous computing grows more popular, research on infrastructureless networks has been growing rapidly. However, such networks face serious security challenges when deployed. This dissertation focuses on designing a secure routing solution and trust modeling for these infrastructureless networks. The dissertation presents a trusted routing protocol that is capable of finding a secure end-to-end route in the presence of malicious nodes acting either independently or in collusion. The solution protects the network from active internal attacks, known to be the most severe type of attack in ad hoc applications. Route discovery is based on the trust levels of the nodes, which need to be computed dynamically to reflect malicious behavior in the network. Accordingly, we have developed a trust computational model, in conjunction with the secure routing protocol, that analyzes different kinds of malicious behavior and quantifies them within the model itself. Our work is a first step toward protecting an ad hoc network from colluding internal attacks. To demonstrate the feasibility of the approach, extensive simulation has been carried out to evaluate the protocol's efficiency and scalability with both network size and mobility. This research lays the foundation for developing techniques that will permit people to justifiably trust ad hoc networks to perform critical functions and to process sensitive information without depending on any infrastructural support, and it will hence enhance the use of ad hoc applications in both military and civilian domains.
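A sketch of the route-selection idea, assuming (hypothetically) that each node carries a trust level in (0, 1], that a route's trust is the product of its nodes' trust values, and that discovery prefers the most trusted route; none of this is the dissertation's exact model:

import heapq, math

def most_trusted_route(graph, trust, src, dst):
    # Dijkstra on -log(trust): maximizing a product of per-node trust
    # values equals minimizing the sum of their negative logarithms.
    best = {src: 0.0}
    heap = [(0.0, src, [src])]
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == dst:
            return path, math.exp(-cost)        # route and its trust
        for nxt in graph[node]:
            c = cost - math.log(trust[nxt])     # distrust penalty
            if c < best.get(nxt, float("inf")):
                best[nxt] = c
                heapq.heappush(heap, (c, nxt, path + [nxt]))
    return None, 0.0

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
trust = {"A": 1.0, "B": 0.9, "C": 0.4, "D": 0.8}  # updated dynamically
print(most_trusted_route(graph, trust, "A", "D")) # ['A', 'B', 'D'], ~0.72

In the dissertation's setting, the trust values themselves are what the computational model maintains and updates in response to observed malicious behavior.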