922 results for Key Agreement, Password Authentication, Three-party
Abstract:
Russian e-commerce has recently drawn considerable attention in academic publications as well as in the public media. The market is growing at a rapid pace, offering companies enormous business opportunities. However, the combination of Russian culture and consumers' online purchasing and decision-making processes remains a largely unexplored subject. From an international online business perspective, it is vital for companies to know how culture affects consumers' purchase decisions and which key elements need to be modified in order to expand online operations successfully into the Russian market. The main purpose of this study is to define the key factors affecting Russian consumers' online purchase intention. In order to answer the main research question, the role of culture in the context of purchase intention is researched first. Secondly, the focus is drawn to the factors that affect online purchase intention. Lastly, the study examines how Russian culture affects the modification of e-commerce attributes. The objective is not only to expand theoretical understanding of the subject but also to provide companies with a clear vision of how online operations should be conducted in the Russian online market. Ranganathan and Jha's conceptual framework was chosen as the grounding theory of this study; its three main categories, Computer Self-Efficacy and Past Online Experience, Website Quality and Customer Concerns, form the basis for the study. Various articles and academic literature supplemented this theoretical approach. A qualitative research method was adopted and the study was conducted through five expert interviews. In conclusion, it can be stated that culture forms the ground for the entire purchase decision-making process in the online context. Results from the interviews were grouped according to the three main theoretical categories and placed within Ranganathan and Jha's original framework.
This formed a new theoretical framework that defines the diverse factors affecting specifically Russian consumers' online purchase intention. This study suggests that the following factors need to be taken into serious consideration in the Russian online context: photography style, detailed product and company information, colors, language, product variety, reviews, recommendations, strong social media presence, fast checkout and minimalistic order information, fear of counterfeits, cash-on-delivery payment, training and guidance, extensive customer service, and consumers' insecurity, inexperience, high interest in technology and individualistic personality.
Abstract:
Observations of the Caspian Sea during August–September 1995 are used to develop a three-dimensional numerical model for calculating temperature and currents. This period was chosen because of its extensive set of observational data, including surface temperature observations. Data from the meteorological buoy network on the Caspian Sea are combined with routine observations at first-order synoptic stations around the lake to obtain hourly values of the wind stress and pressure fields. The initial temperature distribution as a function of depth and horizontal coordinates is derived from ship cruises. The model has variable grid resolution and horizontal smoothing, which filters out small-scale vertical motion. The hydrodynamic model of the Caspian Sea has 6 vertical levels and a uniform horizontal grid size of 50 km. The model is driven with surface fluxes of heat and momentum derived from the observed meteorological data. The model was able to reproduce all of the basic features of the thermal structure of the Caspian Sea: larger-scale circulation patterns tend to be cyclonic, with cyclonic circulation within each sub-basin. The results agree with the observations.
Abstract:
Although large-scale public hypermedia structures such as the World Wide Web are popularly referred to as "cyberspace", the extent to which they constitute a space in the everyday sense of the word is questionable. This paper reviews recent work in the area of three dimensional (3D) visualization of the Web that has attempted to depict it in the form of a recognizable space; in other words, as a navigable landscape that may be visibly populated by its users. Our review begins by introducing a range of visualizations that address different aspects of using the Web. These include visualizations of Web structure, especially of links, that act as 3D maps; browsing history; searches; evolution of the Web; and the presence and activities of multiple users. We then summarize the different techniques that are employed by these visualizations. We conclude with a discussion of key challenges for the future.
Abstract:
The aim of this study was to assess the relative contributions of natural productivity and compound feed to the growth of the juvenile blue shrimp Litopenaeus stylirostris reared in a biofloc system. Two experiments were carried out based on the same protocol with three treatments: clear water with experimental diet (CW), biofloc with experimental diet (BF) and biofloc unfed (BU). Shrimp survival was significantly higher in biofloc rearing than in CW rearing. The contribution of the biofloc to the shrimp diet was estimated through measurement of carbon and nitrogen stable isotope ratios in the shrimp and their food sources. Different isotopic compositions between feeds were obtained by feeding the natural productivity with a mixture rich in fish meal and the shrimps with a pellet containing a high level of soy protein concentrate. Using a two-source, one-isotope mixing model, we found that the natural productivity of the biofloc system contributed to shrimp growth at levels of 39.8% and 36.9% for C and N, respectively. The natural food consumed by the shrimps reared in the biofloc system resulted in higher gene expression (mRNA transcript abundance) and activities of two digestive enzymes in their digestive gland: α-amylase and trypsin. The growth of shrimp biomass reared in biofloc was, on average, 4.4 times that of shrimp grown in clear water. Our results confirmed the better survival and enhanced growth of shrimps obtained using biofloc technology and highlighted the key role of the biofloc in the nutrition of reared shrimps. Statement of relevance: In this study, we applied an original protocol to determine the respective contributions of natural productivity and artificial feeds to the diet of the juvenile blue shrimp L. stylirostris reared in a biofloc system, using C and N natural stable isotope analysis.
Moreover, we compared, in the shrimp digestive gland, the α-amylase and trypsin enzyme activities at the biochemical and molecular levels for two different shrimp rearing systems, biofloc and clear water. To our knowledge, molecular tools have never before been used to study the influence of biofloc consumption on the digestive processes of shrimp. We believe that our research is novel and important for increasing knowledge on the biofloc topic.
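The two-source, one-isotope mixing model mentioned above is a standard linear mixing calculation; a minimal sketch follows, with invented isotope values for illustration (not the study's measurements):

```python
def two_source_mixing(delta_consumer, delta_source_a, delta_source_b):
    """Fraction of source A in the consumer's diet from one isotope ratio.

    Standard two-source linear mixing model:
        f_A = (d_consumer - d_B) / (d_A - d_B)
    """
    return (delta_consumer - delta_source_b) / (delta_source_a - delta_source_b)

# Hypothetical delta-13C values (per mil): shrimp tissue, biofloc, pellet.
f_biofloc = two_source_mixing(-18.0, -16.0, -21.0)
print(round(f_biofloc, 2))  # → 0.6, i.e. 60% attributable to source A
```

The same calculation is repeated independently for each isotope (C and N), which is how the study can report separate contribution estimates for the two elements.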
Abstract:
Changes in the circumstances of the Australian pineapple industry left growers with a leadership vacuum, limited technical support and no funds for conducting research and marketing. Inspirational leadership training together with regular district farm meetings were used to assist the Australian pineapple industry to successfully adapt to these challenges. All growers were assigned to one of a number of regional grower study groups and regular on-farm meetings commenced to facilitate communication between growers, transfer of technology, awareness of industry affairs and an opportunity to become involved in industry business. A leader was appointed within each study group and these leaders attended a leadership course consisting of three, three-day modules. These original course graduates formed the nucleus of a new grower representative group which subsequently instigated levies to fund research and marketing. Two more courses have since been conducted to provide the depth of leadership to satisfy the growers' desire to rotate industry leadership on a regular basis.
Abstract:
Apparitions of empire and imperial ideologies were deeply embedded in the International Exhibition, a distinct exhibitionary paradigm that came to prominence in the mid-nineteenth century. Exhibitions were platforms for the display of objects, the movement of people, and the dissemination of ideas across and between regions of the British Empire, thereby facilitating contact between its different cultures and societies. This thesis aims to disrupt a dominant understanding of International Exhibitions, which forwards the notion that all exhibitions, irrespective of when or where they were staged, upheld a singular imperial discourse (i.e. Greenhalgh 1988, Rydell 1984). Rather, this thesis suggests International Exhibitions responded to and reflected the unique social, political and economic circumstances in which they took place, functioning as cultural environments in which pressing concerns of the day were worked through. Understood thus, the International Exhibition becomes a space for self-presentation, serving as a stage from which a multitude of interests and identities were constructed, performed and projected. This thesis looks to the visual and material culture of the International Exhibition in order to uncover this more nuanced history, and foregrounds an analysis of the intersections between practices of exhibition-making and identity-making. The primary focus is a set of exhibitions held in Glasgow in the late-1880s and early-1900s, which extends the geographic and temporal boundaries of the existing scholarship. What is more, it looks at representations of Canada at these events, another party whose involvement in the International Exhibition tradition has gone largely unnoticed. Consequently, this thesis is a thematic investigation of the links between a municipality routinely deemed the ‘Second City of the Empire’ and a Dominion settler colony, two types of geographic setting rarely brought into dialogue. 
It analyses three key elements of the exhibition-making process, exploring how iconographies of ‘quasi-nationhood’ were expressed through an exhibition’s planning and negotiation, its architecture and its displays. This original research framework deliberately cuts across strata that continue to define conceptions of the British Empire, and pushes beyond a conceptual model defined by metropole and colony. Through examining International Exhibitions held in Glasgow in the late-Victorian and Edwardian periods, and visions of Canada in evidence at these events, the goal is to offer a novel intervention into the existing literature concerning the cultural history of empire, one that emphasises fluidity rather than fixity and which muddles the boundaries between centre and periphery.
Abstract:
Observational studies in the field of sport are complicated by the added difficulty of having to analyse multiple, complex events or behaviours that may last just a fraction of a second. In this study, we analyse three aspects related to the reliability of data collected in such a study. The first aim was to analyse and compare the reliability of data sets assessed quantitatively (calculation of kappa statistic) and qualitatively (consensus agreement method). The second aim was to describe how, by ensuring the alignment of events, we calculated the kappa statistic for the order parameter using SDIS-GSEQ software (version 5.1) for data sets containing different numbers of sequences. The third objective was to describe a new consultative procedure designed to remove the confusion generated by discordant data sets and improve the reliability of the data. The procedure is called "consultative" because it involves the participation of a new observer who is responsible for consulting the existing observations and deciding on the definitive result.
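The kappa statistic referred to above is Cohen's kappa for agreement between two observers coding the same events, correcting raw agreement for agreement expected by chance. A minimal sketch of the calculation (the event codes are invented for illustration):

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two observers coding the same event sequence."""
    assert len(coder1) == len(coder2)
    n = len(coder1)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Chance agreement from each coder's marginal code frequencies.
    c1, c2 = Counter(coder1), Counter(coder2)
    p_e = sum(c1[code] * c2[code] for code in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

a = ["pass", "shot", "pass", "foul", "pass", "shot"]
b = ["pass", "shot", "pass", "shot", "pass", "shot"]
print(round(cohens_kappa(a, b), 3))  # → 0.714
```

Note this assumes the two records are already aligned event for event; as the abstract points out, establishing that alignment is itself a nontrivial step when sequences differ in length.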
Abstract:
When multiple third-parties (states, coalitions, and international organizations) intervene in the same conflict, do their efforts inform one another? Anecdotal evidence suggests such a possibility, but research to date has not attempted to model this interdependence directly. The current project breaks with that tradition. In particular, it proposes three competing explanations of how previous intervention efforts affect current intervention decisions: a cost model (and a variant on it, a limited commitments model), a learning model, and a random model. After using a series of Markov transition (regime-switching) models to evaluate conflict management behavior within militarized interstate disputes in the 1946-2001 period, this study concludes that third-party intervention efforts inform one another. More specifically, third-parties examine previous efforts and balance their desire to manage conflict with their need to minimize intervention costs (the cost and limited commitments models). As a result, third-parties intervene regularly using verbal pleas and mediation, but rely significantly less frequently on legal, administrative, or peace operations strategies. This empirical threshold to the intervention costs that third-parties are willing to bear has strong theoretical foundations and holds across different time periods and third-party actors. Furthermore, the analysis indicates that the first third-party to intervene in a conflict is most likely to use a strategy designed to help the disputants work toward a resolution of their dispute. After this initial intervention, the level of third-party involvement declines and often devolves into a series of verbal pleas for peace. Such findings cumulatively suggest that disputants hold the key to effective conflict management. 
If the disputants adopt and maintain an extreme bargaining position or fail to encourage third-parties to accept greater intervention costs, their dispute will receive little more than verbal pleas for negotiations and peace.
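The Markov transition (regime-switching) approach described above rests on transition probabilities between intervention strategies. A minimal sketch of estimating a first-order transition matrix from a coded sequence of third-party efforts (the sequence below is hypothetical, not the study's data):

```python
from collections import defaultdict

def transition_matrix(sequence):
    """Maximum-likelihood first-order Markov transition probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for prev, curr in zip(sequence, sequence[1:]):
        counts[prev][curr] += 1
    return {state: {nxt: c / sum(row.values()) for nxt, c in row.items()}
            for state, row in counts.items()}

# Hypothetical coded strategies over the course of one dispute:
efforts = ["verbal", "verbal", "mediation", "verbal", "verbal", "mediation"]
probs = transition_matrix(efforts)
print(probs["verbal"])  # → {'verbal': 0.5, 'mediation': 0.5}
```

In the study's setting, rows of such a matrix capture how the previous intervention effort conditions the next one, which is what distinguishes the cost, learning, and random models empirically.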
Abstract:
At the first full conference of the European Academy of Occupational Health Psychology (Lund, 1999), the decision was ratified to organise activities around three fora. These together represented the pillars on which the European Academy had been founded that same year: education, research and professional practice. Each forum was convened by a chair person and a small group of full members; it was agreed that a forum meeting would take place at each full conference and working groups would be established to move developments forward between conferences. The forum system has proven an effective means by which to channel the energies of individual members, and the institutions that they represent, towards advancements in all three areas of activity in occupational health psychology (OHP) in Europe. During the meeting of the education forum at the third full European Academy conference (Barcelona, 2001), the proposal was made for the establishment of a working party that would be tasked with the production of a strategy document on The Promotion of Education in Occupational Health Psychology in Europe. The proposal was ratified at the subsequent annual business meeting held during the same conference. The draft outline of the strategy document was published for consultation in the European Academy’s e-newsletter (Vol. 3.1, 2002) and the final document presented to the meeting of the education forum at the fourth full conference (Vienna, 2002). The strategy document constituted a seminal piece of literature in so far as it provided a foundation and structure capable of guiding pan-European developments in education in OHP – developments that would ensure the sustained growth of the discipline and assure it of a long-standing embedded place in both the scholarly and professional domains. To these ends, the strategy document presented six objectives as important for the sustained expansion and the promotion of education in the discipline in Europe. 
Namely, the development of: [1] a core syllabus for education in occupational health psychology; [2] a mechanism for identifying, recognising and listing undergraduate and postgraduate modules and courses (programmes) in occupational health psychology; [3] structures to support the extension of the current provision of education in occupational health psychology; [4] ways of enhancing convergence of the current provision of education in occupational health psychology; [5] ways of encouraging regional cooperation between education providers across the regions of Europe; and [6] ways of ensuring consistency with North American developments in education and promoting worldwide co-operation in education. Five years have elapsed since the presentation of these laudable objectives to the meeting of the education forum in Vienna in December 2002. In that time OHP has undergone considerable growth, particularly in Europe and North America. Expansion has been reflected in the evolution of existing, and the emergence of new, representative bodies for the discipline on both sides of the Atlantic Ocean. As such, it might be considered timely to pause and reflect on what has been achieved in respect of each of the objectives set out in the strategy document. The current chapter examines progress on the six objectives and considers what remains to be done. This exercise is entered into not merely to congratulate achievements in some areas and lament slow progress in others. Rather, on the one hand it serves to highlight areas where real progress has been made, with a view to presenting these areas as ripe for further capitalisation. On the other hand, it serves to direct the attention of stakeholders (all those with a vested interest in OHP) to those key parts of the jigsaw puzzle that is the development of a self-sustaining pan-European education framework which remain to be satisfactorily addressed.
Abstract:
Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data in such a way that the participants do not see each other's data; they see only the final output. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes repeatedly, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker -- the game will be divided into rounds of local decision-making (e.g. bidding) and joint interaction (e.g. dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language, while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications or lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run the MPC programs, leaving open the potential for security holes that can compromise the privacy of parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC Domain Specific Language, called Wysteria, for writing rich mixed-mode MPC applications.
Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification oriented programming language. Wys* improves on Wysteria along three lines: (a) It enables programmers to formally verify the correctness and security properties of their programs. As far as we know, Wys* is the first language to provide verification capabilities for MPC programs. (b) It provides a partially verified toolchain to run MPC programs, and finally (c) It enables the MPC programs to use, with no extra effort, standard language constructs from the host language F*, thereby making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs, while providing similar privacy guarantees as the monolithic versions.
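As a rough illustration of the "secret shares" abstraction mentioned above (this is plain additive secret sharing, sketched here for intuition, and not Wysteria's actual implementation or API), parties can jointly compute a sum so that only the final output is revealed:

```python
import secrets

P = 2**61 - 1  # prime modulus for arithmetic on shares

def share(x, n=3):
    """Split x into n additive shares mod P; any n-1 shares reveal nothing."""
    shares = [secrets.randbelow(P) for _ in range(n - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine all shares to recover the shared value."""
    return sum(shares) % P

# Two parties share their private inputs; each party adds the share it
# holds locally, and only the reconstructed sum is ever revealed:
a_shares, b_shares = share(25), share(17)
sum_shares = [(sa + sb) % P for sa, sb in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # → 42
```

The local share additions here correspond to the "secure mode" steps of a mixed-mode program; everything else (choosing inputs, deciding what to do with the output) runs in per-party local mode.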
Abstract:
International audience
Abstract:
Master's dissertation—Universidade de Brasília, Faculdade de Tecnologia, Departamento de Engenharia Elétrica, 2016.
Abstract:
Part 21: Mobility and Logistics
Abstract:
The inclusion of non-ipsative measures of party preference (in essence, ratings for each of the parties of a political system) has become established practice in mass surveys conducted for election studies. They exist in different forms, known as thermometer ratings or feeling scores, likes and dislikes scores, or support propensities. Usually only one of these is included in a single survey, which makes it difficult to assess the relative merits of each. The questionnaire of the Irish National Election Study 2002 (INES2002) contained three different batteries of non-ipsative party preferences. This paper investigates some of the properties of these different indicators. We focus in particular on two phenomena. First, the relationship between non-ipsative preferences and the choices actually made on the ballot. In Ireland this relationship is more revealing than in most other countries owing to the electoral system (STV), which allows voters to cast multiple ordered votes for candidates from different parties. Second, we investigate the latent structure of each of the batteries of party preferences and the relationships between them. We conclude that the three instruments are not interchangeable, that they measure different orientations, and that one, the propensity to vote for a party, is by far preferable if the purpose of the study is the explanation of voters' actual choice behaviour. This finding has important ramifications for the design of election study questionnaires.
Abstract:
Background: Although noninvasive transcutaneous carbon dioxide monitoring has been shown to be accurate in infants and children, limited data are available on the usefulness and limitations of transcutaneous partial carbon dioxide tension (PtCO2) values. Objectives: The current study prospectively determined the effectiveness and accuracy of PtCO2 measurements in newborns. Materials and Methods: Venous blood gas sampling and monitoring of the PtCO2 level (TCM TOSCA, Radiometer) were performed simultaneously. All measurements were performed on mechanically ventilated infants. Partial venous carbon dioxide tension (PvCO2) values were divided into three groups: hypocapnia (Group 1: < 4.68 kPa), normocapnia (Group 2: 4.68–7.33 kPa) and hypercapnia (Group 3: > 7.33 kPa), and the PvCO2 and PtCO2 data within each group were then compared separately. Results: A total of 168 paired PvCO2 and PtCO2 measurements were compared across the three groups (13 in Group 1, 118 in Group 2, and 37 in Group 3). A bias of more than ± 0.7 kPa was considered unacceptable. PtCO2 showed acceptable agreement with PvCO2 in the hypocapnia (mean difference 0.20 ± 0.19 kPa) and normocapnia (0.002 ± 0.30 kPa) groups. In the hypercapnia group, however, PtCO2 values were significantly lower than the PvCO2 values (mean difference 0.81 ± 1.19 kPa; P < 0.001). Conclusions: PtCO2 measurements show generally good agreement with PvCO2 in hypocapnic and normocapnic intubated infants, but there are limitations, especially at high levels of CO2 tension. Monitoring of PtCO2 is generally a useful non-invasive indicator of PvCO2 in hypocapnic and normocapnic infants.
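The agreement criterion used above amounts to a mean-difference (Bland–Altman style) comparison of paired measurements against a ± 0.7 kPa bias threshold; a minimal sketch with invented (PtCO2, PvCO2) pairs, not the study's data:

```python
def agreement(paired, limit=0.7):
    """Mean difference (bias) and fraction of pairs within ±limit kPa."""
    diffs = [t - v for t, v in paired]
    bias = sum(diffs) / len(diffs)
    within = sum(abs(d) <= limit for d in diffs) / len(diffs)
    return bias, within

# Hypothetical (PtCO2, PvCO2) pairs in kPa for illustration only:
pairs = [(5.0, 5.2), (6.1, 6.0), (8.9, 9.8), (5.5, 5.4)]
bias, within = agreement(pairs)
print(round(bias, 2), within)
```

Grouping the pairs by PvCO2 range before applying this check, as the study does, is what exposes the hypercapnia-specific underestimation that a pooled analysis would dilute.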