745 results for Secure protocol
Peer-reviewed
Abstract:
Mobile devices have become ubiquitous, allowing the integration of new information from a large range of devices. However, the development of new applications requires a powerful framework which simplifies their construction. JXME is the JXTA implementation for mobile devices using J2ME, its main value being its simplicity when creating peer-to-peer (P2P) applications on limited devices. In that regard, an issue that has become very important in recent times is being able to provide a security baseline to such applications. This paper analyzes the current state of security in JXME and proposes a simple security mechanism to protect JXME applications against a broad range of vulnerabilities.
Peer-reviewed
Abstract:
Broadcast transmission mode in ad hoc networks is critical to managing multihop routing or providing medium access control (MAC)-layer fairness. In this paper, it is shown that a higher capacity to exchange information among neighbors may be obtained through a physical-MAC cross-layer design of the broadcast protocol exploiting signal separation principles. Coherent detection and separation of contending nodes is possible through training sequences which are selected at random from a reduced set. Guidelines for the design of this set are derived for a low impact on the network performance and the receiver complexity.
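Since the training sequences are drawn at random from a reduced set, two contending nodes can pick the same sequence and become inseparable at the receiver. A minimal sketch of that trade-off, assuming uniform independent selection (the set sizes and contender counts below are illustrative, not taken from the paper):

```python
# Hedged sketch: probability that at least two of k contending nodes pick the
# same training sequence when each draws uniformly at random from a set of
# size n. Illustrative parameters only; not values from the paper.

def collision_probability(n: int, k: int) -> float:
    """Birthday-problem probability that at least two of k nodes
    choose the same sequence out of n available ones."""
    if k > n:
        return 1.0  # pigeonhole: a collision is certain
    p_distinct = 1.0
    for i in range(k):
        p_distinct *= (n - i) / n
    return 1.0 - p_distinct

# A larger set lowers the chance that coherent separation fails because
# two contenders trained on the same sequence.
for n in (8, 16, 32):
    print(n, round(collision_probability(n, k=3), 3))
```

The birthday-style computation makes the design tension concrete: a smaller set keeps receiver complexity low, but raises the probability that coherent separation of contending nodes fails.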
Abstract:
Russia is reforming its electricity market. The reform aims to liberalize the electricity market and increase competition in the energy sector. The purpose of liberalizing the electricity market is to raise the efficiency of the energy sector and attract investment to it. Russia has ratified the Kyoto Protocol, which is important for the energy sector because the protocol's Joint Implementation mechanism can be used to attract investment to the sector. The long-term goal of liberalizing the Russian electricity market is the integration of the Russian and European electricity markets, which also implies the harmonization of environmental legislation. This study is part of a technical review offered by Fortum Oyj to TGC-9, an electricity company operating in Russia. The work focuses on the acidifying air emissions and dust emissions of the power plants owned by TGC-9. The study also seeks opportunities to exploit the Kyoto Protocol's Joint Implementation mechanism. NOx emissions will be the greatest challenge for TGC-9 if environmental standards are harmonized. Four Joint Implementation opportunities were identified: utilization of coke-oven gas, replacement of natural gas with bark combustion, and two cases related to raising plant efficiency.
Abstract:
BACKGROUND: Selective publication of studies, commonly called publication bias, is widely recognized. Over the years a new nomenclature has been developed for other types of bias related to non-publication or to distortion in the dissemination of research findings. However, several of these different biases are often still summarized by the term 'publication bias'. METHODS/DESIGN: As part of the OPEN Project (To Overcome failure to Publish nEgative fiNdings) we will conduct a systematic review with the following objectives: to systematically review highly cited articles that focus on non-publication of studies and to present the various definitions of biases related to the dissemination of research findings contained in the articles identified; and to develop and discuss, in an international group of experts in the context of the OPEN Project, a new framework on the nomenclature of various aspects of distortion in the dissemination process that leads to public availability of research findings. We will systematically search Web of Knowledge for highly cited articles that provide a definition of biases related to the dissemination of research findings. A specifically designed data extraction form will be developed and pilot-tested. Working in teams of two, we will independently extract relevant information from each eligible article. For the development of a new framework we will construct an initial table listing different levels and different hazards en route to making research findings public. An international group of experts will iteratively review the table and reflect on its content until no new insights emerge and consensus has been reached. DISCUSSION: Results are expected to be publicly available in mid-2013. This systematic review, together with the results of other systematic reviews of the OPEN Project, will serve as a basis for the development of future policies and guidelines regarding the assessment and prevention of publication bias.
Abstract:
BACKGROUND: Classical disease phenotypes are mainly based on descriptions of symptoms and the hypothesis that a given pattern of symptoms provides a diagnosis. With refined technologies there is growing evidence that disease expression in patients is much more diverse, and subtypes need to be defined to allow better targeted treatment. One of the aims of the Mechanisms of the Development of Allergy Project (MeDALL, FP7) is to re-define the classical phenotypes of IgE-associated allergic diseases from birth to adolescence by consensus among experts, using a systematic review of the literature, and to identify possible gaps in research for new disease markers. This paper describes the methods to be used for the systematic review of the classical IgE-associated phenotypes, applicable in general to other systematic reviews that also address phenotype definitions based on evidence. METHODS/DESIGN: Eligible papers were identified by PubMed search (complete database through April 2011). This search yielded 12,043 citations. The review includes intervention studies (randomized and clinical controlled trials) and observational studies (cohort studies including birth cohorts, case-control studies) as well as case series. Systematic and non-systematic reviews, guidelines, position papers and editorials are not excluded but are dealt with separately. Two independent reviewers conducted consecutive title and abstract filtering scans in parallel. For publications whose title and abstract fulfilled the inclusion criteria, the full text was assessed. In the final step, two independent reviewers abstracted data using a pre-designed data extraction form, with disagreements resolved by discussion among investigators. DISCUSSION: The systematic review protocol described here allows the generation of broad, multi-phenotype reviews and consensus phenotype definitions.
The in-depth analysis of the existing literature on the classification of IgE-associated allergic diseases through such a systematic review will 1) provide relevant information on the current epidemiologic definitions of allergic diseases, 2) address heterogeneity and interrelationships and 3) identify gaps in knowledge.
Abstract:
Testing is today an essential part of the product development process throughout a product's life cycle, in telecommunications as well. The strict requirements placed on data networks for round-the-clock availability also raise the importance of the level and quality of their testing. In particular, new network functionalities, which lack the reliability and quality earned through years of use and development, pose challenges for testing. The features that enable television delivery over the Internet are an example of such functionality. This thesis covers the testing portion of an upgrade project for an operator's broadband access network, carried out in the product development department of Tellabs Oy in 2005 and 2006. The integration and system testing methods of the broadband television (IPTV) functionalities added to the network are examined in particular detail.
Abstract:
In this work, a transparent backup system was developed using an Internet Small Computer Systems Interface (iSCSI) network disk. The contents of the network disk were protected with a client-side encryption layer (dm-crypt). This arrangement ensured that the backups stored on the network disk remained confidential even if the party providing the disk server was untrusted or outright hostile. An easy-to-use prototype application was developed for practical use of the system. The risks and vulnerabilities of the system were reviewed and analyzed. A rough cryptanalysis of the system was also performed on the basis of its technical properties. Performance measurements were made for both encrypted and unencrypted iSCSI traffic, showing that the impact of encryption on performance was negligible even at network speeds of 100 megabits per second. Other applications of the technology and future research areas were also discussed.
Abstract:
The aim of this work is to develop a valuation model based on the Microsoft Excel spreadsheet program. With the model, analysts and investors doing equity research can determine the fundamental value of a share. The model is developed especially as a tool for retail investors. The second aim of the work is to apply the developed valuation model to the valuation of the case company, F-Secure, and to determine whether F-Secure's share is correctly priced on the stock exchange relative to its fundamentals. The theoretical part presents the uses and history of valuation, the stages of the valuation process (strategic analysis, financial statement analysis, forecasting, calculating company value), the determination of the cost of capital, and the investor's different valuation methods, comprising the models used in discounted cash flow valuation as well as the multiples of relative valuation. The empirical part covers the development and structure of the valuation model and the valuation process of F-Secure. Although F-Secure's future looks quite bright, the share is currently (23 February 2006) priced higher on the market than these expectations would justify. The different methods give the share values between EUR 2.25 and EUR 2.97. As the median of the different methods, the developed Excel model sets a target price of EUR 2.29 for F-Secure's share. As a result of the study, F-Secure's share can be considered overvalued, since its price on the stock exchange is EUR 3.05.
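The discounted cash flow models mentioned above can be illustrated with a small sketch. All figures below (cash flows, cost of capital, growth rate, share count) are invented for illustration; they are not F-Secure's actual inputs or the thesis model:

```python
# Hedged sketch of the free-cash-flow discounting idea behind a spreadsheet
# valuation model: present value of an explicit forecast period plus a
# Gordon-growth terminal value. All inputs are illustrative assumptions.

def dcf_value_per_share(cash_flows, r, g, shares):
    """PV of explicit-period cash flows plus a Gordon-growth terminal
    value, divided by the number of shares outstanding."""
    pv = sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + g) / (r - g)   # Gordon growth perpetuity
    pv += terminal / (1 + r) ** len(cash_flows)     # discount terminal value back
    return pv / shares

# Illustrative inputs: five years of forecast free cash flow (EUR millions),
# 9 % cost of capital, 2 % perpetual growth, 160 million shares outstanding.
print(round(dcf_value_per_share([20, 22, 24, 26, 28], r=0.09, g=0.02, shares=160), 2))
```

A spreadsheet model like the one described performs the same present-value arithmetic, typically alongside relative-valuation multiples, and then takes a median across the methods.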
Abstract:
The present work aimed at maximizing the number of plantlets obtained by the micropropagation of pineapple (Ananas comosus (L.) Merrill) cv. Pérola. Changes in benzylaminopurine (BAP) concentration, the type of medium (liquid or solidified) and the type of explant in the proliferation phase were evaluated. Slips were used as the explant source, which consisted of axillary buds obtained after careful excision of the leaves. Sterilization was done in the hood with ethanol (70%) for three minutes, followed by calcium hypochlorite (2%) for fifteen minutes, and three washes in sterile water. The explants were introduced in MS medium supplemented with 2 mg L-1 BAP and maintained in a growth room at a 16-h photoperiod (40 µmol·m-2·s-1), 27 ± 2 ºC. After eight weeks, cultures were subcultured for multiplication in MS medium. The following treatments were tested: liquid vs. solidified medium with different BAP concentrations (0.0, 1.5 or 3.0 mg L-1), with or without a longitudinal cut of the shoot bud used as explant. Liquid medium supplemented with BAP at 1.5 mg L-1, associated with the longitudinal sectioning of the shoot bud used as explant, gave the best results, maximizing shoot proliferation. On average, the best treatment would allow an estimated production of 161,080 plantlets by micropropagation of the axillary buds of one plant with eight slips and ten buds per slip, within a period of eight months.
Abstract:
Background: Depression is one of the most severe and serious health problems because of its morbidity, its disabling effects and its societal and economic burden. Despite the variety of existing pharmacological and psychological treatments, most cases evolve with only partial remission, relapse and recurrence. Cognitive models have contributed significantly to the understanding of unipolar depression and its psychological treatment. However, success is only partial and many authors affirm the need to improve both those models and the treatment programs derived from them. One issue that requires further elaboration is the difficulty these patients experience in responding to treatment and in maintaining therapeutic gains across time without relapse or recurrence. Our research group has been working on the notion of cognitive conflict viewed as personal dilemmas according to personal construct theory. We use a novel method for identifying those conflicts using the repertory grid technique (RGT). Preliminary results with depressive patients show that about 90% of them have one or more of those conflicts. This fact might explain the blockage and difficult progress of these patients, especially the more severe and/or chronic. These results justify the need for specific interventions focused on the resolution of these internal conflicts. This study aims to empirically test the hypothesis that an intervention focused on the dilemma(s) specifically detected for each patient will enhance the efficacy of cognitive behavioral therapy (CBT) for depression.
Design: A therapy manual for a dilemma-focused intervention will be tested using a randomized clinical trial comparing the outcome of two treatment conditions: combined group CBT (eight 2-hour weekly sessions) plus individual dilemma-focused therapy (eight 1-hour weekly sessions), and CBT alone (eight 2-hour group weekly sessions plus eight 1-hour individual weekly sessions).
Method: Participants are patients aged over 18 years meeting diagnostic criteria for major depressive disorder or dysthymic disorder, with a score of 19 or above on the Beck Depression Inventory, second edition (BDI-II), and presenting at least one cognitive conflict (implicative dilemma or dilemmatic construct) as assessed using the RGT. The BDI-II is the primary outcome measure, collected at baseline, at the end of therapy, and at 3- and 12-month follow-up; other secondary measures are also used. Discussion: We expect that adding a dilemma-focused intervention to CBT will increase the efficacy of one of the most prestigious therapies for depression, thus making a significant contribution to the psychological treatment of depression. Trial registration: ISRCTN92443999; ClinicalTrials.gov identifier: NCT01542957.
Abstract:
The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange.
This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
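One plausible reading of the reachable majority condition, sketched here purely as an illustration (the formal definition is given in the work itself), is that every process must be able to reach a strict majority of the trusted processes over the network topology:

```python
from collections import deque

# Hedged sketch of one possible reading of a "reachable majority" check:
# every process must reach a strict majority of the trusted processes over
# the directed network graph. Both this reading and the toy topology below
# are illustrative assumptions, not the thesis's formal condition.

def reaches(graph, start):
    """Set of nodes reachable from `start` via breadth-first search."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

def reachable_majority(graph, trusted):
    """True iff every process reaches more than half of the trusted ones."""
    need = len(trusted) // 2 + 1
    return all(len(reaches(graph, p) & trusted) >= need for p in graph)

# Toy topology: p1..p4 exchange items, t1..t3 are trusted modules.
graph = {
    "p1": {"t1", "p2"}, "p2": {"t2", "p1"},
    "p3": {"t2", "t3"}, "p4": {"t3", "p3"},
    "t1": {"t2"}, "t2": {"t3"}, "t3": {"t1"},
}
print(reachable_majority(graph, trusted={"t1", "t2", "t3"}))  # → True
```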
Abstract:
Wireless sensor networks (WSNs) are formed by nodes with limited computational and power resources. WSNs are finding an increasing number of applications, both civilian and military, most of which require security for the sensed data being collected by the base station from remote sensor nodes. In addition, when many sensor nodes transmit to the base station, the implosion problem arises. Providing security measures and implosion resistance in a resource-limited environment is a real challenge. This article reviews the aggregation strategies proposed in the literature to handle the bandwidth and security problems related to many-to-one transmission in WSNs. Recent contributions to secure lossless many-to-one communication developed by the authors in the context of several Spanish-funded projects are surveyed. Ongoing work on secure lossy many-to-one communication is also sketched.
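The idea behind secure many-to-one communication can be sketched with a toy example: each sensor authenticates its reading with a key shared with the base station, and the readings are combined into one aggregate instead of n separate transmissions. The keys, packet format and sum aggregate below are illustrative assumptions, not the schemes surveyed in the article:

```python
import hashlib
import hmac

# Hedged sketch of authenticated many-to-one reporting in a WSN: every node
# shares a key with the base station and attaches a MAC to its reading.
# Key derivation from the node id is a toy shortcut; real deployments would
# pre-share independent keys.

KEYS = {node: hashlib.sha256(node.encode()).digest() for node in ("s1", "s2", "s3")}

def sensor_report(node, reading):
    """A sensor's authenticated reading: (id, value, MAC over both)."""
    tag = hmac.new(KEYS[node], f"{node}:{reading}".encode(), hashlib.sha256).digest()
    return node, reading, tag

def base_station_verify(reports):
    """Verify every MAC, then aggregate (here: a simple sum)."""
    total = 0
    for node, reading, tag in reports:
        expected = hmac.new(KEYS[node], f"{node}:{reading}".encode(), hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            raise ValueError(f"bad MAC from {node}")
        total += reading
    return total

reports = [sensor_report("s1", 21), sensor_report("s2", 19), sensor_report("s3", 23)]
print(base_station_verify(reports))  # → 63 (21 + 19 + 23)
```

An in-network aggregator forwarding such a combined packet addresses the implosion problem, while the per-node MACs let the base station reject tampered readings.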