989 results for Key exchange protocols
Abstract:
BACKGROUND: Exposure of adherent cells to DNA damaging agents, such as the bacterial cytolethal distending toxin (CDT) or ionizing radiation (IR), activates the small GTPase RhoA, which promotes the formation of actin stress fibers and delays cell death. The signalling intermediates that regulate RhoA activation and promote cell survival are unknown. PRINCIPAL FINDINGS: We demonstrate that the nuclear RhoA-specific Guanine nucleotide Exchange Factor (GEF) Net1 becomes dephosphorylated at a critical inhibitory site in cells exposed to CDT or IR. Expression of a dominant-negative Net1 or Net1 knockdown by RNAi prevented RhoA activation, inhibited the formation of stress fibers, and enhanced cell death, indicating that Net1 activation is required for these RhoA-mediated responses to genotoxic stress. The Net1- and RhoA-dependent signals involved activation of the Mitogen-Activated Protein Kinase p38 and its downstream target MAPK-activated protein kinase 2. SIGNIFICANCE: Our data highlight the importance of Net1 in controlling RhoA- and p38 MAPK-mediated cell survival in cells exposed to DNA damaging agents and illustrate a molecular pathway whereby chronic exposure to a bacterial toxin may promote genomic instability.
Abstract:
The development and maintenance of a seal of the root canal system is key to the success of root canal treatment. Resin-based adhesive materials have the potential to reduce root canal microleakage because of their adhesive properties and penetration into the dentinal walls. Moreover, the irrigation protocol may influence the adhesiveness of resin-based sealers to root dentin. The objective of the present study was to evaluate the effect of different irrigation protocols on coronal bacterial microleakage of gutta-percha/AH Plus and Resilon/Real Seal Self-Etch systems. One hundred ninety premolars were used. The teeth were divided into 18 experimental groups according to the irrigation protocol and filling material used. The protocols were: distilled water; sodium hypochlorite (NaOCl)+EDTA; NaOCl+H3PO4; NaOCl+EDTA+chlorhexidine (CHX); NaOCl+H3PO4+CHX; CHX+EDTA; CHX+H3PO4; CHX+EDTA+CHX; and CHX+H3PO4+CHX. Gutta-percha/AH Plus or Resilon/Real Seal SE was used as the root-filling material. Coronal microleakage was evaluated for 90 days against Enterococcus faecalis. Data were statistically analyzed using the Kaplan-Meier survival test and the Kruskal-Wallis and Mann-Whitney tests. No significant difference was verified among the groups using chlorhexidine or sodium hypochlorite during chemo-mechanical preparation followed by EDTA or phosphoric acid for smear layer removal. The same held for the filling materials. However, the statistical analyses revealed that a final flush with 2% chlorhexidine significantly reduced coronal microleakage. A final flush with 2% chlorhexidine after smear layer removal reduces coronal microleakage of teeth filled with gutta-percha/AH Plus or Resilon/Real Seal SE.
Abstract:
The effect of hexose and pentose used in the pre-cultivation of the yeast Candida guilliermondii FTI 20037 on xylose reductase (XR) and xylitol dehydrogenase (XDH) enzyme activities was evaluated during fermentation of sugarcane bagasse hemicellulosic hydrolysate. Xylitol production was evaluated using cells previously grown on 30.0 g/L xylose, 30.0 g/L glucose, or a mixture of both sugars (30.0 g/L xylose and 2.0 g/L glucose). The vacuum-evaporated hydrolysate (80 g/L) was detoxified by ion-exchange resins (A-860S, A500PS and C-150; Purolite®). The resin treatment removed 93.0% of the total phenolic compounds and 64.9% of the acetic acid. All experiments were carried out in Erlenmeyer flasks at 200 rpm and 30 °C. The maximum XR (0.618 U/mg protein) and XDH (0.783 U/mg protein) activities were obtained using inoculum previously grown on the mixture of both sugars. The highest cell concentration (10.6 g/L) was obtained with inoculum pre-cultivated on glucose. However, xylitol yield and volumetric productivity were favored by using xylose as the carbon source; in this case, maximum xylose (81%) and acetic acid (100%) consumption were observed. It is important to point out that the maximum enzymatic activities were obtained when the sugar mixture was used as the inoculum carbon source, while the highest fermentative parameters were obtained when xylose was used.
Abstract:
This paper proposes a new strategy to integrate shared resources and precedence constraints among real-time tasks, assuming that no precise information on critical sections and computation times is available. The concept of bandwidth inheritance is combined with a capacity sharing and stealing mechanism to efficiently exchange bandwidth among tasks and minimise the deviation from the ideal system's behaviour caused by inter-application blocking. The proposed Capacity Exchange Protocol (CXP) is simpler than other solutions for sharing resources in open real-time systems, since it does not attempt to return the inherited capacity to blocked servers in the exact same amount. This loss of optimality is worth the reduced complexity: the protocol's behaviour nevertheless tends to be fair, and it outperforms previous solutions in highly dynamic scenarios, as demonstrated by extensive simulations. A formal analysis of CXP is presented, and the conditions under which hard real-time tasks can be guaranteed are discussed.
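The capacity-exchange idea described in the abstract can be illustrated with a toy model; the class, function and numbers below are hypothetical sketches of the general mechanism, not the actual CXP specification:

```python
# Toy model of capacity exchange between two execution servers.
# A blocked server donates part of its residual budget to the server
# holding the shared resource; no exact repayment is bookkept, which
# mirrors the simplification the abstract attributes to CXP.

class Server:
    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity  # residual execution budget (abstract units)

def exchange_capacity(donor, borrower, amount):
    """Transfer up to `amount` of the donor's residual capacity."""
    granted = min(amount, donor.capacity)
    donor.capacity -= granted
    borrower.capacity += granted
    return granted

blocked = Server("blocked_task", capacity=5)
lock_holder = Server("lock_holder", capacity=2)
granted = exchange_capacity(blocked, lock_holder, 3)
```

The point of the sketch is only that capacity flows toward the task that can make progress, instead of being tracked for exact restitution.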
Abstract:
Dissertation submitted for the degree of Doctor in Chemical Engineering, speciality Biochemical Engineering
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks. Typical P2P applications are video streaming, file sharing, etc. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they operate. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file-sharing application, while the user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers may have limited hardware resources, e.g., CPU, bandwidth and memory, or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer.
Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that allows us to obtain an approximate view of the system or part of it. This approximate view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays chosen to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities at each node. To allow deployment in a large-scale system, we take the memory available at processes into account by limiting the view they have to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, which are based on a propagation overlay that tends towards the global tree overlay and adapts to some constraints of the underlying system. At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes communication cost.
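The notion of path reliability used above can be sketched in a few lines, assuming independent link failures; the link values are illustrative and this is not the thesis's actual overlay-selection algorithm:

```python
# Reliability of a path through an overlay, modelled as the product of
# independent per-link delivery probabilities; the broadcast layer would
# prefer the candidate path with the highest such product.
from math import prod

def path_reliability(link_reliabilities):
    """Probability that a message traverses every link of the path."""
    return prod(link_reliabilities)

# Two hypothetical candidate paths toward the same node:
p1 = path_reliability([0.9, 0.95])   # two fairly reliable links
p2 = path_reliability([0.99, 0.8])   # one excellent link, one weak link
best = max(p1, p2)
```

Under this model a path of uniformly decent links can beat a path containing one weak link, which is the intuition behind routing over a maximum-reliability tree.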
Abstract:
We lay out a small open economy version of the Calvo sticky price model, and show how the equilibrium dynamics can be reduced to a simple representation in domestic inflation and the output gap. We use the resulting framework to analyze the macroeconomic implications of three alternative rule-based policy regimes for the small open economy: domestic inflation and CPI-based Taylor rules, and an exchange rate peg. We show that a key difference among these regimes lies in the relative amount of exchange rate volatility that they entail. We also discuss a special case for which domestic inflation targeting constitutes the optimal policy, and where a simple second-order approximation to the utility of the representative consumer can be derived and used to evaluate the welfare losses associated with the suboptimal rules.
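As an illustration of the rule-based regimes mentioned above, a standard Taylor rule sets the nominal interest rate from inflation and the output gap; the functional form and coefficients below are the textbook ones (Taylor's original 1.5 and 0.5), not necessarily those used in the paper:

```python
# Textbook Taylor rule:
#   i = r* + pi + phi_pi * (pi - pi*) + phi_y * output_gap
# where r* is the natural real rate, pi the observed inflation rate,
# pi* the inflation target, and the phis are policy response coefficients.

def taylor_rule(r_star, pi, pi_target, output_gap, phi_pi=1.5, phi_y=0.5):
    return r_star + pi + phi_pi * (pi - pi_target) + phi_y * output_gap

# Hypothetical example: 2% natural rate, 3% inflation vs. a 2% target,
# and output 1% above potential.
i = taylor_rule(r_star=0.02, pi=0.03, pi_target=0.02, output_gap=0.01)
```

A CPI-based rule would feed CPI inflation into `pi`, while a domestic-inflation rule would use domestic inflation only; the paper's comparison turns on how that choice transmits to exchange rate volatility.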
Abstract:
The mitochondrial 70-kDa heat shock protein (mtHsp70), also known in humans as mortalin, is a central component of the mitochondrial protein import motor and plays a key role in the folding of matrix-localized mitochondrial proteins. MtHsp70 is assisted by a member of the 40-kDa heat shock protein co-chaperone family named Tid1 and by a nucleotide exchange factor. Whereas yeast mtHsp70 has been extensively studied in the context of protein import into the mitochondria, and the bacterial 70-kDa heat shock protein was recently shown to act as an ATP-fuelled unfolding enzyme capable of detoxifying stably misfolded polypeptides into harmless natively refolded proteins, little is known about the molecular functions of human mortalin in protein homeostasis. Here, we developed novel and efficient purification protocols for mortalin and the two spliced versions of Tid1, Tid1-S and Tid1-L, and showed that mortalin can mediate the in vitro ATP-dependent reactivation of stable preformed heat-denatured model aggregates, with the assistance of Mge1 and either the Tid1-L or Tid1-S co-chaperone, or yeast Mdj1. Thus, in addition to being a central component of the protein import machinery, human mortalin, together with Tid1, may serve as a protein-disaggregating machine which, for lack of Hsp100/ClpB disaggregating co-chaperones, may alone carry out the scavenging of toxic protein aggregates in stressed, diseased, or aging human mitochondria.
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to risk assessment of ENMs, which encompass the key parameters to characterise ENMs, appropriate methods of analysis, and the best approach to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn: Due to the high batch variability of the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Concomitant with using the OECD priority list of ENMs, other criteria for selection of ENMs, such as relevance for mechanistic (scientific) studies or risk-assessment-based studies, widespread availability (and thus high expected volumes of use) or consumer concern (route of consumer exposure depending on application), could be helpful. The OECD priority list focusses on the validity of OECD tests; therefore source material will be first in scope for testing. However, for risk assessment it is much more relevant to have toxicity data for the material as present in the products/matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally these methods are only able to determine one single characteristic, and some of them can be rather expensive. Practically, it is currently not feasible to fully characterise ENMs. Many techniques that are available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs).
It was recommended that at least two complementary techniques be employed to determine a metric of ENMs. The first great challenge is to prioritise metrics which are relevant in the assessment of biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that one metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach/protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that exchange of protocols take place. The precise methods used to disperse ENMs should be specifically, yet succinctly, described within the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
We present a new lab-on-a-chip system for electrophysiological measurements on Xenopus oocytes. Xenopus oocytes are widely used host cells in the field of pharmacological studies and drug development. We developed a novel non-invasive technique using immobilized non-devitellinized cells that replaces the traditional two-electrode voltage-clamp (TEVC) method. In particular, rapid fluidic exchange was implemented on-chip to allow recording of fast kinetic events of exogenous ion channels expressed in the cell membrane. Reducing the fluidic exchange times of extracellular reagent solutions is a great challenge with these large, millimetre-sized cells. Fluidic switching is obtained by shifting the laminar flow interface in a perfusion channel under the cell by means of integrated poly-dimethylsiloxane (PDMS) microvalves. Reagent solution exchange times down to 20 ms have been achieved. An on-chip purging system allows complex pharmacological protocols to be performed, making the system suitable for the screening of ion-channel ligand libraries. The performance of the integrated rapid fluidic exchange system was demonstrated by investigating the self-inhibition of human epithelial sodium channels (ENaC). Our results show that the response time of this ion channel to a specific reactant is about an order of magnitude faster than could be estimated with the traditional TEVC technique.
Abstract:
The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, which means that either each process obtains the item it was expecting or no process obtains any information on the inputs of the others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed for illustrating and apprehending the complexity of fair exchange.
This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
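The role of the trusted third party recalled in the abstract can be illustrated with a toy escrow; this sketch only conveys the idea that items are released atomically (both parties receive, or neither learns anything), and is not the protocol analysed in the thesis:

```python
# Toy trusted-third-party escrow for two-party fair exchange.
# Items are released only once both parties have deposited, so the
# exchange is all-or-nothing: an aborted run leaks no inputs.

class TrustedThirdParty:
    def __init__(self):
        self.deposits = {}

    def deposit(self, party, item):
        self.deposits[party] = item

    def release(self):
        if len(self.deposits) < 2:
            return None  # exchange aborts; no party gains information
        (p1, i1), (p2, i2) = self.deposits.items()
        return {p1: i2, p2: i1}  # each party gets the other's item

ttp = TrustedThirdParty()
ttp.deposit("alice", "itemA")
early = ttp.release()  # None: bob has not deposited yet
ttp.deposit("bob", "itemB")
out = ttp.release()
```

The impossibility result summarised above says that, without some a-priori trusted process playing a role like this one, fair exchange cannot be solved at all.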
Abstract:
In recent years, elliptic curve cryptography has become increasingly important, to the point of now forming part of several industrial standards. Although elliptic-curve variants of classical cryptosystems such as RSA have been designed, their main interest lies in their application to cryptosystems based on the Discrete Logarithm Problem, such as those of ElGamal type. In this case, elliptic cryptosystems guarantee the same security as those built over the multiplicative group of a prime finite field, but with much smaller key lengths. We therefore present the good properties of these cryptosystems, as well as the basic requirements for a curve to be cryptographically useful, which are closely related to its cardinality. We review some methods for discarding curves that are not cryptographically useful, as well as others for obtaining good curves from a given one. Finally, we describe some applications, such as their use in smart cards and RFID systems, and conclude with some recent advances in this field.
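As an illustration of the discrete-logarithm-based elliptic cryptosystems discussed above, the sketch below runs a Diffie-Hellman exchange on the textbook curve y² = x³ + 2x + 2 over F₁₇; these parameters are far too small for real security and serve only to show the group operations:

```python
# Toy elliptic-curve Diffie-Hellman on y^2 = x^3 + 2x + 2 (mod 17),
# a standard teaching curve. None represents the point at infinity.

P, A = 17, 2  # field prime and curve coefficient a

def ec_add(p1, p2):
    """Group law: add two points on the curve."""
    if p1 is None:
        return p2
    if p2 is None:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return None  # inverse points sum to the point at infinity
    if p1 == p2:  # tangent slope for doubling
        m = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P
    else:         # chord slope for distinct points
        m = (y2 - y1) * pow(x2 - x1, -1, P) % P
    x3 = (m * m - x1 - x2) % P
    return (x3, (m * (x1 - x3) - y1) % P)

def ec_mul(k, pt):
    """Scalar multiplication by double-and-add."""
    result = None
    while k:
        if k & 1:
            result = ec_add(result, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return result

G = (5, 1)                     # generator of the group (order 19)
alice_pub = ec_mul(3, G)       # Alice's secret scalar: 3
bob_pub = ec_mul(7, G)         # Bob's secret scalar: 7
shared = ec_mul(3, bob_pub)    # both sides compute 3*7*G
```

The security of the scheme rests on the difficulty of recovering the secret scalar from the public point, i.e. the elliptic-curve discrete logarithm problem; at realistic sizes the curve's cardinality must be chosen carefully, which is precisely the concern the abstract raises.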
Abstract:
Controlling the correct application of medical protocols is a key issue in hospital environments. For the automated monitoring of medical protocols, we need a domain-independent language for their representation and a fully or semi-autonomous system that understands the protocols and supervises their application. In this paper we describe a specification language and a multi-agent system architecture for monitoring medical protocols. We model medical services in hospital environments as specialized domain agents and interpret a medical protocol as a negotiation process between agents. A medical service can be involved in multiple medical protocols, so specialized domain agents are independent of negotiation processes, and autonomous system agents perform the monitoring tasks. We present the detailed architecture of the system agents and of an important domain agent, the database broker agent, which is responsible for obtaining relevant information about the clinical history of patients. We also describe how we tackle the problems of privacy, integrity and authentication during the process of exchanging information between agents.
Abstract:
A patent foramen ovale (PFO), present in ∼40% of the general population, is a potential source of right-to-left shunt that can impair pulmonary gas exchange efficiency [i.e., increase the alveolar-to-arterial Po2 difference (A-aDO2)]. Prior studies investigating human acclimatization to high altitude with A-aDO2 as a key parameter have not investigated differences between subjects with (PFO+) or without (PFO-) a PFO. We hypothesized that in PFO+ subjects A-aDO2 would not improve (i.e., decrease) after acclimatization to high altitude compared with PFO- subjects. Twenty-one (11 PFO+) healthy sea-level residents were studied at rest and during cycle ergometer exercise at the highest iso-workload achieved at sea level (SL), after acute transport to 5,260 m (ALT1), and again at 5,260 m after 16 days of high-altitude acclimatization (ALT16). In contrast to PFO- subjects, PFO+ subjects had 1) no improvement in A-aDO2 at rest and during exercise at ALT16 compared with ALT1, 2) no significant increase in resting alveolar ventilation, or alveolar Po2, at ALT16 compared with ALT1, and consequently 3) an increased arterial Pco2 and decreased arterial Po2 and arterial O2 saturation at rest at ALT16. Furthermore, PFO+ subjects had an increased incidence of acute mountain sickness (AMS) at ALT1, concomitant with significantly lower peripheral O2 saturation (SpO2). These data suggest that PFO+ subjects have increased susceptibility to AMS when not taking prophylactic treatments, that right-to-left shunt through a PFO impairs pulmonary gas exchange efficiency even after acclimatization to high altitude, and that PFO+ subjects have blunted ventilatory acclimatization after 16 days at altitude compared with PFO- subjects.
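The A-aDO2 metric central to the study above is conventionally computed with the simplified alveolar gas equation; the sketch below uses that standard formula with hypothetical sea-level example values, not data from the study:

```python
# Simplified alveolar gas equation:
#   PAO2 = FiO2 * (Pb - PH2O) - PaCO2 / RQ
# A-aDO2 is then the alveolar-minus-arterial PO2 difference (mmHg).

def alveolar_po2(fio2, pb, paco2, ph2o=47.0, rq=0.8):
    """Ideal alveolar PO2 from inspired O2 fraction, barometric
    pressure, arterial PCO2, water vapour pressure at 37 C, and the
    respiratory quotient."""
    return fio2 * (pb - ph2o) - paco2 / rq

def a_a_do2(fio2, pb, paco2, pao2_arterial):
    """Alveolar-to-arterial PO2 difference."""
    return alveolar_po2(fio2, pb, paco2) - pao2_arterial

# Hypothetical sea-level example: room air, Pb 760 mmHg,
# PaCO2 40 mmHg, measured arterial PO2 90 mmHg.
gradient = a_a_do2(fio2=0.21, pb=760.0, paco2=40.0, pao2_arterial=90.0)
```

A right-to-left shunt adds venous blood to the arterial side, lowering arterial Po2 without changing alveolar Po2, which is why it widens this gradient.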
Abstract:
The aim of this study was to identify the current and potential key accounts of the case company. Key accounts were identified using Cheverton's identification and selection matrix, in which an account's position in the matrix is assessed in terms of customer attractiveness and the supplier's relative strengths. The criteria chosen for identifying key accounts were the customer's annual purchase volume, the customer's business potential, and the case company's supplier share. Customers were classified into key accounts, key development accounts, maintenance accounts and occasional accounts. The study provided a starting point for the case company's new key account managers and indicated directions for future research. Through active information exchange between the management of the different sales offices, as well as between the company's functional divisions, competitive advantage can be achieved by approaching the customer as a supplier with rationally coordinated operations while customers concentrate their purchasing. To keep the company's objectives, market opportunities and resources well balanced, customer profitability and the strategic importance of customers should also be assessed and measured regularly, in addition to the identification criteria used in this study.
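The matrix classification described above can be sketched as a simple decision rule; the normalised scores, threshold and function name are assumptions for illustration, not the study's actual scoring scheme:

```python
# Hypothetical sketch of a Cheverton-style identification matrix:
# accounts are placed by customer attractiveness (vertical axis) and
# the supplier's relative strength (horizontal axis), yielding the four
# categories used in the study.

def classify_account(attractiveness, relative_strength, threshold=0.5):
    """Scores assumed normalised to [0, 1]; threshold splits the axes."""
    if attractiveness >= threshold:
        if relative_strength >= threshold:
            return "key account"
        return "key development account"   # attractive, position still weak
    if relative_strength >= threshold:
        return "maintenance account"       # strong position, low potential
    return "occasional account"

label = classify_account(attractiveness=0.8, relative_strength=0.7)
```

In practice the axis scores would aggregate the criteria named in the abstract (annual purchase volume, business potential, supplier share) rather than being single numbers.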