936 results for Cryptographic Protocols, Provable Security, ID-Based Cryptography
New developments of peace research: The impact of recent campaigns on disarmament and human security
Abstract:
The present text, based on the authors' previous work on peace research (Grasa 1990 and 2010) and on the disarmament campaigns linked to human security (Alcalde 2009 and 2010), has two objectives. First, to present a new agenda for peace research, based on the resolution and transformation of conflicts and on the promotion of collective action in furtherance of human security and human development. Second, to focus specifically on collective action and on a positive reading of some of the campaigns of recent decades, in order to see how these experiences will shape the future agenda for peace research and action for peace.
Abstract:
While mobile technologies can provide highly personalized services for mobile users, they also threaten their privacy. This personalization-privacy paradox is particularly salient for context-aware mobile applications, where users' behaviors, movements and habits can be associated with their personal identities. In this thesis, I studied privacy issues in the mobile context, focusing on the design of an adaptive privacy-management system for context-aware mobile devices, and explored the roles of personalization and of control over users' personal data. This allowed me to make multiple contributions, both theoretical and practical. On the theoretical side, I propose and prototype an adaptive single-sign-on solution that uses a user's context information to protect private information on smartphones. To validate this solution, I first showed that a user's context constitutes a unique identifier and that context-awareness technology can increase both the user's perceived ease of use of the system and the service provider's authentication security. I then followed a design-science research paradigm and implemented this solution in a mobile application called "Privacy Manager". I evaluated its utility through several focus-group interviews; overall, the proposed solution fulfilled the expected functions, and users expressed their intention to use the application. To better understand the personalization-privacy paradox, I built on the theoretical foundations of privacy calculus and the technology acceptance model to conceptualize a theory of users' mobile privacy management. I also examined the roles of personalization and control in my model and how these two elements interact with the privacy calculus and the mobile technology model. In the practical realm, this thesis contributes to the understanding of the trade-off between the benefits of personalized services and the privacy concerns they may raise. By pointing out new opportunities to rethink how a user's context information can protect private data, it also suggests new elements for privacy-related business models.
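The abstract gives no implementation details; as a hedged illustration of the idea of context as an implicit identifier, a minimal sketch might look like the following, where the context features, the similarity rule and the threshold are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Context:
    """Hypothetical context snapshot used as an implicit identifier."""
    cell_id: str      # current network cell
    wifi_ssid: str    # connected Wi-Fi network
    hour_of_day: int  # local time bucket

def context_similarity(current: Context, profile: list[Context]) -> float:
    """Fraction of historical snapshots matching the current context;
    a crude stand-in for a real context-matching step."""
    def matches(a: Context, b: Context) -> bool:
        return (a.cell_id == b.cell_id
                and a.wifi_ssid == b.wifi_ssid
                and abs(a.hour_of_day - b.hour_of_day) <= 2)
    if not profile:
        return 0.0
    return sum(matches(current, past) for past in profile) / len(profile)

def adaptive_sign_on(current: Context, profile: list[Context]) -> str:
    """Grant single sign-on silently only when the context looks familiar;
    otherwise fall back to an explicit credential prompt."""
    THRESHOLD = 0.5  # hypothetical confidence cut-off
    if context_similarity(current, profile) >= THRESHOLD:
        return "grant-sso"
    return "require-password"

profile = [Context("cell-42", "HomeNet", h) for h in (8, 9, 19, 20)]
print(adaptive_sign_on(Context("cell-42", "HomeNet", 9), profile))   # grant-sso
print(adaptive_sign_on(Context("cell-99", "CafeWifi", 3), profile))  # require-password
```

The design intuition this sketches is the one the thesis argues for: a familiar context lowers authentication friction, while an unfamiliar one triggers step-up verification.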
Abstract:
Intercellular communication is achieved at specialized regions of the plasma membrane by gap junctions. The proteins constituting gap junctions are called connexins and are encoded by a family of genes highly conserved during evolution. In the adult mouse, four connexins (Cxs) are known to be expressed in the vasculature: Cx37, Cx40, Cx43 and Cx45. Several recent studies have provided evidence that vascular connexin expression and blood pressure regulation are closely linked, suggesting a role for connexins in the control of blood pressure. However, the precise function that each vascular connexin plays under physiological and pathophysiological conditions has still not been elucidated. In this context, this work was dedicated to evaluating the contribution of each of the four vascular connexins to the control of vascular function and to blood pressure regulation. In the present work, we first demonstrated that vascular connexins are differently regulated by hypertension in the mouse aorta. We also observed that endothelial connexins play a regulatory role in eNOS expression levels and function in the aorta, and therefore in the control of vascular tone. We then demonstrated that Cx40 plays a pivotal role in the kidney by regulating the renal levels of COX-2 and nNOS, two key enzymes of the macula densa known to participate in the control of renin-secreting cells. We also found that Cx43 forms the functional gap junction involved in intercellular Ca2+ wave propagation between vascular smooth muscle cells. Finally, we have started to generate transgenic mice expressing Cx40 specifically in the endothelium, to investigate its involvement in vasomotor tone, or in the renin-secreting cells, to evaluate its role in the control of renin secretion. In conclusion, this work has allowed us to identify new roles for connexins in the vasculature. Our results suggest that vascular connexins could be interesting targets for new therapies treating hypertension and vascular diseases.
Abstract:
Background: Despite its use in clinical practice and in trials of thrombolysis, non-contrast CT is not sensitive for identifying penumbral tissue in acute stroke. This study evaluated how it compares with physiological imaging using CT perfusion. Methods: 40 imaging datasets with non-contrast CT (NCCT) and perfusion CT (CTP) were retrospectively identified. Two sets of observers (n=6) and a neuroradiologist made a blind evaluation of the images. Inter-observer agreement was calculated for identifying ischaemic change on NCCT and abnormalities on cerebral blood flow, time-to-peak and cerebral blood volume maps. A prospective cohort of 73 patients with anterior circulation cortical strokes was thrombolysed, based on qualitative assessment of penumbral tissue on CTP, within 3 h of stroke onset. Functional outcome was assessed at 3 months. Results: Inter-rater agreement was moderate (k=0.54) for early ischaemic change on NCCT. Perfusion maps improved this to substantial for deficit in cerebral blood volume (k=0.67) and almost perfect for time to peak and cerebral blood flow (both k=0.87). In the prospective arm, 58.9% of patients with cortical strokes were thrombolysed. There was no significant difference in attainment of complete recovery (p=0.184) between the thrombolysed and non-thrombolysed groups. Conclusions: We demonstrate how perfusion CT aids clinical decision-making in acute stroke. Good functional outcomes from thrombolysis can be safely achieved using this physiologically informed approach.
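For orientation, the kappa values quoted above measure chance-corrected inter-rater agreement. A minimal sketch of the computation, using invented ratings rather than the study's data, might look like this:

```python
def cohen_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Cohen's kappa: chance-corrected agreement between two raters,
    kappa = (p_o - p_e) / (1 - p_e)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of cases where the raters concur.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal frequencies.
    p_e = sum((rater_a.count(l) / n) * (rater_b.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Invented example: two observers rating 10 scans as 'isch' / 'norm'.
a = ['isch', 'isch', 'norm', 'isch', 'norm', 'norm', 'isch', 'norm', 'isch', 'norm']
b = ['isch', 'norm', 'norm', 'isch', 'norm', 'norm', 'isch', 'norm', 'norm', 'norm']
print(round(cohen_kappa(a, b), 2))  # 0.6, i.e. moderate-to-substantial agreement
```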
Abstract:
RATIONALE AND OBJECTIVES: Dose reduction may compromise patients because of a decrease in image quality. Therefore, the amount of dose savings offered by new dose-reduction techniques needs to be thoroughly assessed. To avoid repeated studies in one patient, chest computed tomography (CT) scans at different dose levels were performed in corpses, comparing model-based iterative reconstruction (MBIR) as a tool to enhance image quality with current standard full-dose imaging. MATERIALS AND METHODS: Twenty-five human cadavers were scanned (CT HD750) after contrast medium injection at different, decreasing dose levels D0-D5 and reconstructed with MBIR. The data at full-dose level D0 were additionally reconstructed with standard adaptive statistical iterative reconstruction (ASIR), which represented the full-dose baseline reference (FDBR). Two radiologists independently compared image quality (IQ) in 3-mm multiplanar reformations for soft-tissue evaluation of D0-D5 against FDBR (-2, diagnostically inferior; -1, inferior; 0, equal; +1, superior; +2, diagnostically superior). For statistical analysis, the intraclass correlation coefficient (ICC) and the Wilcoxon test were used. RESULTS: Mean CT dose index values (mGy) were as follows: D0/FDBR = 10.1 ± 1.7, D1 = 6.2 ± 2.8, D2 = 5.7 ± 2.7, D3 = 3.5 ± 1.9, D4 = 1.8 ± 1.0, and D5 = 0.9 ± 0.5. Mean IQ ratings were as follows: D0 = +1.8 ± 0.2, D1 = +1.5 ± 0.3, D2 = +1.1 ± 0.3, D3 = +0.7 ± 0.5, D4 = +0.1 ± 0.5, and D5 = -1.2 ± 0.5. All values demonstrated a significant difference from baseline (P < .05), except the mean IQ for D4 (P = .61). The ICC was 0.91. CONCLUSIONS: Compared to ASIR, MBIR allowed for a significant dose reduction of 82% (from 10.1 mGy at D0 to 1.8 mGy at D4, the lowest dose level whose IQ was statistically equal to baseline) without impairment of IQ. This resulted in a calculated mean effective dose below 1 mSv.
Abstract:
Rapid diagnostic tests (RDTs) are sometimes recommended to improve the home-based management of malaria. The accuracy of an RDT for the detection of clinical malaria and the presence of malarial parasites was recently evaluated in a high-transmission area of southern Mali. During the same study, the cost-effectiveness of a 'test-and-treat' strategy for the home-based management of malaria (based on an artemisinin combination therapy, ACT) was compared with that of a 'treat-all' strategy. Overall, 301 patients of all ages, each of whom had been considered a presumptive case of uncomplicated malaria by a village healthworker, were checked with a commercial RDT (Paracheck-Pf). The sensitivity, specificity, and positive and negative predictive values of this test were then determined against the results of microscopy and against two different definitions of clinical malaria. The RDT was found to be 82.9% sensitive (95% confidence interval: 78.0%-87.1%) and 78.9% (63.9%-89.7%) specific compared with the detection of parasites by microscopy. In the detection of clinical malaria, it was 95.2% (91.3%-97.6%) sensitive and 57.4% (48.2%-66.2%) specific compared with a general practitioner's diagnosis of the disease, and 100.0% (94.5%-100.0%) sensitive but only 30.2% (24.8%-36.2%) specific when compared against the fulfillment of the World Health Organization's (2003) research criteria for uncomplicated malaria. Among children aged 0-5 years, the cost of the 'test-and-treat' strategy, per episode, was about twice that of 'treat-all' (U.S.$1.0 v. U.S.$0.5). In older subjects, however, the two strategies were equally costly (approximately U.S.$2/episode). In conclusion, for children aged 0-5 years in a high-transmission area of sub-Saharan Africa, use of the RDT was not cost-effective compared with the presumptive treatment of malaria with an ACT. In older patients, use of the RDT did not reduce costs. The question remains whether either of the strategies investigated can be made affordable for the affected population.
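The sensitivity, specificity and predictive values above all derive from a standard 2x2 table against a reference standard. A short illustrative sketch follows; the counts are a reconstruction chosen to be consistent with the reported n = 301 and the microscopy comparison (82.9% sensitivity, 78.9% specificity), not the study's raw data:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard 2x2-table metrics for a test evaluated against a
    reference standard (here, microscopy)."""
    return {
        "sensitivity": tp / (tp + fn),  # P(test positive | disease present)
        "specificity": tn / (tn + fp),  # P(test negative | disease absent)
        "ppv":         tp / (tp + fp),  # P(disease present | test positive)
        "npv":         tn / (tn + fn),  # P(disease absent | test negative)
    }

# Reconstructed counts (not the study's raw table): 263 microscopy-positive
# and 38 microscopy-negative patients, totalling the reported n = 301.
m = diagnostic_metrics(tp=218, fp=8, fn=45, tn=30)
print({k: round(v, 3) for k, v in m.items()})
# sensitivity 0.829 and specificity 0.789, matching the figures quoted above
```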
Abstract:
Extranodal NK/T-cell lymphoma, nasal type, is a rare and highly aggressive disease with a grim prognosis. No therapeutic strategy has yet been identified for relapsing patients. We report the results of a French prospective phase II trial of an L-asparaginase-containing regimen in 19 patients with relapsed or refractory disease treated in 13 centers. Eleven patients were in relapse and 8 were refractory to their first line of treatment. L-Asparaginase-based treatment yielded objective responses in 14 of the 18 evaluable patients after 3 cycles. Eleven patients entered complete remission (61%), and only 4 of them relapsed. The median overall survival time was 1 year, with a median response duration of 12 months. The main adverse events were hepatitis, cytopenia, and allergy. The absence of anti-asparaginase antibodies and the disappearance of Epstein-Barr virus serum DNA were significantly associated with a better outcome. These data confirm the excellent activity of L-asparaginase-containing regimens in extranodal NK/T-cell lymphoma. L-Asparaginase-based treatment should thus be considered for salvage therapy, especially in patients with disseminated disease. First-line L-asparaginase combination therapy for extranodal NK/T-cell lymphoma warrants evaluation in prospective trials. This trial is registered at www.clinicaltrials.gov as #NCT00283985.
Abstract:
Most life science processes involve, at the atomic scale, recognition between two molecules. The prediction of such interactions at the molecular level, by so-called docking software, is a non-trivial task. Docking programs have a wide range of applications, from protein engineering to drug design. This article presents SwissDock, a web server dedicated to the docking of small molecules on target proteins. It is based on the EADock DSS engine, combined with setup scripts for curating common problems and for preparing both the target protein and the ligand input files. An efficient Ajax/HTML interface was designed and implemented so that scientists can easily submit docking runs and retrieve the predicted complexes. For automated docking tasks, a programmatic SOAP interface has been set up, and template programs can be downloaded in Perl, Python and PHP. The web site also provides access to a database of manually curated complexes, based on the Ligand Protein Database. A wiki and a forum are available to the community to promote interactions between users. The SwissDock web site is available online at http://www.swissdock.ch. We believe it constitutes a step toward generalizing the use of docking tools beyond the traditional molecular modeling community.
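For the programmatic route, a submission from Python might look like the sketch below. The WSDL location and the operation names (submitDocking, getJobStatus) are hypothetical placeholders; the real interface is defined by the Perl, Python and PHP templates downloadable from the site:

```python
# Minimal sketch of an automated SwissDock submission over SOAP using the
# zeep client library. The WSDL URL and the operation names below are
# hypothetical placeholders, not a verified description of the service.
from zeep import Client

WSDL_URL = "http://www.swissdock.ch/soap?wsdl"  # placeholder, not verified

client = Client(WSDL_URL)

with open("target.pdb") as f_target, open("ligand.mol2") as f_ligand:
    target_pdb = f_target.read()
    ligand_mol2 = f_ligand.read()

# Hypothetical operations: submit a docking job, then poll its status.
job_id = client.service.submitDocking(target_pdb, ligand_mol2)
print("submitted:", job_id, "status:", client.service.getJobStatus(job_id))
```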
Abstract:
BACKGROUND: A growing number of patients with chronic hepatitis B are being treated for extended periods with nucleoside and/or nucleotide analogs. In this context, antiviral resistance represents an increasingly common and complex issue. METHODS: Mutations in the hepatitis B virus (HBV) reverse transcriptase (rt) gene and viral genotypes were determined by direct sequencing of PCR products and alignment with reference sequences deposited in GenBank. RESULTS: Plasma samples from 60 patients with chronic hepatitis B have been analyzed since March 2009. The predominant mutation pattern identified in patients with virological breakthrough was rtM204V/I ± various compensatory mutations, conferring resistance to L-nucleosides (lamivudine, telbivudine, emtricitabine) and predisposing to entecavir resistance (n = 18). Complex mutation patterns with a potential for multidrug resistance were identified in 2 patients. Selection of a fully entecavir-resistant strain was observed in a patient exposed to lamivudine alone. Novel mutations were identified in 1 patient. Wild-type HBV was identified in 9 patients with suspected virological breakthrough, raising concerns about treatment adherence. No preexisting resistance mutations were identified in treatment-naïve patients (n = 13). Viral genome amplification and sequencing failed in 16 patients, of whom only 2 had a documented HBV DNA > 1000 IU/ml. HBV genotypes were D in 28, A in 6, B in 4, C in 3 and E in 3 patients. Results will be updated in August 2010 and therapeutic implications discussed. CONCLUSIONS: With expanding treatment options and a growing number of patients exposed to nucleoside and/or nucleotide analogs, sequence-based HBV antiviral resistance testing is expected to become a cornerstone of the management of chronic hepatitis B.
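As an illustration of what sequence-based resistance testing boils down to, the toy sketch below reports the amino acid at rt codon 204 from an in-frame nucleotide sequence. The sequence, the truncated codon table and the reporting logic are invented for illustration and are not the study's pipeline:

```python
# Toy codon-level resistance call on the HBV rt gene. A real pipeline first
# aligns the patient sequence against curated GenBank references; this
# sketch assumes an in-frame, gap-free rt nucleotide sequence.
CODON_TABLE = {"ATG": "M", "GTG": "V", "ATT": "I", "ATC": "I", "ATA": "I"}  # truncated table

def call_rt204(rt_nt: str) -> str:
    """Report the amino acid at rt codon 204 (wild type is rtM204, i.e. ATG)."""
    pos = (204 - 1) * 3                # 0-based offset of codon 204
    codon = rt_nt[pos:pos + 3].upper()
    aa = CODON_TABLE.get(codon, "?")
    if aa == "M":
        return "wild-type rtM204"
    return f"rtM204{aa}: L-nucleoside resistance suspected"

mutant = "AAA" * 203 + "GTG" + "AAA" * 10  # invented sequence with V at codon 204
print(call_rt204(mutant))  # rtM204V: L-nucleoside resistance suspected
```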
Abstract:
Historically, it has been difficult to monitor the acute impact of anticancer therapies on hematopoietic organs on a whole-body scale. A deeper understanding of the effect of treatments on bone marrow would be of great potential value in the rational design of intensive treatment regimens. 3'-Deoxy-3'-(18)F-fluorothymidine ((18)F-FLT) is a functional radiotracer used to study cellular proliferation. It is trapped in cells in proportion to thymidine kinase 1 expression, which is upregulated during DNA synthesis. This study investigates the potential of (18)F-FLT to monitor the acute effects of chemotherapy on cellular proliferation and its recovery in bone marrow, spleen, and liver during treatment with two different chemotherapy regimens.
Abstract:
Methods like event history analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, such processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion, mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents its main internal drivers - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed through these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the latter has been developed, we let the different agents interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on those made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning not only that time is needed for a policy to deploy its effects, but also that it takes time for a country to find the best-suited policy. To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with both the theoretical expectations and the empirical evidence.
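The abstract describes the model only verbally. The following toy sketch, written under assumptions of my own (two policies with fixed payoffs, noisy observation of three random peers), reproduces the S-shaped adoption curve described above; it is not the thesis's actual model, and because it uses random peer observation rather than spatial neighbourhoods it does not reproduce the geographic clustering:

```python
import random

random.seed(1)

N = 100                            # number of countries
EFFECTIVENESS = {0: 0.3, 1: 0.7}   # assumed payoffs: policy 1 is objectively better
adopted = [1] * 5 + [0] * (N - 5)  # a handful of early adopters

def step(adopted):
    """Learning: each country observes three randomly chosen peers (plus
    itself) and adopts the policy with the best observed noisy payoff."""
    nxt = adopted[:]
    for i in range(N):
        observed = random.sample(range(N), 3) + [i]
        scores = {}
        for j in observed:
            policy = adopted[j]
            noisy = EFFECTIVENESS[policy] + random.gauss(0, 0.15)
            scores[policy] = max(scores.get(policy, float("-inf")), noisy)
        nxt[i] = max(scores, key=scores.get)
    return nxt

share = [sum(adopted) / N]
for _ in range(25):
    adopted = step(adopted)
    share.append(sum(adopted) / N)

# Adoption share rises slowly, accelerates, then saturates: the S-shaped
# diffusion curve that the abstract attributes to learning.
print([round(s, 2) for s in share])
```

The key design choice mirrors the abstract's mechanism: a country's choice is conditional on what it observes others doing, so diffusion emerges from the interaction rule rather than being imposed from above.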