Abstract:
The 19 kDa carboxyl-terminal fragment of merozoite surface protein 1 (MSP1-19) is a major component of the invasion-inhibitory response in individuals immune to malaria. A novel ultrasonic atomization approach for the formulation of biodegradable poly(lactic-co-glycolic acid) (PLGA) microparticles of malaria DNA vaccines encoding MSP1-19 is presented here. After condensing the plasmid DNA (pDNA) molecules with the cationic polymer polyethylenimine (PEI), a 40 kHz ultrasonic atomization frequency was used to formulate PLGA microparticles at a flow rate of 18 mL h⁻¹. High levels of gene expression and moderate cytotoxicity in COS-7 cells were achieved with the condensed pDNA at a nitrogen-to-phosphate (N/P) ratio of 20, demonstrating enhanced cellular uptake and expression of the transgene. The ability of the microparticles to convey pDNA was examined by characterizing the formulated microparticles. The microparticles displayed Z-average hydrodynamic diameters of 1.50–2.10 μm and zeta potentials of 17.8–23.2 mV. The encapsulation efficiencies were between 78 and 83%, and 76–85% of the embedded malaria pDNA molecules were released under physiological conditions in vitro. These results indicate that PLGA microparticles can be employed as potential gene delivery systems to antigen-presenting cells for the prevention of malaria.
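As a back-of-the-envelope illustration of the N/P (nitrogen-to-phosphate) ratio mentioned above, the sketch below computes the PEI mass needed for a given pDNA mass, using the common formulation convention of ~43.1 g/mol per protonatable PEI nitrogen and ~330 g/mol per nucleotide; the quantities and helper name are illustrative assumptions, not values taken from the paper.

# Back-of-the-envelope N/P ratio calculation (a standard formulation
# convention, not code from the paper; masses below are illustrative).
# N/P = moles of PEI nitrogen / moles of DNA phosphate.
PEI_N_MW = 43.1    # g/mol per protonatable nitrogen repeat unit of PEI
DNA_P_MW = 330.0   # g/mol average per nucleotide (one phosphate each)

def pei_mass_for_np_ratio(pdna_mass_ug, np_ratio=20):
    """Mass of PEI (ug) needed to condense pdna_mass_ug of plasmid DNA
    at the requested nitrogen-to-phosphate (N/P) ratio."""
    moles_phosphate = pdna_mass_ug / DNA_P_MW
    moles_nitrogen = np_ratio * moles_phosphate
    return moles_nitrogen * PEI_N_MW

# Example: PEI required for 10 ug of pDNA at N/P = 20 (as in the study).
print(f"{pei_mass_for_np_ratio(10, 20):.1f} ug PEI")  # ~26.1 ug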
Abstract:
Due to their unobtrusive nature, vision-based approaches to tracking sports players have been preferred over wearable sensors, as they do not require the players to be instrumented for each match. However, due to heavy occlusion between players, variation in resolution and pose, and fluctuating illumination conditions, tracking players continuously is still an unsolved vision problem. For tasks like clustering and retrieval, noisy data (i.e. missing and false player detections) is problematic as it generates discontinuities in the input data stream. One method of circumventing this issue is to use an occupancy map, where the field is discretised into a series of zones and a count of player detections in each zone is obtained. A series of frames can then be concatenated to represent a set-play or an example of team behaviour. A problem with this approach, though, is that the compressibility is low (i.e. the variability in the feature space is extremely high). In this paper, we propose the use of a bilinear spatiotemporal basis model with a role representation, operating in a low-dimensional space, to clean up the noisy detections. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras, ran a state-of-the-art real-time player detector on approximately 200,000 frames of data, and compared our results to manually labelled data.
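To make the occupancy-map representation concrete, here is a minimal Python sketch; the grid resolution and function names are our assumptions (only the pitch dimensions are the nominal field-hockey values), not details from the paper.

import numpy as np

# Toy sketch of the occupancy-map representation described above
# (grid size is a made-up value, not the paper's choice).
FIELD_W, FIELD_H = 91.4, 55.0   # field-hockey pitch in metres (nominal)
GRID_W, GRID_H = 10, 6          # number of zones along each axis (assumed)

def occupancy_map(detections):
    """Count player detections per zone for one frame.

    detections: list of (x, y) positions in metres.
    Returns a flattened GRID_H x GRID_W count vector.
    """
    counts = np.zeros((GRID_H, GRID_W), dtype=int)
    for x, y in detections:
        col = min(int(x / FIELD_W * GRID_W), GRID_W - 1)
        row = min(int(y / FIELD_H * GRID_H), GRID_H - 1)
        counts[row, col] += 1
    return counts.ravel()

def set_play_descriptor(frames):
    """Concatenate per-frame occupancy maps into one feature vector
    representing a set-play or team-behaviour example."""
    return np.concatenate([occupancy_map(f) for f in frames])

# Example: two frames, three (noisy) detections each.
frames = [[(10.2, 5.1), (45.0, 30.3), (80.7, 50.0)],
          [(11.0, 6.0), (44.1, 29.8), (80.0, 49.2)]]
print(set_play_descriptor(frames).shape)  # (120,) = 2 frames x 60 zones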
Abstract:
Business process models have traditionally been an effective way of examining business practices to identify areas for improvement. While common information-gathering approaches are generally effective, they can be quite time-consuming and carry the risk of introducing inaccuracies when information is forgotten or incorrectly interpreted by analysts. In this study, the potential of a role-playing approach to process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions as they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. Empirical data obtained in this study suggest that this approach may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.
Abstract:
Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (like GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8] Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest plane algorithm such that the distribution of a reduced vector modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to seal the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we compute a classic leaky NTRUSign signature and hide it with Gaussian noise, using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
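The masking idea borrowed from Lyubashevsky's signatures can be sketched in one dimension as follows. This is a toy illustration of Gaussian rejection sampling, not the paper's concrete scheme; all parameters and names are made up for exposition.

import math
import random

# Toy sketch (not the paper's actual scheme): Lyubashevsky-style
# rejection sampling. A leak-prone, secret-dependent value is hidden by
# adding Gaussian noise y, and z = y + secret_part is only released with
# a probability chosen so that the distribution of z is independent of
# the secret. Parameters below are illustrative only.

SIGMA = 100.0   # standard deviation of the masking noise (toy value)
M = 3.0         # rejection constant; larger M means fewer rejections

def gaussian(x, sigma):
    """Density (up to normalization) of a centered Gaussian."""
    return math.exp(-x * x / (2 * sigma * sigma))

def sample_gaussian(sigma):
    """Toy sampler: rounds a continuous Gaussian to the nearest integer."""
    return round(random.gauss(0, sigma))

def hide(secret_part):
    """Return z = y + secret_part, resampling until the rejection test
    passes, so the accepted z follows a fixed Gaussian regardless of
    secret_part (which is what makes the output zero-knowledge)."""
    while True:
        y = sample_gaussian(SIGMA)
        z = y + secret_part
        # Accept with probability f(z) / (M * g(z)), where f is the
        # target Gaussian and g the shifted proposal; conditioned on
        # acceptance, z ~ Gaussian(0, SIGMA) independent of the secret.
        accept_prob = gaussian(z, SIGMA) / (M * gaussian(y, SIGMA))
        if random.random() < min(1.0, accept_prob):
            return z

print(hide(secret_part=7))  # output distribution does not depend on 7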
Abstract:
This chapter analyses recent policy reforms in the national history curriculum in both Australia and the Russian Federation. It analyses those emphases in the national history curriculum that depict new representations and historiography, and the ways in which these are foregrounded in school history textbooks. In doing so, it considers the debates about which versions of the nation's past are deemed significant, and what should be transmitted to future generations of citizens. In this discussion of national history curricula, consideration is given to the curriculum's officially defined status as an instrument in the process of ideological transformation and nation-building. The chapter also examines how history textbooks are implicated in this process, in terms of reproducing and representing the content that is selected and emphasised in a national history curriculum.
Abstract:
Background: Premature aging syndromes recapitulate many aspects of natural aging and provide insight into this phenomenon at a molecular and cellular level. The progeria syndromes appear to cause rapid aging through disruption of normal nuclear structure. Recently, a coding mutation (c.34G > A [p.A12T]) in the Barrier to Autointegration Factor 1 (BANF1) gene was identified as the genetic basis of Néstor-Guillermo Progeria Syndrome (NGPS). This mutation was reported to destabilize the BANF1 protein, causing a disruption of the nuclear envelope structure. Results: Here we demonstrate that the BANF1 A12T protein is in fact correctly folded and stable, and that the observed phenotype is likely due to the disruption of the DNA-binding surface of the A12T mutant. Using biochemical assays, we demonstrate that the BANF1 A12T protein is impaired in its ability to bind DNA, while its interaction with nuclear envelope proteins is unperturbed. Consistent with this, we demonstrate that ectopic expression of the mutant protein induces the NGPS cellular phenotype, while the protein localizes normally to the nuclear envelope. Conclusions: Our study clarifies the role of the A12T mutation in NGPS patients, which will be of importance for understanding the development of the disease.
Abstract:
An increasing number of people seek health advice on the web using search engines; this poses challenging problems for current search technologies. In this paper we report an initial study of the effectiveness of current search engines in retrieving relevant information for diagnostic medical circumlocutory queries, i.e., queries issued by people seeking information about their health condition using a description of the symptoms they observe (e.g. hives all over body) rather than the medical term (e.g. urticaria). Such queries frequently arise when people are unfamiliar with a domain or its language, and they are common among health information seekers attempting to self-diagnose or self-treat. Our analysis reveals that current search engines are not equipped to effectively satisfy such information needs; this can have potentially harmful outcomes for people's health. Our results advocate for more research into developing information retrieval methods to support such complex information needs.
Abstract:
This paper addresses the development of trust in the use of Open Data through the incorporation of appropriate authentication and integrity parameters, for use by end-user Open Data application developers, in an architecture for trustworthy Open Data services. The advantage of this architecture is that it is far more scalable and is not another certificate-based hierarchy with its attendant problems of certificate revocation management. With the use of a Public File, if a key is compromised, it is a simple matter for the single responsible entity to replace the key pair with a new one and re-perform the data file signing process. Under this proposed architecture, the Open Data environment does not interfere with the internal security schemes that might be employed by the entity. However, the architecture incorporates, when needed, parameters from the entity (e.g. the person who authorized publishing as Open Data) at the time that datasets are created or added.
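A minimal sketch of the signing and verification flow implied by such a Public File scheme, assuming Ed25519 keys and the Python cryptography package; the dataset contents and variable names are illustrative, not taken from the paper.

# Illustrative sketch of a Public File signing scheme (assumed details:
# Ed25519 keys via the Python `cryptography` package). The publisher
# signs each dataset; consumers fetch the publisher's public key from a
# well-known Public File and verify before trusting the data.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# Publisher side: generate a key pair and sign a dataset.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()   # published in the Public File

dataset = b"station,pm2.5\nA,12\nB,7\n"
signature = private_key.sign(dataset)   # distributed with the dataset

# Consumer side: verify the dataset against the Public File's key.
try:
    public_key.verify(signature, dataset)
    print("dataset is authentic and unmodified")
except InvalidSignature:
    print("verification failed: do not trust this dataset")

# On key compromise, the single responsible entity replaces the key
# pair in the Public File and re-signs its data files; no certificate
# revocation machinery is needed.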
Abstract:
Despite significant improvements in capacity-distortion performance, computationally efficient capacity control is still lacking in recent watermarking schemes. In this paper, we propose an efficient capacity control framework that substantiates the notion of watermarking capacity control as the process of maintaining "acceptable" distortion and running time while attaining the required capacity. The necessary analysis and experimental results on capacity control are reported to address practical aspects of the watermarking capacity problem in dynamic (size) payload embedding.
Abstract:
Process improvement and innovation are risky endeavors, like swimming in unknown waters. In this chapter, I will discuss how process innovation through BPM can benefit from Research-as-a-Service, that is, from the application of research concepts in the processes of BPM projects. A further subject will be how innovations can be converted from confidence-based to evidence-based models through the affordances of digital infrastructures such as large-scale enterprise software or social media. I will introduce the relevant concepts, provide illustrations of digital capabilities that allow for innovation, and share a number of key takeaway lessons for how organizations can innovate on the basis of digital opportunities and the principles of evidence-based BPM: grounding all process decisions in facts rather than fiction.
Abstract:
The control of environmental factors in open-office environments, such as lighting and temperature, is becoming increasingly automated. This development means that office inhabitants are losing the ability to manually adjust environmental conditions according to their needs. In this paper we describe the design, use and evaluation of MiniOrb, a system that employs ambient and tangible interaction mechanisms to allow inhabitants of office environments to maintain awareness of environmental factors, report on their own subjectively perceived office comfort levels, and see how these compare to group-average preferences. The system is complemented by a mobile application, which enables users to see and set the same sensor values and preferences using a screen-based interface. We give an account of the system's design and outline the results of an in-situ trial and user study. Our results show that devices combining ambient and tangible interaction approaches are well suited to the task of recording indoor climate preferences, and afford a rich set of possible interactions that can complement those enabled by more conventional screen-based interfaces.