932 results for signature inversion
Abstract:
We used our TopSig open-source indexing and retrieval tool to produce runs for the ShARe/CLEF eHealth 2013 track. TopSig was used to produce runs using the query fields and, where appropriate, the provided discharge summaries. Although the improvement was not great, TopSig was able to gain some benefit from utilising the discharge summaries, though the software needed to be modified to support this. This was part of a larger experiment aimed at determining the applicability and limits of signature-based approaches.
Abstract:
Clustering is an important technique for organising and categorising web-scale document collections. The main challenges in clustering the billions of documents available on the web are the processing power required and the sheer size of the datasets involved. More importantly, it is nigh impossible to generate ground-truth labels for a general web collection containing billions of documents spanning a vast taxonomy of topics; yet document clusterings are most commonly evaluated by comparison to such a labeled ground truth. This paper presents a clustering and labeling solution in which Wikipedia is clustered and the hundreds of millions of web documents in ClueWeb12 are mapped onto those clusters. The solution rests on the assumption that Wikipedia covers such a wide range of diverse topics that it represents a small-scale web. We found it possible to perform this web-scale clustering and labeling process on one desktop computer in under a couple of days for a Wikipedia clustering solution containing about 1,000 clusters; solutions with finer granularity, such as 10,000 or 50,000 clusters, take longer to execute. These results were evaluated using a set of external data.
Abstract:
This paper presents an overview of the strengths and limitations of existing and emerging geophysical tools for landform studies. The objectives are to discuss recent technical developments and to provide a review of relevant recent literature, with a focus on propagating field methods with terrestrial applications. For various methods in this category, including ground-penetrating radar (GPR), electrical resistivity (ER), seismics, and electromagnetic (EM) induction, the technical backgrounds are introduced, followed by a section on novel developments relevant to landform characterization. For several decades, GPR has been popular for characterization of the shallow subsurface and in particular sedimentary systems. Novel developments in GPR include the use of multi-offset systems to improve signal-to-noise ratios and data collection efficiency, amongst others, and the increased use of 3D data. Multi-electrode ER systems have become popular in recent years as they allow for relatively fast and detailed mapping. Novel developments include time-lapse monitoring of dynamic processes as well as the use of capacitively-coupled systems for fast, non-invasive surveys. EM induction methods are especially popular for fast mapping of spatial variation, but can also be used to obtain information on the vertical variation in subsurface electrical conductivity. In recent years several examples of the use of plane wave EM for characterization of landforms have been published. Seismic methods for landform characterization include seismic reflection and refraction techniques and the use of surface waves. A recent development is the use of passive sensing approaches. The use of multiple geophysical methods, which can benefit from the sensitivity to different subsurface parameters, is becoming more common.
Strategies for coupled and joint inversion of complementary datasets will, once more widely available, benefit the geophysical study of landforms. Three case studies are presented on the use of electrical and GPR methods for characterization of landforms in the range of meters to 100s of meters in dimension. In a study of polygonal patterned ground in the Saginaw Lowlands, Michigan, USA, electrical resistivity tomography was used to characterize differences in subsurface texture and water content associated with polygon-swale topography. Also, a sand-filled thermokarst feature was identified using electrical resistivity data. The second example is on the use of constant spread traversing (CST) for characterization of large-scale glaciotectonic deformation in the Ludington Ridge, Michigan. Multiple CST surveys parallel to an ~60 m high cliff, where broad (~100 m) synclines and narrow clay-rich anticlines are visible, illustrated that at least one of the narrow structures extended inland. A third case study discusses internal structures of an eolian dune on a coastal spit in New Zealand. Both 35 and 200 MHz GPR data, which clearly identified a paleosol and internal sedimentary structures of the dune, were used to improve understanding of the development of the dune, which may shed light on paleo-wind directions.
Abstract:
Electrical resistivity of soils and sediments is strongly influenced by the presence of interstitial water. Taking advantage of this dependency, electrical-resistivity imaging (ERI) can be effectively utilized to estimate subsurface soil-moisture distributions. The ability to obtain spatially extensive data combined with time-lapse measurements provides further opportunities to understand links between land use and climate processes. In natural settings, spatial and temporal changes in temperature and porewater salinity influence the relationship between soil moisture and electrical resistivity. Apart from environmental factors, technical, theoretical, and methodological ambiguities may also interfere with accurate estimation of soil moisture from ERI data. We have examined several of these complicating factors using data from a two-year study at a forest-grassland ecotone, a boundary between neighboring but different plant communities. At this site, temperature variability accounts for approximately 20%-45% of resistivity changes from cold winter to warm summer months. Temporal changes in groundwater conductivity (mean = 650 µS/cm, σ = 57.7) and a roughly 100-µS/cm spatial difference between the forest and grassland had only a minor influence on the moisture estimates. Significant seasonal fluctuations in temperature and precipitation had negligible influence on the basic measurement errors in the data sets. Extracting accurate temporal changes from ERI can be hindered by nonuniqueness of the inversion process and uncertainties related to time-lapse inversion schemes. The accuracy of soil moisture obtained from ERI depends on all of these factors, in addition to the empirical parameters that define the petrophysical soil-moisture/resistivity relationship. Many of the complicating factors and modifying variables involved in accurately quantifying soil-moisture changes with ERI can be accounted for using field and theoretical principles.
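The temperature correction and petrophysical conversion described above are commonly handled with a ratio correction to a 25 °C reference followed by an Archie-type relation. A minimal sketch; the coefficient 0.02/°C, the Archie exponents, and all parameter values are typical illustrative assumptions, not values calibrated in this study:

```python
# Illustrative sketch: correct field resistivity to 25 degC, then invert an
# Archie-type petrophysical relation for water saturation.

def correct_to_25c(rho_t, temp_c, alpha=0.02):
    """Ratio temperature correction: rho_25 = rho_T * (1 + alpha*(T - 25))."""
    return rho_t * (1.0 + alpha * (temp_c - 25.0))

def saturation_from_resistivity(rho_25, rho_water, porosity, m=1.5, n=2.0):
    """Invert Archie's law rho = rho_w * phi**(-m) * S**(-n) for saturation S."""
    formation_factor = porosity ** (-m)
    return (rho_water * formation_factor / rho_25) ** (1.0 / n)
```

Saturation falls as corrected resistivity rises; the quality of the estimate hinges on how well alpha, m, and n are constrained at the site.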
Abstract:
Cryptographic hash functions are an important tool of cryptography and play a fundamental role in efficient and secure information processing. A hash function processes an arbitrary finite-length input message to a fixed-length output referred to as the hash value. As a security requirement, a hash value should not serve as an image for two distinct input messages, and it should be difficult to find the input message from a given hash value. Secure hash functions serve data integrity, non-repudiation and authenticity of the source in conjunction with digital signature schemes. Keyed hash functions, also called message authentication codes (MACs), serve data integrity and data origin authentication in the secret-key setting. The building blocks of hash functions can be designed using block ciphers, modular arithmetic, or from scratch. The design principles of the popular Merkle–Damgård construction are followed in almost all widely used standard hash functions such as MD5 and SHA-1.
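The Merkle–Damgård iteration mentioned above can be sketched in a few lines: pad the message, then fold a fixed-input-length compression function over the blocks. The toy compression function, 4-byte block size and IV below are illustrative stand-ins, not the internals of MD5 or SHA-1:

```python
BLOCK = 4          # toy block size in bytes
MASK = 0xFFFFFFFF  # 32-bit chaining value

def compress(h, block):
    """Toy compression function mixing the chaining value with a message block."""
    m = int.from_bytes(block, "big")
    return ((h ^ m) * 0x9E3779B1 + 0x5BD1E995) & MASK

def md_hash(msg: bytes, iv=0x67452301) -> int:
    """Merkle-Damgard: pad with a 0x80 byte, zeros, and the message length,
    then iterate the compression function block by block."""
    padded = msg + b"\x80"
    while len(padded) % BLOCK != 0:
        padded += b"\x00"
    padded += len(msg).to_bytes(BLOCK, "big")  # length strengthening
    h = iv
    for i in range(0, len(padded), BLOCK):
        h = compress(h, padded[i:i + BLOCK])
    return h
```

The appended length block ("Merkle–Damgård strengthening") is what makes collision resistance of the compression function carry over to the full hash.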
Abstract:
At CRYPTO 2006, Halevi and Krawczyk proposed two randomized hash function modes and analyzed the security of digital signature algorithms based on these constructions. They showed that the security of signature schemes based on the two randomized hash function modes relies on properties similar to the second preimage resistance rather than on the collision resistance property of the hash functions. One of the randomized hash function modes was named the RMX hash function mode and was recommended for practical purposes. The National Institute of Standards and Technology (NIST), USA standardized a variant of the RMX hash function mode and published this standard in the Special Publication (SP) 800-106. In this article, we first discuss a generic online birthday existential forgery attack of Dang and Perlner on the RMX-hash-then-sign schemes. We show that a variant of this attack can be applied to forge the other randomize-hash-then-sign schemes. We point out practical limitations of the generic forgery attack on the RMX-hash-then-sign schemes. We then show that these limitations can be overcome for the RMX-hash-then-sign schemes if it is easy to find fixed points for the underlying compression functions, such as for the Davies-Meyer construction used in the popular hash functions such as MD5 designed by Rivest and the SHA family of hash functions designed by the National Security Agency (NSA), USA and published by NIST in the Federal Information Processing Standards (FIPS). We show an online birthday forgery attack on this class of signatures by using a variant of Dean’s method of finding fixed point expandable messages for hash functions based on the Davies-Meyer construction. This forgery attack is also applicable to signature schemes based on the variant of RMX standardized by NIST in SP 800-106. We discuss some important applications of our attacks and discuss their applicability on signature schemes based on hash functions with ‘built-in’ randomization. 
Finally, we compare our attacks on randomize-hash-then-sign schemes with the generic forgery attacks on the standard hash-based message authentication code (HMAC).
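The Davies-Meyer fixed points central to the attack above are easy to exhibit concretely: the compression function is f(h, m) = E_m(h) XOR h, so choosing h* = E_m^{-1}(0) gives f(h*, m) = 0 XOR h* = h*. A toy demonstration with an invented, cryptographically worthless 16-bit affine "block cipher" (all names and parameters here are illustrative):

```python
MOD = 1 << 16  # 16-bit toy state

def E(key, x):
    """Toy invertible block cipher: affine map, keyed by the message block."""
    return (5 * x + key) % MOD

def D(key, y):
    """Inverse of E."""
    return ((y - key) * pow(5, -1, MOD)) % MOD

def davies_meyer(h, m):
    """Davies-Meyer compression: h' = E_m(h) XOR h."""
    return E(m, h) ^ h

def fixed_point(m):
    """For any block m, h* = D_m(0) satisfies davies_meyer(h*, m) = h*."""
    return D(m, 0)
```

Because E is invertible in its plaintext input, such an h* exists for every block m, which is exactly the property the forgery attacks exploit.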
Abstract:
Halevi and Krawczyk proposed a message randomization algorithm called RMX as a front-end tool to hash-then-sign digital signature schemes such as DSS and RSA, in order to free them of their reliance on the collision resistance property of the hash functions. They showed that to forge an RMX-hash-then-sign signature scheme, one has to solve a cryptanalytical task related to finding second preimages for the hash function. In this article, we show how to use Dean's method of finding expandable messages, used for finding a second preimage in the Merkle-Damgård hash function, to existentially forge a signature scheme based on a t-bit RMX-hash function that uses Davies-Meyer compression functions (e.g., MD4, MD5, the SHA family) in 2^(t/2) chosen messages plus 2^(t/2+1) off-line operations of the compression function, and a similar amount of memory. This forgery attack also works on signature schemes that use Davies-Meyer schemes and a variant of RMX published by NIST in its Draft Special Publication (SP) 800-106. We discuss some important applications of our attack.
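Dean's trick described above can be shown end to end at toy scale: with a 16-bit Davies-Meyer chain one can brute-force a "linking" block from the IV to a fixed point h*, after which the fixed-point block can be repeated any number of times without changing the chaining value. A self-contained sketch with an invented affine toy cipher (a real attack must additionally match the length padding, which is what the expandable message is built to do, and the brute-force search below is replaced by 2^(t/2) birthday work):

```python
MOD = 1 << 16

def compress(h, m):
    """Toy Davies-Meyer step: h' = E_m(h) XOR h with E_m(x) = 5x + m mod 2^16."""
    return ((5 * h + m) % MOD) ^ h

def chain(iv, blocks):
    """Iterate the compression function over a list of message blocks."""
    h = iv
    for m in blocks:
        h = compress(h, m)
    return h

IV = 0x0123
M = 0x4D4D                              # block we want a fixed point for
H_STAR = (-M * pow(5, -1, MOD)) % MOD   # E_M(H_STAR) = 0, so compress(H_STAR, M) = H_STAR

# Brute-force a linking block that maps IV to the fixed point H_STAR.
link = next(b for b in range(MOD) if compress(IV, b) == H_STAR)

# Messages of wildly different lengths now share one chaining value.
assert chain(IV, [link] + [M] * 1) == chain(IV, [link] + [M] * 1000) == H_STAR
```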
Abstract:
This paper presents an approach to mobile robot localization, place recognition and loop closure using a monostatic ultra-wide band (UWB) radar system. The UWB radar is a time-of-flight based range measurement sensor that transmits short pulses and receives waves reflected from objects in the environment. The main idea of the proposed localization method is to treat the received waveform as a signature of a place. The resulting echo waveform is very complex and depends strongly on the position of the sensor with respect to surrounding objects; on the other hand, the sensor receives similar waveforms from the same positions. Moreover, the directional characteristics of the dipole antenna are almost omnidirectional. Therefore, we can localize the sensor by finding similar waveforms in a waveform database. This paper proposes a place recognition method based on waveform matching, presents a number of experiments that illustrate the high position estimation accuracy of our UWB radar-based localization system, and shows the resulting loop detection performance in a typical indoor office environment and a forest.
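The waveform-as-signature idea amounts to nearest-neighbour matching of echo waveforms against a database. A minimal sketch using normalized cross-correlation as the similarity score; the short made-up vectors below stand in for real UWB echo recordings, and the matching criterion is an assumption for illustration, not the paper's exact method:

```python
import math

def norm_corr(a, b):
    """Normalized correlation between two equal-length waveforms."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recognize_place(query, database):
    """Return the key of the stored waveform that best matches the query."""
    return max(database, key=lambda k: norm_corr(query, database[k]))

# Toy 'echo signatures' recorded at three known places.
database = {
    "corridor": [0.0, 0.9, 0.4, 0.1, 0.0, 0.2, 0.1, 0.0],
    "office":   [0.0, 0.2, 0.8, 0.7, 0.3, 0.0, 0.1, 0.0],
    "forest":   [0.1, 0.1, 0.2, 0.3, 0.5, 0.8, 0.6, 0.2],
}
# A slightly perturbed re-observation of the office waveform.
query = [0.05, 0.25, 0.75, 0.65, 0.35, 0.05, 0.1, 0.0]
```

In practice the database would also store a small time-shift search per comparison, since echoes are not perfectly aligned between visits.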
Abstract:
Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (like GGH [9], or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8] Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest plane algorithm such that the distribution of a vector reduced modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to sealing the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we produce a classic leaky NTRUSign signature and hide it with Gaussian noise using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
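The noise-hiding step can be illustrated in one dimension: a secret-dependent value v is masked by a Gaussian sample, and rejection sampling (the technique from Lyubashevsky's signatures referred to above) makes the output distribution independent of v. A toy sketch; the scalar setting and all parameters are illustrative assumptions, whereas the real scheme works over lattice vectors:

```python
import math
import random

def hide(v, sigma=100.0, tau=6.0, rng=random):
    """Return z distributed (approximately) as the truncated N(0, sigma),
    regardless of v: sample z ~ N(v, sigma), then rejection-sample."""
    # M bounds the density ratio rho_0(z) / rho_v(z) on |z| <= tau*sigma.
    M = math.exp((v * v + 2 * tau * sigma * abs(v)) / (2 * sigma * sigma))
    while True:
        z = v + rng.gauss(0.0, sigma)   # candidate, centred on the leaky value
        if abs(z) > tau * sigma:
            continue                    # outside supported range; try again
        ratio = math.exp((-z * z + (z - v) ** 2) / (2 * sigma * sigma))
        if rng.random() < ratio / M:    # accept with prob rho_0(z)/(M*rho_v(z))
            return z
```

After acceptance, z follows the target zero-centred Gaussian for any v satisfying the bound used in M, so observing many signatures reveals nothing about the leaky value.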
Abstract:
When verifying or reverse-engineering digital circuits, one often wants to identify and understand small components in a larger system. A possible approach is to show that the sub-circuit under investigation is functionally equivalent to a reference implementation. In many cases, this task is difficult as one may not have full information about the mapping between input and output of the two circuits, or because the equivalence depends on settings of control inputs. We propose a template-based approach that automates this process. It extracts a functional description for a low-level combinational circuit by showing it to be equivalent to a reference implementation, while synthesizing an appropriate mapping of input and output signals and setting of control signals. The method relies on solving an exists/forall problem using an SMT solver, and on a pruning technique based on signature computation.
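The signature-based pruning mentioned at the end works by simulating both circuits on a set of input vectors and discarding any candidate input mapping whose output signatures disagree; only the survivors need the expensive exists/forall SMT check. A small sketch using full truth tables as signatures (the two toy "circuits" and all names are invented for illustration):

```python
from itertools import permutations, product

def reference(a, b, c):
    """Reference implementation: out = (a AND b) OR c."""
    return (a & b) | c

def mystery(x, y, z):
    """Low-level circuit under investigation (inputs in unknown order)."""
    return (y & z) | x

def signature(fn, perm):
    """Output signature of fn with its inputs permuted by perm, over the
    full truth table (a random input sample would be used at scale)."""
    return tuple(fn(*[bits[i] for i in perm]) for bits in product((0, 1), repeat=3))

ref_sig = signature(reference, (0, 1, 2))
# Prune: keep only the input mappings whose signature matches the reference.
candidates = [p for p in permutations(range(3)) if signature(mystery, p) == ref_sig]
```

Here signatures over the full truth table already prove equivalence; for larger circuits they only prune, and the SMT solver certifies the remaining candidates while also synthesizing control-signal settings.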
Abstract:
The structures of the cocrystalline adducts of 3,5-dinitrobenzoic acid (3,5-DNBA) with 4-aminosalicylic acid (PASA), the 1:1 partial hydrate, C7H4N2O6·C7H7NO3·2H2O, (I), and with 2-hydroxy-3-(1H-indol-3-yl)propenoic acid (HIPA), the 1:1:1 d6-dimethylsulfoxide solvate, C7H4N2O6·C11H9NO3·C2D6OS, (II), are reported. The crystal substructure of (I) comprises two centrosymmetric hydrogen-bonded R2/2(8) homodimers, one with 3,5-DNBA, the other with PASA, and an R2/2(8) 3,5-DNBA-PASA heterodimer. In the crystal, inter-unit amine N-H...O and water O-H...O hydrogen bonds generate a three-dimensional supramolecular structure. In (II), the asymmetric unit consists of the three constituent molecules, which form an essentially planar cyclic hydrogen-bonded heterotrimer unit [graph set R2/3(17)] through carboxyl, hydroxy and amino groups. These units associate across a crystallographic inversion centre through the HIPA carboxylic acid group in an R2/2(8) hydrogen-bonding association, giving a zero-dimensional structure lying parallel to (100). In both structures, π-π interactions are present [minimum ring centroid separations: 3.6471(18) Å in (I) and 3.5819(10) Å in (II)].
Abstract:
Nth-Dimensional Truncated Polynomial Ring (NTRU) is a lattice-based public-key cryptosystem that offers encryption and digital signature solutions. It was designed by Silverman, Hoffstein and Pipher. The NTRU cryptosystem was patented by NTRU Cryptosystems Inc. (later acquired by Security Innovations) and is available as the IEEE 1363.1 and X9.98 standards. NTRU is resistant to attacks based on quantum computing, to which the standard RSA and ECC public-key cryptosystems are vulnerable. In addition, NTRU has performance advantages over these cryptosystems. Given this importance of NTRU, it is highly recommended to adopt NTRU as part of a cipher suite alongside widely used cryptosystems for internet security protocols and applications. In this paper, we present our analytical study of the implementation of the NTRU encryption scheme, which serves as a guideline for security practitioners who are new to lattice-based cryptography or even to cryptography in general. In particular, we show some non-trivial issues that should be considered towards a secure and efficient NTRU implementation.
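The core operation any NTRU implementation must get right is multiplication in the truncated polynomial ring Z_q[x]/(x^N - 1), i.e. cyclic convolution of coefficient vectors. A minimal reference sketch; the parameters below are toy values for illustration, not a secure NTRU parameter set:

```python
def ring_mul(a, b, N, q):
    """Multiply polynomials a, b (coefficient lists, degree < N) in
    Z_q[x]/(x^N - 1): plain cyclic convolution, O(N^2)."""
    c = [0] * N
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[(i + j) % N] = (c[(i + j) % N] + ai * bj) % q
    return c

# Toy parameters; real NTRU parameter sets use N in the hundreds and larger q.
N, q = 7, 64
x = [0, 1, 0, 0, 0, 0, 0]        # the polynomial x
x_pow_6 = [0, 0, 0, 0, 0, 0, 1]  # x^(N-1), so x * x^(N-1) = x^N = 1 in the ring
```

Production implementations replace this quadratic loop with faster convolution and, crucially, make it constant-time, one of the non-trivial issues the paper's guideline concerns.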
Abstract:
In recent years, the practice of contemporary dancers has altered significantly in the transition from canonical choreographic vocabularies to a proliferation of choreographic signatures within mainstream and independent dance. Dancers are often required to collaborate creatively on the formation of choreographic material, thus engaging conceptually with emerging cultural paradigms. This book explores the co-creative practice of contemporary dancers solely from the point of view of the dancer. It reveals multiple dancing perspectives, drawn from interviews, current writing and evocative accounts from inside the choreographic process, illuminating the myriad ways that dancers contribute to the production of contemporary dance culture. A key insight of the book is that a dancer's signature way of being is a 'moving identity', which incorporates past dance experience, anatomical structures and conditioned human movement as a self-in-process. The moving identity is the movement signature that the dancer forms throughout a career path.
Abstract:
As a precursor to the 2014 G20 Leaders’ Summit held in Brisbane, Australia, the Queensland Government sponsored a program of G20 Cultural Celebrations, designed to showcase the Summit’s host city. The cultural program’s signature event was the Colour Me Brisbane festival, a two-week ‘citywide interactive light and projection installations’ festival that was originally slated to run from 24 October to 9 November, but which was extended due to popular demand to conclude with the G20 Summit itself on 16 November. The Colour Me Brisbane festival comprised a series of projection displays that promoted visions of the city’s past, present, and future at landmark sites and iconic buildings throughout the city’s central business district, and thus transformed key buildings into forms of media architecture. In some instances the media architecture installations were interactive, allowing the public to control aspects of the projections through a computer interface situated in front of the building; however, the majority of the installations were not interactive in this sense. The festival was supported by a website that included information regarding the different visual and interactive displays and links to social media to support public discussion regarding the festival (Queensland Government 2014). Festival-goers were also encouraged to follow a walking-tour map of the projection sites that would take them on a 2.5 kilometre walk from Brisbane’s cultural precinct, through the city centre, concluding at parliament house. In this paper, we investigate the Colour Me Brisbane festival and the broader G20 Cultural Celebrations as a form of strategic placemaking—designed, on the one hand, to promote Brisbane as a safe, open, and accessible city in line with the City Council’s plan to position Brisbane as a ‘New World City’ (Brisbane City Council 2014).
On the other hand, it was deployed to counteract growing local concerns and tensions over the disruptive and politicised nature of the G20 Summit by engaging the public with the city prior to the heightened security and mobility restrictions of the Summit weekend. Harnessing perspectives from media architecture (Brynskov et al. 2013), urban imaginaries (Cinar & Bender 2007), and social media analysis, we take a critical approach to analysing the government-sponsored projections, which literally projected the city onto itself, and public responses to them via the official, and heavily promoted, social media hashtags (#colourmebrisbane and #g20cultural). Our critical framework extends the concepts of urban phantasmagoria and urban imaginaries into the emerging field of media architecture to scrutinise its potential for increased political and civic engagement. Walter Benjamin’s concept of phantasmagoria (Cohen 1989; Duarte, Firmino, & Crestani 2014) provides an understanding of urban space as spectacular projection, implicated in commodity and techno-culture. The concept of urban imaginaries (Cinar & Bender 2007; Kelley 2013)—that is, the ways in which citizens’ experiences of urban environments are transformed into symbolic representations through the use of imagination—similarly provides a useful framing device in thinking about the Colour Me Brisbane projections and their relation to the construction of place. Employing these critical frames enables us to examine the ways in which the installations open up the potential for multiple urban imaginaries—in the sense that they encourage civic engagement via a tangible and imaginative experience of urban space—while, at the same time, supporting a particular vision and way of experiencing the city, promoting a commodified, sanctioned form of urban imaginary. 
This paper aims to dissect the urban imaginaries intrinsic to the Colour Me Brisbane projections and to examine how those imaginaries were strategically deployed as place-making schemes that choreograph reflections about and engagement with the city.
Abstract:
This project has identified a molecular signature involved in functions critical to breast cancer progression and metastasis mediated by vitronectin, an abundant protein in human plasma, and vitronectin:insulin-like growth factor complexes. This may have significant implications in designing future therapeutic targets for patients with tumours overexpressing vitronectin and/or the components of the insulin-like growth factor system:vitronectin axis. In particular, the findings from this project have identified Cyr61 and CTGF as key mediators involved in vitronectin- and insulin-like growth factor-I:insulin-like growth factor-binding protein:vitronectin-induced breast cancer cell survival and migration.