Abstract:
In this survey, we review a number of the many “expressive” encryption systems that have recently been constructed from lattices, and explore the innovative techniques that underpin them.
Abstract:
The invention of asymmetric encryption back in the seventies was a conceptual leap that vastly increased the expressive power of the encryption of the time. For the first time, it allowed the sender of a message to designate the intended recipient in a cryptographic way, expressed as a “public key” that was related to but distinct from the “private key” that, alone, embodied the ability to decrypt. This made large-scale encryption a practical and scalable endeavour, and more than anything else—save the internet itself—led to the advent of electronic commerce as we know and practice it today.
Abstract:
Process modeling is a widely used technique for understanding, documenting, and redesigning the operations of organizations. The validation and use of process models is, however, hampered by the fact that only business analysts fully understand them in detail. This is a particular problem because business analysts are typically not domain experts. In this paper, we investigate to what extent the concept of verbalization can be adapted from object-role modeling to process models. To this end, we define an approach that automatically transforms BPMN process models into natural language texts, combining techniques from linguistics and graph decomposition in a flexible and accurate manner. The evaluation of the technique is based on a prototypical implementation and a test set of 53 BPMN process models, and shows that natural language texts can be generated in a reliable fashion.
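As an illustrative sketch of this kind of verbalization (a toy, not the paper's actual pipeline, which combines linguistic and graph-decomposition techniques; the process model and sentence templates below are invented):

```python
def verbalize(process):
    """Turn a toy process model -- a list of (actor, action, object)
    triples plus optional exclusive gateways -- into natural-language
    sentences, in the spirit of BPMN-to-text generation."""
    sentences = []
    for i, step in enumerate(process):
        if step[0] == "xor":                      # ("xor", condition, then, else)
            _, cond, then_t, else_t = step
            sentences.append(f"If {cond}, the {then_t[0]} {then_t[1]} the {then_t[2]}. "
                             f"Otherwise, the {else_t[0]} {else_t[1]} the {else_t[2]}.")
        else:
            actor, action, obj = step
            opener = "First," if i == 0 else "Then,"
            sentences.append(f"{opener} the {actor} {action} the {obj}.")
    return " ".join(sentences)

model = [("clerk", "receives", "order"),
         ("clerk", "checks", "stock"),
         ("xor", "the item is available",
          ("clerk", "ships", "order"),
          ("clerk", "rejects", "order"))]
print(verbalize(model))
```

A real system would also need sentence aggregation, referring-expression generation, and handling of parallel gateways and loops, which is where the graph-decomposition techniques come in.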
Abstract:
Background: Household food insecurity and physical activity are each important public-health concerns in the United States, but the relation between them has not been thoroughly investigated. Objective: We aimed to examine the association between food insecurity and physical activity in the U.S. population. Methods: Physical activity measured by accelerometry (PAM) and physical activity measured by questionnaire (PAQ) data from NHANES 2003–2006 were used. Individuals aged <6 y or >65 y, pregnant, with physical limitations, or with family income >350% of the poverty line were excluded. Food insecurity was measured by the USDA Household Food Security Survey Module. Adjusted ORs were calculated from logistic regression to identify the association between food insecurity and adherence to the physical-activity guidelines. Adjusted coefficients were obtained from linear regression to identify the association between food insecurity and sedentary/physical-activity minutes. Results: In children, food insecurity was not associated with adherence to physical-activity guidelines measured via PAM or PAQ, nor with sedentary minutes (P > 0.05). Food-insecure children did less moderate-to-vigorous physical activity than food-secure children (adjusted coefficient = −5.24, P = 0.02). In adults, food insecurity was significantly associated with adherence to physical-activity guidelines (adjusted OR = 0.72, P = 0.03 for PAM; OR = 0.84, P < 0.01 for PAQ) but was not associated with sedentary minutes (P > 0.05). Conclusion: Food-insecure children did less moderate-to-vigorous physical activity, and food-insecure adults were less likely to adhere to the physical-activity guidelines, than those without food insecurity.
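The adjusted ORs above come from logistic regression on survey data. As a hedged illustration of the underlying measure only (an unadjusted odds ratio with a Wald confidence interval, using made-up counts rather than NHANES data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Unadjusted odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    orr = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo = math.exp(math.log(orr) - z * se)
    hi = math.exp(math.log(orr) + z * se)
    return orr, lo, hi

# Hypothetical counts: food-insecure adults meeting / not meeting the
# guidelines versus food-secure adults (illustrative only).
print(odds_ratio_ci(60, 140, 100, 140))
```

An OR below 1, as here and in the abstract, indicates lower odds of guideline adherence among the food-insecure group; the study's estimates additionally adjust for covariates via logistic regression.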
Abstract:
INEX investigates focused retrieval from structured documents by providing large test collections of structured documents, uniform evaluation measures, and a forum for organizations to compare their results. This paper reports on the INEX 2014 evaluation campaign, which consisted of three tracks. The Interactive Social Book Search Track investigated user information-seeking behavior when interacting with various sources of information for realistic task scenarios, and how the user interface impacts search and the search experience. The Social Book Search Track investigated the relative value of authoritative metadata and user-generated content for search and recommendation, using a test collection with data from Amazon and LibraryThing, including user profiles and personal catalogues. The Tweet Contextualization Track investigated helping a user to understand a tweet by providing a short background summary generated from relevant Wikipedia passages aggregated into a coherent whole. INEX 2014 was an exciting year in which, for the third time, we ran our workshop as part of the CLEF labs. This paper gives an overview of all the INEX 2014 tracks, their aims and tasks, the test collections that were built, and the participants, and provides an initial analysis of the results.
Abstract:
The problem of clustering a large document collection is challenged not only by the number of documents and the number of dimensions, but also by the number and sizes of the clusters. Traditional clustering methods fail to scale when they need to generate a large number of clusters. Furthermore, when cluster sizes in the solution are heterogeneous, i.e., some of the clusters are very large, similarity measures tend to degrade. A ranking-based clustering method is proposed to deal with these issues in the context of the Social Event Detection task. Ranking scores are used to select a small number of the most relevant clusters against which to compare and place a document. Additionally, instead of conventional cluster centroids, cluster patches, hub-like sets of documents, are proposed to represent clusters. Text, temporal, spatial and visual content information collected from the social event images is utilized in calculating similarity. Results show that these strategies strike a balance between the performance and the accuracy of the clustering solution.
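A minimal sketch of the ranking-based placement idea (the similarity threshold, the patch representation, and the toy term-weight vectors are all hypothetical, not the authors' exact method or features):

```python
def cosine(u, v):
    """Cosine similarity of sparse vectors given as {term: weight} dicts."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = sum(w * w for w in u.values()) ** 0.5
    nv = sum(w * w for w in v.values()) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

def assign(doc, clusters, top_r=2, threshold=0.2):
    """Rank clusters by their best patch-document similarity, compare the
    document only against the top_r candidates, and place it in the first
    one above the threshold, or start a new cluster. Each cluster is a
    list of 'patch' documents (hub-like representatives)."""
    ranked = sorted(clusters,
                    key=lambda patch: max(cosine(doc, p) for p in patch),
                    reverse=True)
    for patch in ranked[:top_r]:
        if max(cosine(doc, p) for p in patch) >= threshold:
            patch.append(doc)   # a fuller version would keep only top hubs
            return patch
    clusters.append([doc])
    return clusters[-1]

clusters = [[{"concert": 1.0, "music": 0.5}], [{"marathon": 1.0, "race": 0.7}]]
c = assign({"music": 1.0, "concert": 0.3}, clusters)
print(len(clusters), len(c))   # → 2 2
```

The point of the ranking step is that a new document is compared against only a handful of candidate clusters rather than all of them, which is what lets the method scale to many clusters.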
Abstract:
This paper outlines the approach taken by the Speech, Audio, Image and Video Technologies laboratory and the Applied Data Mining Research Group (SAIVT-ADMRG) in the 2014 MediaEval Social Event Detection (SED) task. We participated in the event-based clustering subtask (subtask 1) and focused on investigating the incorporation of image features as another source of data to aid clustering. In particular, we developed a descriptor based on super-pixel segmentation that allows a low-dimensional feature incorporating both colour and texture information to be extracted and used within the popular bag-of-visual-words (BoVW) approach.
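The BoVW step can be sketched as follows, with a hypothetical codebook and toy 2-D descriptors standing in for the super-pixel colour/texture features (illustrative only, not the paper's actual descriptor):

```python
def nearest(codebook, desc):
    """Index of the codebook word closest (squared Euclidean) to a descriptor."""
    return min(range(len(codebook)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(codebook[i], desc)))

def bovw_histogram(codebook, descriptors):
    """Bag-of-visual-words: count how many local descriptors (e.g. one per
    super-pixel) fall on each codebook word, then L1-normalise."""
    counts = [0] * len(codebook)
    for d in descriptors:
        counts[nearest(codebook, d)] += 1
    total = sum(counts) or 1
    return [c / total for c in counts]

# Hypothetical 3-word codebook and four toy descriptors.
codebook = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
descs = [(0.1, 0.1), (0.9, 0.1), (0.2, 0.9), (0.1, 0.8)]
print(bovw_histogram(codebook, descs))   # → [0.25, 0.25, 0.5]
```

The resulting fixed-length histogram is what makes images of different sizes and segmentations directly comparable during clustering.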
Abstract:
Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (as in GGH [9] or NTRUSign [12]). This approach leaked some information on the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8] Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai’s nearest plane algorithm such that the distribution of a vector reduced modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they managed to obtain provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to seal the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we compute a classic leaky NTRUSign signature and hide it with Gaussian noise, using techniques present in Lyubashevsky’s signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vectors Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
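The hiding step can be illustrated in one dimension: rejection sampling makes the accepted output follow a centred Gaussian, statistically independent of a secret-dependent shift. This is a simplified sketch of the Fiat-Shamir-with-aborts idea from Lyubashevsky's signatures, not the paper's actual lattice-based scheme; the shift, standard deviation, and repetition constant M below are illustrative.

```python
import math, random

def hide(c, sigma, M, rng):
    """Rejection-sample z = y + c so that accepted outputs follow a centred
    Gaussian independent of the secret-dependent shift c. The real scheme
    works over NTRU lattices with discrete Gaussians."""
    while True:
        y = rng.gauss(0.0, sigma)          # masking noise
        z = y + c                          # leaky value shifted by noise
        # accept with prob  rho_sigma(z) / (M * rho_sigma(z - c))
        p = math.exp((c * c - 2 * z * c) / (2 * sigma * sigma)) / M
        if rng.random() < min(1.0, p):
            return z

rng = random.Random(1)
samples = [hide(c=3.0, sigma=30.0, M=1.5, rng=rng) for _ in range(4000)]
mean = sum(samples) / len(samples)
print(round(mean, 2))
```

Without the rejection step, the sample mean would concentrate near the shift c and leak it; with it, the mean concentrates near zero, which is the zero-knowledge property the paper needs.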
Abstract:
Understanding how families manage their finances represents an important research agenda, given the recent economic climate of debt and uncertainty. To better understand the economics of domestic settings, it is important to study the ways money and financial issues are collaboratively handled within families. Using an ethnographic approach, we studied the everyday financial practices of fifteen middle-income families. Our preliminary results show a strong tendency to live frugally: people apply varied and creative mechanisms to minimize their expenses and save money, seemingly irrespective of their income. We conclude by highlighting some implications for designing technologies to support household financial practices.
Abstract:
Digital transformations are not contained within the digital domain but are increasingly spilling over into the physical world. In this chapter, we analyse some of the transformations under way in cities today as they move towards becoming smart cities. We offer a critique of smart cities and a way forward, divided into three parts. First, we explore the concept of Smart Citizens in terms of both localities, the move towards a hyperlocal network, and the citizen’s role in the creation and use of data; we use the ‘Smart London’ plan drawn up by the Mayor of London to illustrate our discussion. Second, we turn to the civic innovations enabled by digital transformations and their potential impact on citizens and citizenship. Specifically, we are interested in the notion of social capital as an alternative form of in-kind currency and its function as an indicator of value, in order to ask whether digital transformations can give rise to ‘civic capital’, and how such a concept could help, for instance, a local government invite more representative residents and community champions to participate in community engagement for better urban planning. Third, we introduce a hybrid, location-based game under development by the design agency Preliminal Games in London, UK. This illustrative case critiques and highlights the current challenges to establishing a new economic model that bridges the digital/physical divide. The game provides a vehicle for us to explore how established principles and strategies in game design, such as immersive storytelling and goal setting, can be employed to encourage players to think of the interconnections of their hybrid digital/physical environments in new ways.
Abstract:
We incorporated a new Riemannian fluid registration algorithm into a general MRI analysis method called tensor-based morphometry to map the heritability of brain morphology in MR images from 23 monozygotic and 23 dizygotic twin pairs. All 92 3D scans were fluidly registered to a common template. Voxelwise Jacobian determinants were computed from the deformation fields to assess local volumetric differences across subjects. Heritability maps were computed from the intraclass correlations and their significance was assessed using voxelwise permutation tests. Lobar volume heritability was also studied using the ACE genetic model. The performance of this Riemannian algorithm was compared to a more standard fluid registration algorithm: 3D maps from both registration techniques displayed similar heritability patterns throughout the brain. Power improvements were quantified by comparing the cumulative distribution functions of the p-values generated from both competing methods. The Riemannian algorithm outperformed the standard fluid registration.
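The heritability maps above are built from intraclass correlations, with lobar volumes analysed under an ACE model. As a simpler hedged illustration of the underlying quantities (a double-entry ICC and Falconer's classic estimate h2 = 2 * (r_MZ - r_DZ), on invented toy twin data rather than the study's scans):

```python
def icc(pairs):
    """Double-entry intraclass correlation for twin pairs [(x1, x2), ...]:
    each pair is entered twice, once in each order, so the estimate does
    not depend on an arbitrary labelling of twins within a pair."""
    xs = [a for a, b in pairs] + [b for a, b in pairs]
    ys = [b for a, b in pairs] + [a for a, b in pairs]
    m = sum(xs) / len(xs)
    cov = sum((x - m) * (y - m) for x, y in zip(xs, ys)) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / len(xs)
    return cov / var

def falconer_h2(r_mz, r_dz):
    """Falconer's estimate: heritability = 2 * (r_MZ - r_DZ)."""
    return 2 * (r_mz - r_dz)

# Toy local volume measures for 3 MZ and 3 DZ twin pairs (illustrative).
mz = [(1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
dz = [(1.0, 1.0), (2.0, 3.0), (3.0, 2.0)]
print(falconer_h2(icc(mz), icc(dz)))   # → 1.0
```

In the study this kind of statistic is computed voxelwise on Jacobian determinants, with significance assessed by permutation tests rather than read off directly.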
Abstract:
The secretive 2011 Anti-Counterfeiting Trade Agreement – known in short by the catchy acronym ACTA – is a controversial trade pact designed to provide for stronger enforcement of intellectual property rights. The preamble to the treaty reads like pulp fiction – it raises moral panics about piracy, counterfeiting, organised crime, and border security. The agreement contains provisions on civil remedies and criminal offences; copyright law and trademark law; the regulation of the digital environment; and border measures. Memorably, Susan Sell called the international treaty a TRIPS Double-Plus Agreement, because its obligations far exceed those of the World Trade Organization's TRIPS Agreement 1994, and TRIPS-Plus Agreements, such as the Australia-United States Free Trade Agreement 2004. ACTA lacks the language of other international intellectual property agreements, which emphasise the need to balance the protection of intellectual property owners with the wider public interest in access to medicines, human development, and transfer of knowledge and technology. In Australia, there was much controversy both about the form and the substance of ACTA. While the Department of Foreign Affairs and Trade was a partisan supporter of the agreement, a wide range of stakeholders were openly critical. After holding hearings and taking note of the position of the European Parliament and the controversy in the United States, the Joint Standing Committee on Treaties in the Australian Parliament recommended the deferral of ratification of ACTA. This was striking as representatives of all the main parties agreed on the recommendation. The committee was concerned about the lack of transparency, due process, public participation, and substantive analysis of the treaty. 
There were also reservations about the ambiguity of the treaty text, and its potential implications for the digital economy, innovation and competition, plain packaging of tobacco products, and access to essential medicines. The treaty has provoked much soul-searching as to whether the Trick or Treaty reforms on the international treaty-making process in Australia have been compromised or undermined. Although ACTA stalled in the Australian Parliament, the debate over it is yet to conclude. There have been concerns in Australia and elsewhere that ACTA will be revived as a ‘zombie agreement’. Indeed, in March 2013, the Canadian government introduced a bill to ensure compliance with ACTA. Will it also be resurrected in Australia? Has it already been revived? There are three possibilities. First, the Australian government passed enhanced remedies with respect to piracy, counterfeiting and border measures in a separate piece of legislation – the Intellectual Property Laws Amendment (Raising the Bar) Act 2012 (Cth). Second, the Department of Foreign Affairs and Trade remains supportive of ACTA. It is possible, after further analysis, that the next Australian Parliament – to be elected in September 2013 – will ratify the treaty. Third, Australia is involved in the Trans-Pacific Partnership negotiations. The government has argued that ACTA should be a template for the Intellectual Property Chapter in the Trans-Pacific Partnership. The United States Trade Representative would prefer a regime even stronger than ACTA. This chapter provides a portrait of the Australian debate over ACTA. It is the account of an interested participant in the policy proceedings. This chapter first considers the deliberations and recommendations of the Joint Standing Committee on Treaties on ACTA. Second, there was a concern that ACTA had failed to provide appropriate safeguards with respect to civil liberties, human rights, consumer protection and privacy laws.
Third, there was a concern about the lack of balance in the treaty’s copyright measures: the definition of piracy is overbroad; the suite of civil remedies, criminal offences and border measures is excessive; and there is a lack of suitable protection for copyright exceptions, limitations and remedies. Fourth, there was a worry that the provisions on trademark law, intermediary liability and counterfeiting could have an adverse impact upon consumer interests, competition policy and innovation in the digital economy. Fifth, there was significant debate about the impact of ACTA on pharmaceutical drugs, access to essential medicines and health care. Sixth, there was concern over the lobbying by tobacco industries for ACTA – particularly given Australia’s leadership on tobacco control and the plain packaging of tobacco products. Seventh, there were concerns about the operation of border measures in ACTA. Eighth, the Joint Standing Committee on Treaties was concerned about the jurisdiction of the ACTA Committee, and the treaty’s protean nature. Finally, the chapter raises fundamental issues about the relationship between the executive and the Australian Parliament with respect to treaty-making. There is a need to reconsider the efficacy of the Trick or Treaty reforms passed by the Australian Parliament in the 1990s.
Abstract:
In this paper we have used simulations to make a conjecture about the coverage of a t-dimensional subspace of a d-dimensional parameter space of size n when performing k trials of Latin Hypercube sampling. This takes the form P(k,n,d,t) = 1 - e^(-k/n^(t-1)). We suggest that this coverage formula is independent of d and this allows us to make connections between building Populations of Models and Experimental Designs. We also show that Orthogonal sampling is superior to Latin Hypercube sampling in terms of allowing a more uniform coverage of the t-dimensional subspace at the sub-block size level. These ideas have particular relevance when attempting to perform uncertainty quantification and sensitivity analyses.
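The conjectured coverage formula can be checked empirically with a small simulation (a minimal sketch: grid-indexed Latin Hypercube trials projected onto the first t dimensions; the parameter values k, n, d, t below are arbitrary choices, not the paper's):

```python
import math, random

def latin_hypercube(n, d, rng):
    """One LHS trial: n points in d dimensions; in every dimension the
    bins 0..n-1 are each used exactly once (a random permutation)."""
    cols = [rng.sample(range(n), n) for _ in range(d)]
    return [tuple(col[i] for col in cols) for i in range(n)]

def coverage(k, n, d, t, rng):
    """Fraction of the n**t cells of a t-dimensional projection that are
    hit at least once over k LHS trials."""
    hit = set()
    for _ in range(k):
        for point in latin_hypercube(n, d, rng):
            hit.add(point[:t])        # project onto the first t dimensions
    return len(hit) / n**t

rng = random.Random(0)
k, n, d, t = 20, 10, 5, 2
empirical = coverage(k, n, d, t, rng)
predicted = 1 - math.exp(-k / n**(t - 1))   # P(k,n,d,t) from the conjecture
print(round(empirical, 3), round(predicted, 3))
```

The exponent matches a Poisson-style argument: k trials of n points spread k*n points over n**t projected cells, so the expected hits per cell are k / n**(t-1), and the formula is the probability a cell receives at least one hit, independent of d.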
Abstract:
Objective. To examine whether the T cell receptor (TCR) A or TCRB loci exhibit linkage with disease in multiplex rheumatoid arthritis (RA) families. Methods. A linkage study was performed in 184 RA families from the UK Arthritis and Rheumatism Council Repository, each containing at least 1 affected sibpair. The microsatellites D14S50, TCRA, and D14S64 spanning the TCRA locus, and D7S509, Vβ6.7, and D7S688 spanning the TCRB locus, were used as DNA markers. The subjects were genotyped using a semiautomated polymerase chain reaction-based method. Two-point and multipoint linkage analyses were performed. Results. Nonparametric single-marker logarithm of odds (LOD) scores were 0.49 (P = 0.07) for D14S50, 0.65 (P = 0.04) for TCRA, 0.07 (P = 0.29) for D14S64, 0.01 (P = 0.43) for D7S509, 0.0 (P = 0.50) for Vβ6.7, and 0.0 (P = 0.50) for D7S688. By multipoint analysis, there was no evidence of linkage at TCRB (LOD score 0), and the maximum LOD score at the TCRA locus was 0.37 (at D14S50). The presence of a susceptibility locus was excluded (LOD score < −2.0), with λ ≤ 1.8 at TCRA and ≤ 1.4 at TCRB. Conclusion. These linkage studies provide no significant evidence of a major germline-encoded TCRA or TCRB component of susceptibility to RA.
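As a hedged illustration of what a LOD score measures (a toy one-sided allele-sharing statistic for affected sibpairs with made-up counts, not the study's actual semiautomated analysis):

```python
import math

def sibpair_lod(shared, total):
    """Toy nonparametric LOD score: log10 likelihood ratio of the observed
    allele-sharing proportion among affected sibpairs versus the 0.5
    expected under no linkage; clamped to 0 when there is no excess sharing."""
    p = shared / total                       # MLE of the sharing proportion
    if p <= 0.5:                             # one-sided: only excess sharing counts
        return 0.0
    l_alt = shared * math.log10(p) + (total - shared) * math.log10(1 - p)
    l_null = total * math.log10(0.5)
    return l_alt - l_null

print(round(sibpair_lod(110, 200), 2))
```

A LOD of 3 is the conventional threshold for declaring linkage, and scores below −2 exclude it, which is the sense in which the study excludes a susceptibility locus at the TCR regions.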
Abstract:
With the smartphone revolution, consumer-focused mobile medical applications (apps) have flooded the market without restriction. We searched the market for commercially available apps on all mobile platforms that could provide automated risk analysis of the most serious skin cancer, melanoma. We tested 5 relevant apps against 15 images of previously excised skin lesions and compared the apps' risk grades to the known histopathologic diagnosis of the lesions. Two of the apps did not identify any of the melanomas. The remaining 3 apps obtained 80% sensitivity for melanoma risk identification; specificities for the 5 apps ranged from 20% to 100%. Each app provided its own grading and recommendation scale and included a disclaimer recommending regular dermatologist evaluation regardless of the analysis outcome. The results indicate that autonomous lesion analysis is not yet ready for use as a triage tool. More concerning is the lack of restrictions and regulations for these applications.
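The reported sensitivity and specificity follow from simple confusion-matrix counts. The counts below are hypothetical, chosen only to be consistent with the ranges reported (80% sensitivity, specificity as low as 20%), not the study's raw per-app data:

```python
def sensitivity(tp, fn):
    """True-positive rate: melanomas the app graded as high risk."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: benign lesions the app graded as low risk."""
    return tn / (tn + fp)

# Hypothetical counts for one app on 15 excised lesions
# (5 melanomas, 10 benign) -- illustrative only.
tp, fn, tn, fp = 4, 1, 2, 8
print(sensitivity(tp, fn), specificity(tn, fp))   # → 0.8 0.2
```

A triage tool with high sensitivity but very low specificity would flag most benign lesions as suspicious, while one that misses melanomas entirely (0% sensitivity, as two of the tested apps did) is actively unsafe; both failure modes motivate the paper's conclusion.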