60 results for Sigalon, Xavier.


Relevance:

10.00%

Publisher:

Abstract:

Process modeling is a widely used concept for understanding, documenting, and also redesigning the operations of organizations. The validation and use of process models is, however, hampered by the fact that only business analysts fully understand them in detail; this is a particular problem because business analysts are typically not domain experts. In this paper, we investigate to what extent the concept of verbalization can be adapted from object-role modeling to process models. To this end, we define an approach that automatically transforms BPMN process models into natural-language texts, combining techniques from linguistics and graph decomposition in a flexible and accurate manner. The evaluation of the technique is based on a prototypical implementation and involves a test set of 53 BPMN process models, showing that natural-language texts can be generated reliably.
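
The verbalization idea can be illustrated with a minimal sketch (not the authors' implementation; the activity format, actor name, and sentence templates below are assumptions, and the naive "verb + s" inflection stands in for proper linguistic realization):

```python
def verbalize(activities, actor="the clerk"):
    """Render an ordered list of (verb, object) activity labels as text.

    A real system would traverse the BPMN graph, handle gateways, and use
    morphological inflection; this sketch only shows the sentence-planning idea.
    """
    sentences = []
    for i, (verb, obj) in enumerate(activities):
        if i == 0:
            conn = "First"
        elif i == len(activities) - 1:
            conn = "Finally"
        else:
            conn = "Then"
        # naive third-person inflection: verb + "s"
        sentences.append(f"{conn}, {actor} {verb}s the {obj}.")
    return " ".join(sentences)

text = verbalize([("check", "invoice"), ("approve", "payment"),
                  ("archive", "document")])
```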


INEX investigates focused retrieval from structured documents by providing large test collections of structured documents, uniform evaluation measures, and a forum for organizations to compare their results. This paper reports on the INEX 2014 evaluation campaign, which consisted of three tracks. The Interactive Social Book Search Track investigated user information-seeking behavior when interacting with various sources of information for realistic task scenarios, and how the user interface affects search and the search experience. The Social Book Search Track investigated the relative value of authoritative metadata and user-generated content for search and recommendation, using a test collection with data from Amazon and LibraryThing, including user profiles and personal catalogues. The Tweet Contextualization Track investigated helping a user understand a tweet by providing a short background summary generated from relevant Wikipedia passages aggregated into a coherent summary. INEX 2014 was an exciting year in which the workshop ran for the third time as part of the CLEF labs. This paper gives an overview of all the INEX 2014 tracks, their aims and tasks, the test collections that were built, and the participants, and provides an initial analysis of the results.


The problem of clustering a large document collection is challenged not only by the number of documents and the number of dimensions, but also by the number and sizes of the clusters. Traditional clustering methods fail to scale when they need to generate a large number of clusters. Furthermore, when cluster sizes in the solution are heterogeneous, i.e. some of the clusters are large, similarity measures tend to degrade. A ranking-based clustering method is proposed to deal with these issues in the context of the Social Event Detection task. Ranking scores are used to select a small number of the most relevant clusters against which to compare and place a document. Additionally, instead of conventional cluster centroids, clusters are represented by cluster patches: hub-like sets of documents. Text, temporal, spatial and visual content information collected from the social event images is used to calculate similarity. Results show that these strategies strike a balance between the performance and the accuracy of the clustering solution.
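
The assignment step can be sketched as follows (an illustrative reading, not the paper's code: rank clusters with a cheap score, keep only the top few, and compare the document against each candidate's patch of hub documents; all data structures and names here are assumptions):

```python
import math

def cosine(a, b):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(a[k] * b.get(k, 0.0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def assign(doc, clusters, rank_score, top_n=2):
    """Place a document using ranking-based candidate selection."""
    # 1) rank clusters with a cheap score; keep only the most promising
    candidates = sorted(clusters, key=rank_score, reverse=True)[:top_n]
    # 2) compare the document against each candidate's patch (hub documents)
    #    instead of a single centroid
    best, best_sim = None, -1.0
    for c in candidates:
        sim = max(cosine(doc, hub) for hub in c["patch"])
        if sim > best_sim:
            best, best_sim = c, sim
    return best
```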


This paper outlines the approach taken by the Speech, Audio, Image and Video Technologies laboratory and the Applied Data Mining Research Group (SAIVT-ADMRG) in the 2014 MediaEval Social Event Detection (SED) task. We participated in the event-based clustering subtask (subtask 1) and focused on incorporating image features as an additional source of data to aid clustering. In particular, we developed a descriptor based on super-pixel segmentation that allows a low-dimensional feature, incorporating both colour and texture information, to be extracted and used within the popular bag-of-visual-words (BoVW) approach.
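
The BoVW step named above can be sketched generically (not the authors' descriptor: the per-super-pixel features and the codebook, which would normally come from k-means over training descriptors, are toy assumptions here):

```python
import numpy as np

def bovw_histogram(descriptors, codebook):
    """Quantize descriptors to their nearest visual word and histogram them.

    descriptors: (n, d) array, one colour+texture feature per super-pixel
    codebook:    (k, d) array of visual words (e.g. k-means centroids)
    returns:     (k,) L1-normalized bag-of-visual-words vector
    """
    # Euclidean distance from every descriptor to every codebook word
    dists = np.linalg.norm(descriptors[:, None, :] - codebook[None, :, :], axis=2)
    words = dists.argmin(axis=1)                     # nearest word per descriptor
    hist = np.bincount(words, minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```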


Initial attempts to obtain lattice-based signatures were closely related to reducing a vector modulo the fundamental parallelepiped of a secret basis (like GGH [9] or NTRUSign [12]). This approach leaked information about the secret, namely the shape of the parallelepiped, which has been exploited in practical attacks [24]. NTRUSign was an extremely efficient scheme, and thus there has been noticeable interest in developing countermeasures to the attacks, but with little success [6]. In [8], Gentry, Peikert and Vaikuntanathan proposed a randomized version of Babai's nearest-plane algorithm such that the distribution of a vector reduced modulo a secret parallelepiped depends only on the size of the basis used. Using this algorithm and generating large, close-to-uniform public keys, they obtained provably secure GGH-like lattice-based signatures. Recently, Stehlé and Steinfeld obtained a provably secure scheme very close to NTRUSign [26] (from a theoretical point of view). In this paper we present an alternative approach to sealing the leak of NTRUSign. Instead of modifying the lattices and algorithms used, we compute a classic leaky NTRUSign signature and hide it with Gaussian noise, using techniques present in Lyubashevsky's signatures. Our main contributions are thus a set of strong NTRUSign parameters, obtained by taking into account the latest known attacks against the scheme, and a statistical way to hide the leaky NTRU signature so that this particular instantiation of a CVP-based signature scheme becomes zero-knowledge and secure against forgeries, based on the worst-case hardness of the Õ(N^1.5)-Shortest Independent Vector Problem over NTRU lattices. Finally, we give a set of concrete parameters to gauge the efficiency of the obtained signature scheme.
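
The masking idea can be illustrated with a toy, one-dimensional continuous sketch (NOT the paper's scheme): a secret-dependent value is hidden under wide Gaussian noise, and Lyubashevsky-style rejection sampling makes the accepted output follow the centered Gaussian, independently of the secret. All parameters below are illustrative assumptions.

```python
import math
import random

def mask(c, sigma=100.0, M=3.0):
    """Sample z = c + y with y ~ N(0, sigma); accept with probability
    rho(z) / (M * rho_c(z)), the ratio of the centered Gaussian density to the
    c-shifted one at z. Accepted samples follow the centered Gaussian, so z
    leaks (essentially) nothing about the secret-dependent value c, provided
    |c| is small relative to sigma so the ratio stays below M.
    """
    while True:
        y = random.gauss(0.0, sigma)
        z = c + y
        # rho(z)/rho_c(z) = exp((y^2 - z^2) / (2 sigma^2))
        ratio = math.exp((y * y - z * z) / (2.0 * sigma * sigma))
        if random.random() < ratio / M:
            return z
```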


Understanding how families manage their finances is an important research agenda, given the recent economic climate of debt and uncertainty. To better understand the economics of domestic settings, it is important to study the ways money and financial issues are collaboratively handled within families. Using an ethnographic approach, we studied the everyday financial practices of fifteen middle-income families. Our preliminary results show a strong tendency to live frugally: people apply various creative mechanisms to minimize their expenses and save money, seemingly irrespective of their income. Based on these findings, we highlight some implications for designing technologies to support household financial practices.


Digital transformations are not contained within the digital domain but increasingly spill over into the physical world. In this chapter, we analyse some of the transformations under way in cities today as they move towards becoming smart cities. We offer a critique of smart cities and a way forward, in three parts. First, we explore the concept of Smart Citizens in terms of localities, the move towards a hyperlocal network, and the citizen's role in the creation and use of data, using the 'Smart London' plan drawn up by the Mayor of London to illustrate our discussion. Second, we turn to the civic innovations enabled by digital transformations and their potential impact on citizens and citizenship. Specifically, we are interested in the notion of social capital as an alternative, in-kind currency and as an indicator of value: can digital transformations give rise to 'civic capital', and how could such a concept help a local government, for instance, invite more representative residents and community champions to participate in community engagement for better urban planning? Third, we introduce a hybrid, location-based game under development by the design agency Preliminal Games in London, UK. This illustrative case critiques and highlights the current challenges to establishing a new economic model that bridges the digital/physical divide. The game provides a vehicle for exploring how established principles and strategies in game design, such as immersive storytelling and goal setting, can be employed to encourage players to think about the interconnections of their hybrid digital/physical environments in new ways.


We incorporated a new Riemannian fluid registration algorithm into a general MRI analysis method called tensor-based morphometry to map the heritability of brain morphology in MR images from 23 monozygotic and 23 dizygotic twin pairs. All 92 3D scans were fluidly registered to a common template. Voxelwise Jacobian determinants were computed from the deformation fields to assess local volumetric differences across subjects. Heritability maps were computed from the intraclass correlations and their significance was assessed using voxelwise permutation tests. Lobar volume heritability was also studied using the ACE genetic model. The performance of this Riemannian algorithm was compared to a more standard fluid registration algorithm: 3D maps from both registration techniques displayed similar heritability patterns throughout the brain. Power improvements were quantified by comparing the cumulative distribution functions of the p-values generated from both competing methods. The Riemannian algorithm outperformed the standard fluid registration.
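
The voxelwise Jacobian-determinant step can be sketched generically (a plain finite-difference computation on a unit-spaced grid, not the paper's Riemannian fluid algorithm; the displacement field in the test is synthetic):

```python
import numpy as np

def jacobian_determinant(u):
    """Voxelwise volume change from a deformation x -> x + u(x).

    u: (X, Y, Z, 3) displacement field
    returns: (X, Y, Z) map of det(I + grad u); values > 1 mean local expansion.
    """
    # rows of the Jacobian of u: derivatives of each component along x, y, z
    gx = np.stack(np.gradient(u[..., 0]), axis=-1)
    gy = np.stack(np.gradient(u[..., 1]), axis=-1)
    gz = np.stack(np.gradient(u[..., 2]), axis=-1)
    J = np.stack([gx, gy, gz], axis=-2)   # (X, Y, Z, 3, 3) Jacobian of u
    J = J + np.eye(3)                      # Jacobian of the full map x + u(x)
    return np.linalg.det(J)
```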


The secretive 2011 Anti-Counterfeiting Trade Agreement – known in short by the catchy acronym ACTA – is a controversial trade pact designed to provide for stronger enforcement of intellectual property rights. The preamble to the treaty reads like pulp fiction – it raises moral panics about piracy, counterfeiting, organised crime, and border security. The agreement contains provisions on civil remedies and criminal offences; copyright law and trademark law; the regulation of the digital environment; and border measures. Memorably, Susan Sell called the international treaty a TRIPS Double-Plus Agreement, because its obligations far exceed those of the World Trade Organization's TRIPS Agreement 1994, and TRIPS-Plus Agreements, such as the Australia-United States Free Trade Agreement 2004. ACTA lacks the language of other international intellectual property agreements, which emphasise the need to balance the protection of intellectual property owners with the wider public interest in access to medicines, human development, and transfer of knowledge and technology. In Australia, there was much controversy both about the form and the substance of ACTA. While the Department of Foreign Affairs and Trade was a partisan supporter of the agreement, a wide range of stakeholders were openly critical. After holding hearings and taking note of the position of the European Parliament and the controversy in the United States, the Joint Standing Committee on Treaties in the Australian Parliament recommended the deferral of ratification of ACTA. This was striking as representatives of all the main parties agreed on the recommendation. The committee was concerned about the lack of transparency, due process, public participation, and substantive analysis of the treaty. 
There were also reservations about the ambiguity of the treaty text and its potential implications for the digital economy, innovation and competition, plain packaging of tobacco products, and access to essential medicines. The treaty has provoked much soul-searching as to whether the Trick or Treaty reforms of the international treaty-making process in Australia have been compromised or undermined. Although ACTA stalled in the Australian Parliament, the debate over it is yet to conclude. There have been concerns in Australia and elsewhere that ACTA will be revived as a 'zombie agreement'. Indeed, in March 2013, the Canadian government introduced a bill to ensure compliance with ACTA. Will it also be resurrected in Australia? Has it already been revived? There are three possibilities. First, the Australian government passed enhanced remedies with respect to piracy, counterfeiting and border measures in a separate piece of legislation, the Intellectual Property Laws Amendment (Raising the Bar) Act 2012 (Cth). Second, the Department of Foreign Affairs and Trade remains supportive of ACTA; it is possible, after further analysis, that the next Australian Parliament, to be elected in September 2013, will ratify the treaty. Third, Australia is involved in the Trans-Pacific Partnership negotiations. The government has argued that ACTA should be a template for the Intellectual Property Chapter in the Trans-Pacific Partnership, and the United States Trade Representative would prefer a regime even stronger than ACTA. This chapter provides a portrait of the Australian debate over ACTA, as the account of an interested participant in the policy proceedings. The chapter first considers the deliberations and recommendations of the Joint Standing Committee on Treaties on ACTA. Second, it examines the concern that ACTA failed to provide appropriate safeguards with respect to civil liberties, human rights, consumer protection and privacy laws.
Third, it considers the lack of balance in the treaty's copyright measures: the definition of piracy is overbroad; the suite of civil remedies, criminal offences and border measures is excessive; and there is a lack of suitable protection for copyright exceptions, limitations and remedies. Fourth, it examines the worry that the provisions on trademark law, intermediary liability and counterfeiting could have an adverse impact upon consumer interests, competition policy and innovation in the digital economy. Fifth, it surveys the significant debate about the impact of ACTA on pharmaceutical drugs, access to essential medicines, and health care. Sixth, it addresses the concern over lobbying by tobacco industries for ACTA, particularly given Australia's leadership on tobacco control and the plain packaging of tobacco products. Seventh, it considers concerns about the operation of border measures in ACTA. Eighth, it discusses the Joint Standing Committee on Treaties' concern about the jurisdiction of the ACTA Committee and the treaty's protean nature. Finally, the chapter raises fundamental issues about the relationship between the executive and the Australian Parliament with respect to treaty-making, and the need to reconsider the efficacy of the Trick or Treaty reforms passed by the Australian Parliament in the 1990s.


With the smartphone revolution, consumer-focused mobile medical applications (apps) have flooded the market without restriction. We searched the market for commercially available apps on all mobile platforms that could provide automated risk analysis of the most serious skin cancer, melanoma. We tested 5 relevant apps against 15 images of previously excised skin lesions and compared the apps' risk grades to the known histopathologic diagnoses of the lesions. Two of the apps did not identify any of the melanomas. The remaining 3 apps achieved 80% sensitivity for melanoma risk identification; specificities for the 5 apps ranged from 20% to 100%. Each app provided its own grading and recommendation scale and included a disclaimer recommending regular dermatologist evaluation regardless of the analysis outcome. The results indicate that autonomous lesion analysis is not yet ready for use as a triage tool. More concerning is the lack of restrictions and regulations for these applications.
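
The sensitivity/specificity arithmetic behind figures like these is a quick sketch (the counts in the example are illustrative, not the study's raw data):

```python
def sensitivity(tp, fn):
    """True-positive rate: fraction of melanomas correctly flagged as high risk."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: fraction of benign lesions correctly flagged as low risk."""
    return tn / (tn + fp)

# e.g. flagging 4 of 5 melanomas would yield the 80% sensitivity quoted above
print(sensitivity(tp=4, fn=1))   # 0.8
```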


Aerobic exercise training performed at the intensity eliciting maximal fat oxidation (Fatmax) has been shown to improve the metabolic profile of obese patients. However, limited information is available on the reproducibility of Fatmax and related physiological measures. The aim of this study was to assess the intra-individual variability of: a) Fatmax measurements determined using three different data-analysis approaches and b) fat and carbohydrate oxidation rates at rest and at each stage of an individualized graded test. Fifteen healthy males [body mass index 23.1±0.6 kg/m², maximal oxygen consumption (VO2max) 52.0±2.0 ml/kg/min] completed a maximal test and two identical submaximal incremental tests on a cycle ergometer (30-min rest followed by 5-min stages with increments of 7.5% of the maximal power output). Fat and carbohydrate oxidation rates were determined using indirect calorimetry. Fatmax was determined with three approaches: the sine model (SIN), measured values (MV) and a 3rd-order polynomial curve (P3). Intra-individual coefficients of variation (CVs) and limits of agreement were calculated. The CV for Fatmax determined with SIN was 16.4% and tended to be lower than with P3 and MV (18.6% and 20.8%, respectively). Limits of agreement for Fatmax were −2±27% of VO2max with SIN, −4±32 with P3 and −4±28 with MV. CVs of oxygen uptake, carbon dioxide production and respiratory exchange ratio were <10% at rest and <5% during exercise. Conversely, CVs of fat oxidation rates (20% at rest and 24–49% during exercise) and carbohydrate oxidation rates (33.5% at rest, 8.5–12.9% during exercise) were higher. The intra-individual variability of Fatmax and fat oxidation rates was high (CV>15%), regardless of the data-analysis approach employed. Further research on the determinants of the variability of Fatmax and fat oxidation rates is required.
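
Fat and carbohydrate oxidation rates from indirect calorimetry are commonly computed with the Frayn (1983) stoichiometric equations, sketched below together with the CV statistic; the abstract does not state which exact equations were used, so treat the coefficients as an assumption:

```python
def fat_oxidation(vo2, vco2):
    """Fat oxidation in g/min from VO2 and VCO2 in L/min (protein neglected)."""
    return 1.67 * vo2 - 1.67 * vco2

def cho_oxidation(vo2, vco2):
    """Carbohydrate oxidation in g/min (protein neglected)."""
    return 4.55 * vco2 - 3.21 * vo2

def cv_percent(values):
    """Intra-individual coefficient of variation: 100 * sample SD / mean."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return 100.0 * sd / mean
```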


A major challenge in human genetics is to devise a systematic strategy to integrate disease-associated variants with diverse genomic and biological data sets to provide insight into disease pathogenesis and guide drug discovery for complex traits such as rheumatoid arthritis (RA) [1]. Here we performed a genome-wide association study meta-analysis in a total of >100,000 subjects of European and Asian ancestries (29,880 RA cases and 73,758 controls), by evaluating ~10 million single-nucleotide polymorphisms. We discovered 42 novel RA risk loci at a genome-wide level of significance, bringing the total to 101 [2-4]. We devised an in silico pipeline using established bioinformatics methods based on functional annotation [5], cis-acting expression quantitative trait loci [6] and pathway analyses [7-9], as well as novel methods based on genetic overlap with human primary immunodeficiency, haematological cancer somatic mutations and knockout mouse phenotypes, to identify 98 biological candidate genes at these 101 risk loci. We demonstrate that these genes are the targets of approved therapies for RA, and further suggest that drugs approved for other indications may be repurposed for the treatment of RA. Together, this comprehensive genetic study sheds light on fundamental genes, pathways and cell types that contribute to RA pathogenesis, and provides empirical evidence that the genetics of RA can provide important information for drug discovery.


Bone mineral density (BMD) is the most widely used predictor of fracture risk. We performed the largest meta-analysis to date on lumbar spine and femoral neck BMD, including 17 genome-wide association studies and 32,961 individuals of European and East Asian ancestry. We tested the top BMD-associated markers for replication in 50,933 independent subjects and for association with risk of low-trauma fracture in 31,016 individuals with a history of fracture (cases) and 102,444 controls. We identified 56 loci (32 new) associated with BMD at genome-wide significance (P < 5 × 10⁻⁸). Several of these loci cluster within the RANK-RANKL-OPG, mesenchymal stem cell differentiation, endochondral ossification and Wnt signaling pathways. However, we also discovered loci localized to genes not previously known to have a role in bone biology. Fourteen BMD-associated loci were also associated with fracture risk (P < 5 × 10⁻⁴, Bonferroni corrected), of which six reached P < 5 × 10⁻⁸, including 18p11.21 (FAM210A), 7q21.3 (SLC25A13), 11q13.2 (LRP5), 4q22.1 (MEPE), 2p16.2 (SPTBN1) and 10q21.1 (DKK1). These findings shed light on the genetic architecture and pathophysiological mechanisms underlying BMD variation and fracture susceptibility.
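
The significance thresholds quoted above follow standard Bonferroni logic; a quick sketch of the arithmetic (the ~1 million effective independent tests is the usual genome-wide convention, not a figure taken from this study):

```python
def bonferroni(alpha, n_tests):
    """Per-test threshold keeping the family-wise error rate at alpha."""
    return alpha / n_tests

# genome-wide significance: alpha = 0.05 over ~1 million independent variants
genome_wide = bonferroni(0.05, 1_000_000)   # ~5e-8, i.e. the P < 5 x 10^-8 cutoff
```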


Ambient ultrafine particle number concentrations (PNC) have inhomogeneous spatio-temporal distributions and depend on a number of different urban factors, including background conditions and distant sources. This paper quantitatively compares exposure to ambient ultrafine particles at urban schools in two cities in developed countries, with high insolation climatic conditions, namely Brisbane (Australia) and Barcelona (Spain). The analysis used comprehensive indoor and outdoor air quality measurements at 25 schools in Brisbane and 39 schools in Barcelona. PNC modes were analysed with respect to ambient temperature, land use and urban characteristics, combined with the measured elemental carbon concentrations, NOx (Brisbane) and NO2 (Barcelona). The trends and modes of the quantified weekday average daily cycles of ambient PNC exhibited significant differences between the two cities. PNC increases were observed during traffic rush hours in both cases. However, the mid-day peak was dominant in Brisbane schools and had the highest contribution to total PNC for both indoors and outdoors. In Barcelona, the contribution from traffic was highest for ambient PNC, while the mid-day peak had a slightly higher contribution for indoor concentrations. Analysis of the relationships between PNC and land use characteristics in Barcelona schools showed a moderate correlation with the percentage of road network area and an anti-correlation with the percentage of green area. No statistically significant correlations were found for Brisbane. Overall, despite many similarities between the two cities, school-based exposure patterns were different. The main source of ambient PNC at schools was shown to be traffic in Barcelona and mid-day new particle formation in Brisbane. The mid-day PNC peak in Brisbane could have been driven by the combined effect of background and meteorological conditions, as well as other local/distant sources. 
The results have implications for urban development, especially in terms of air quality mitigation and management at schools.


We propose a keyless and lightweight message-transformation scheme based on combinatorial design theory for the confidentiality of a message transmitted in multiple parts through a network with multiple independent paths, or of data stored in multiple parts across a set of independent storage services, such as cloud providers. Our combinatorial scheme disperses a message into v output parts so that (k−1) or fewer parts reveal no information about any message part, and the message can only be recovered by a party who possesses all v output parts. The scheme generates an XOR transformation structure to disperse the message into the v output parts; inversion is done by applying the same XOR transformation structure to the output parts. The structure is generated using generalized quadrangles from design theory, which represent symmetric point-line incidence relations in a projective plane. We randomize our solution by adding a random salt value and dispersing it together with the message. We show that a passive adversary capable of accessing (k−1) communication links or storage services gains no advantage, so the scheme is indistinguishable under adaptive chosen-ciphertext attack (IND-CCA2).
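
The paper derives its XOR structure from generalized quadrangles; as a much simpler illustration of the same all-or-nothing principle, a message can be dispersed into v parts such that any v−1 parts are uniformly random and reveal nothing, while the XOR of all v parts recovers the message (this sketch is not the paper's construction):

```python
import os

def disperse(message: bytes, v: int) -> list[bytes]:
    """Split message into v parts; any v-1 parts are uniformly random."""
    parts = [os.urandom(len(message)) for _ in range(v - 1)]  # random pads
    last = message
    for p in parts:                       # last = message XOR all pads
        last = bytes(a ^ b for a, b in zip(last, p))
    return parts + [last]

def recover(parts: list[bytes]) -> bytes:
    """XOR all parts together to reconstruct the original message."""
    out = parts[0]
    for p in parts[1:]:
        out = bytes(a ^ b for a, b in zip(out, p))
    return out
```

The generalized-quadrangle structure in the paper organizes such XOR relations far more economically than one fresh random pad per part, but the recovery condition is the same: only the holder of all v output parts can invert the transformation.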