902 results for GENERIC SIMPLICITY


Relevance: 10.00%

Abstract:

We present efficient protocols for private set disjointness tests. We start from an intuition for our protocols that applies Sylvester matrices. Unfortunately, this simple construction is insecure, as it reveals information about the cardinality of the intersection; more specifically, it discloses its lower bound. Using Lagrange interpolation, we provide a protocol for the honest-but-curious case that reveals no additional information. Finally, we describe a protocol that is secure against malicious adversaries; it applies a verification test to detect misbehaving participants. Both protocols require O(1) rounds of communication. Our protocols are more efficient than previous protocols in terms of communication and computation overhead. Unlike previous protocols, whose security relies on computational assumptions, our protocols provide information-theoretic security. To our knowledge, our protocols are the first designed without generic secure function evaluation. More importantly, they are the most efficient protocols for private disjointness tests in the malicious adversary case.
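The Sylvester-matrix intuition can be made concrete without the privacy layer: encode each set as a polynomial whose roots are its elements; the two sets share an element exactly when the polynomials share a root, that is, when their resultant (the determinant of their Sylvester matrix) is zero. A minimal sketch of this algebraic core, assuming set elements are integers reduced modulo a prime; this is not the paper's protocol, which adds the cryptographic layer on top:

```python
# Algebraic core of a disjointness test via the Sylvester matrix: each
# set becomes the polynomial prod (x - a) over GF(P); the sets share an
# element exactly when their resultant (Sylvester determinant) is zero.
# Privacy is NOT addressed here; this only illustrates the algebra.
P = 2**31 - 1  # arithmetic is over the field GF(P)

def set_to_poly(s):
    """Coefficients, highest degree first, of prod_{a in s} (x - a) mod P."""
    coeffs = [1]
    for a in s:
        # multiply the current polynomial by (x - a)
        coeffs = [(hi - a * lo) % P for hi, lo in zip(coeffs + [0], [0] + coeffs)]
    return coeffs

def sylvester(f, g):
    """Sylvester matrix of polynomials f (degree m) and g (degree n)."""
    m, n = len(f) - 1, len(g) - 1
    size = m + n
    rows = [[0] * i + f + [0] * (size - m - 1 - i) for i in range(n)]
    rows += [[0] * i + g + [0] * (size - n - 1 - i) for i in range(m)]
    return rows

def det_mod(mat, p=P):
    """Determinant mod p by Gaussian elimination with pivoting."""
    mat = [row[:] for row in mat]
    n, det = len(mat), 1
    for col in range(n):
        piv = next((r for r in range(col, n) if mat[r][col]), None)
        if piv is None:
            return 0
        if piv != col:
            mat[col], mat[piv] = mat[piv], mat[col]
            det = -det % p
        det = det * mat[col][col] % p
        inv = pow(mat[col][col], -1, p)
        for r in range(col + 1, n):
            f = mat[r][col] * inv % p
            mat[r] = [(a - f * b) % p for a, b in zip(mat[r], mat[col])]
    return det

def disjoint(set_a, set_b):
    return det_mod(sylvester(set_to_poly(set_a), set_to_poly(set_b))) != 0
```

For example, `disjoint({1, 2, 3}, {4, 5})` is True, while `disjoint({1, 2}, {2, 9})` is False, since the shared element 2 is a common root of both polynomials.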

Relevance: 10.00%

Abstract:

The Business Process Management domain has evolved at a dramatic pace over the past two decades, and the notion of the business process has become a ubiquitous part of the modern business enterprise. Most organizations now view their operations in terms of business processes and manage these business processes in the same way as other corporate assets. In recent years, an increasingly broad range of generic technology has become available for automating business processes. This is part of a growing trend in the software engineering field over the past 40 years, in which aspects of functionality that are potentially reusable on a widespread basis have coalesced into generic software components. Figure 2.1 illustrates this trend and shows how software systems have evolved from the monolithic applications of the 1960s, developed in their entirety often by a single development team, to today's offerings that are based on the integration of a range of generic technologies, with only a small component of the application actually being developed from scratch. In the 1990s, generic functionality for the automation of business processes first became commercially available in the form of workflow technology and subsequently evolved into the broader field of business process management systems (BPMS). This technology removed the need to develop process support within applications from scratch and provided a variety of off-the-shelf options on which these requirements could be based. The demand for this technology was significant, and it is estimated that by 2000 there were well over 200 distinct workflow offerings in the market, each with a distinct conceptual foundation.
Anticipating the difficulties that would be experienced by organizations seeking to utilize and integrate distinct workflow offerings, the Workflow Management Coalition (WfMC), an industry group formed to advance technology in this area, proposed a standard reference model for workflow technology with an express desire to seek a common platform for achieving workflow interoperation.

Relevance: 10.00%

Abstract:

Although there has been substantial research in the occupational health and safety sector over the past forty years, it has generally focused on statistical analyses of data related to costs and/or fatalities and injuries. There is a lack of mathematical modelling of the interactions between workers and the resulting safety dynamics of the workplace, and little work investigating the potential impact of different safety intervention programs prior to their implementation. In this article, we present a fundamental, differential-equation-based model of workplace safety that treats worker safety habits similarly to an infectious disease in an epidemic model. Analytical results for the model, derived via phase-plane and stability analysis, are discussed. The model is coupled with a model of a generic safety strategy aimed at minimising unsafe work habits, to produce an optimal control problem. The optimal control problem is solved using the forward-backward sweep numerical scheme implemented in Matlab.
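The forward-backward sweep mentioned above can be sketched for a toy one-state version of such a model. Here U(t) is the fraction of workers with unsafe habits and u(t) is intervention effort; the specific dynamics, cost weights and parameter values are illustrative assumptions, not the article's model:

```python
import numpy as np

# Toy "safety epidemic" with optimal control, solved by the forward-
# backward sweep.  U(t) = fraction of workers with unsafe habits:
#     dU/dt = beta*U*(1 - U) - (mu + u(t))*U,   U(0) = 0.3
# with control u(t) in [0, u_max].  We minimise
#     J = integral_0^T ( A*U + 0.5*u^2 ) dt.
# Pontryagin's principle gives the adjoint ODE below and the optimality
# condition u* = clip(lambda*U, 0, u_max).  All parameters are
# illustrative assumptions.
beta, mu, A, u_max = 0.5, 0.2, 2.0, 1.0
T, N = 10.0, 2000
dt = T / N

def sweep(max_iter=500, tol=1e-4):
    u = np.zeros(N + 1)          # initial control guess
    U = np.empty(N + 1)
    lam = np.empty(N + 1)
    for _ in range(max_iter):
        # forward pass: state equation under the current control
        U[0] = 0.3
        for k in range(N):
            U[k + 1] = U[k] + dt * (beta*U[k]*(1 - U[k]) - (mu + u[k])*U[k])
        # backward pass: adjoint equation, transversality lam(T) = 0
        lam[N] = 0.0
        for k in range(N, 0, -1):
            dlam = -(A + lam[k] * (beta*(1 - 2*U[k]) - mu - u[k]))
            lam[k - 1] = lam[k] - dt * dlam
        # control update from the optimality condition, with relaxation
        u_new = np.clip(lam * U, 0.0, u_max)
        if np.max(np.abs(u_new - u)) < tol:
            return U, u_new, True
        u = 0.5 * (u + u_new)
    return U, u, False
```

The sweep alternates the two passes until the control stops changing; averaging the old and new controls is the usual stabiliser for this scheme.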

Relevance: 10.00%

Abstract:

We provide the first molecular phylogeny of the clerid lineage (Coleoptera: Cleridae, Thanerocleridae) within the superfamily Cleroidea to examine the two most recently proposed hypotheses of higher-level classification. Phylogenetic relationships of checkered beetles were inferred from ~5,000 nt of both nuclear and mitochondrial rDNA (28S, 16S, and 12S) and the mitochondrial protein-coding gene COI. A worldwide sample of ~70 genera, representing almost a quarter of the generic diversity of the clerid lineage, was included, and phylogenies were reconstructed using Bayesian and maximum likelihood approaches. Results support the monophyly of many proposed subfamilies but were not entirely congruent with either current classification system. The subfamilial relationships within the Cleridae are resolved with support for three main lineages. Tillinae are supported as the sister group to all other subfamilies within the Cleridae, whereas Thaneroclerinae, Korynetinae and a new subfamily formally described here, Epiclininae subf. n., form a sister group to Clerinae + Hydnocerinae.

Relevance: 10.00%

Abstract:

Security models for two-party authenticated key exchange (AKE) protocols have developed over time to provide security even when the adversary learns certain secret keys. In this work, we advance the modelling of AKE protocols by considering more granular, continuous leakage of long-term secrets of protocol participants: the adversary can adaptively request arbitrary leakage of long-term secrets even after the test session is activated, with limits on the amount of leakage per query but no bounds on the total leakage. We present a security model supporting continuous leakage even when the adversary learns certain ephemeral secrets or session keys, and give a generic construction of a two-pass leakage-resilient key exchange protocol that is secure in the model; our protocol achieves continuous, after-the-fact leakage resilience with not much more cost than a previous protocol with only bounded, non-after-the-fact leakage.

Relevance: 10.00%

Abstract:

A key derivation function (KDF) is a function that transforms secret, non-uniformly random source material, together with some public strings, into one or more cryptographic keys. These cryptographic keys are used with a cryptographic algorithm for protecting electronic data during both transmission over insecure channels and storage. In this thesis, we propose a new method for constructing a generic stream-cipher-based key derivation function. We show that our proposed key derivation function based on stream ciphers is secure if the underlying stream cipher is secure. We simulate instances of this stream-cipher-based key derivation function using three eSTREAM finalists: Trivium, Sosemanuk and Rabbit. The simulation results show that these stream-cipher-based key derivation functions offer efficiency advantages over the more commonly used key derivation functions based on block ciphers and hash functions.
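A stream-cipher-based KDF can be pictured as an extract-then-expand shape in which the expansion phase simply reads the cipher's keystream. The sketch below is a stand-in, not the thesis's exact scheme: SHAKE-256 plays the role of the keystream generator because the Python standard library ships no stream cipher; a real instantiation would key Trivium, Sosemanuk or Rabbit instead:

```python
import hashlib

# Generic stream-cipher-based KDF, sketched as extract-then-expand.
# ASSUMPTION: SHAKE-256 stands in for the stream cipher's keystream
# generator; a real instantiation would key Trivium, Sosemanuk or
# Rabbit with `key` and read `out_len` keystream bytes.

def kdf(source_material: bytes, context: bytes, out_len: int) -> bytes:
    # extract: condense the non-uniform secret and the public context
    # string into a fixed-length cipher key (length-prefix the context
    # so distinct inputs cannot collide by concatenation)
    key = hashlib.sha256(
        len(context).to_bytes(4, "big") + context + source_material
    ).digest()
    # expand: the derived key is just the first out_len keystream bytes
    return hashlib.shake_256(key).digest(out_len)
```

Distinct context strings yield independent-looking keys from one secret, e.g. `kdf(secret, b"encrypt", 32)` and `kdf(secret, b"mac", 32)`.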

Relevance: 10.00%

Abstract:

This paper gives an overview of an ongoing project endeavouring to advance theory-based production and project management, and the rationale for this approach is briefly justified. The status of the theoretical foundation of production management, project management and allied disciplines is discussed, with emphasis on the metaphysical grounding of theories, as well as the nature of the heuristic solution method commonly used in these disciplines. Then, ongoing work related to different aspects of production and project management is reviewed from both theoretical and practical orientations. Next, agile project management of information systems is explored with a view to its reuse in generic project management. In production management, the consequences and implementation of a new, wider theoretical basis are analyzed. The theoretical implications and negative symptoms of the peculiarities of the construction industry for supply chains and supply chain management in construction are observed. Theoretical paths for improvement of inter-organisational relationships in construction, which are fundamental for improvement of construction supply chains, are described. To conclude, the observations made in this paper vis-à-vis production, project and supply chain management are related back to the theoretical basis of the paper, and directions for theory development and future research are given and discussed.

Relevance: 10.00%

Abstract:

More apparel is being created now than ever before in history. The unsustainable production of materials, and the clothing and textile waste sent to landfill each year (an estimated 500,000 tonnes of clothing per year in the UK; Gray, 2012), are significant issues inspiring the practice of Australian fashion designers Carla van Lunn and Carla Binotto. While the contemporary fashion industry is built upon a production and consumption model that is younger than the industrial revolution, the traditions of costume, craft, and bodily adornment are ancient practices. Binotto and van Lunn believe that the potential for sustainable fashion practice lies outside the current industrial manufacturing model. This case study will discuss their fashion label, Maison Briz Vegas, and examine how recycling and traditional craft practices can be used to address the problem of clothing waste and offer an alternative idea of value in fashion and materials, addressing the indicative conference theme, Craft as Sustainability Activism in Practice. "Maison Briz Vegas", a play on the notion of French luxury and the designers' new-world, sub-tropical home town of Brisbane, is an experimental, craft-based fashion label that uses second-hand cotton T-shirts and wool sweaters as primary materials to create designer fashion. The first collection, titled "The Wasteland", was conceived and created in Paris in 2011, where designer Carla van Lunn had been living and working for several years. The collection was inspired by the precariousness of the global economy and concerns about climate change. The mountains of discarded clothing found at flea markets provided a textile resource from which van Lunn created a recycled, hand-crafted fashion collection with an activist message, which was shown to buyers and press during Paris Fashion Week. The label has since become a collaboration with fellow Australian designer Carla Binotto.
The craft processes employed in Maison Briz Vegas' up-cycled fashion collections include original hand block-printing, hand embroidery, quilting and patchwork. Taking an artisanal, slow approach, the designers work to create a hand-touched, imperfect style in a fashion market flooded with digital printing and fast, mass-produced garments. The recycling extends to garment fastenings and embellishments, with discarded jar lids and bottle tops being used as buttons and within embroidery. This process transforms the material and aesthetic value of cheap, generic second-hand clothing and household waste. Maison Briz Vegas demonstrates the potential for craft and design to be an interface for environmental activism within the world of fashion. By presenting garments that are both high-design and thoughtfully recycled in a significant fashion context, such as Paris Fashion Week, Maison Briz Vegas has been able to engage a high-profile luxury fashion audience which has not traditionally considered sustainable or eco practices relevant or desirable in themselves. The designers are studying how to apply their production model at a greater scale in order to fill commercial orders and reach a wider audience whilst maintaining the element of bespoke, limited-edition, slow hand-craft within their work.

Relevance: 10.00%

Abstract:

There has been considerable recent interest in the genetic, biological and epidemiological basis of mammographic density (MD), and the search for causative links between MD and breast cancer (BC) risk. This report critically reviews the current literature on MD and summarizes the current evidence for its association with BC. The keywords 'mammographic dens*', 'dense mammary tissue' and 'percent dens*' were used to search the existing literature in English on PubMed and Medline. All reports were critically analyzed. The data were assigned to one of the following aspects of MD: general association with BC, its relationship with the breast hormonal milieu, the cellular basis of MD, the genetic variations of MD, and its significance in the clinical setting. MD, adjusted for age and BMI, is associated with increased risk of BC diagnosis, advanced tumour stage at diagnosis and increased risk of both local recurrence and second primary cancers. The MD measures that predict BC risk have high heritability, and to date several genetic markers associated with BC risk have also been found to be associated with these MD risk predictors. Change in MD could be a predictor of the extent of chemoprevention with tamoxifen. Although the biological and genetic pathways that determine and perhaps modulate MD remain largely unresolved, significant inroads are being made into the understanding of MD, which may lead to benefits in clinical screening, assessment and treatment strategies. This review provides a timely update on the current understanding of MD's association with BC risk.

Relevance: 10.00%

Abstract:

Tumour heterogeneity is a key characteristic of cancer and has significant implications for tumour response to chemotherapy as well as for patient prognosis and potential relapse. It is increasingly accepted that tumours are clonal in origin, suggesting that a tumour arises from a deregulated or mutated cell. Cancer stem cells (CSC) possess these capabilities and, with appropriate intracellular triggers and/or signalling from extracellular environments, can purportedly differentiate to initiate tumour formation. Additionally, through epithelial-mesenchymal plasticity (EMP), where cells gain and maintain characteristics of both epithelial and mesenchymal cell types, epithelial-derived tumour cells have been shown to de-differentiate to acquire cancer stem attributes, which also impart chemotherapy resistance. This new paradigm places EMP centrally in the process of tumour progression and metastasis, as well as in modulating drug response to current forms of chemotherapy. Furthermore, EMP and CSCs have been identified in cancers arising from different tissue types, making EMP a possible generic therapeutic target in cancer biology. Using breast cancer (BrCa) as an example, we summarise here the current understanding of CSCs, the role of EMP in cancer biology, especially in CSCs and different molecular subtypes, and the implications this has for current and future cancer treatment strategies.

Relevance: 10.00%

Abstract:

Road infrastructure is one of the most expensive and extensive infrastructure assets of the built environment globally. It also significantly impacts the natural environment during the different phases of its life, e.g. construction, use, maintenance and end-of-life. The growing emphasis on sustainable development to meet the needs of future generations requires mitigation of the environmental impacts of road infrastructure during all phases of life, e.g. construction, operation and end-of-life disposal (as required). Life-cycle analysis (LCA), a method for quantifying all stages of life, has recently been studied to explore all the environmental components of road projects, owing to the limitations of generic environmental assessments. LCA ensures collection and assessment of the inputs and outputs relating to any potential environmental factor of a system throughout its life. However, the absence of a defined system boundary covering all potential environmental components restricts the findings of current LCA studies. A review of the relevant published LCA studies has identified that environmental components such as rolling resistance of pavement, the effect of solar radiation on pavement (albedo), traffic congestion during construction, and roadway lighting and signals are not considered by most studies. These components have potentially higher weightings for environmental damage than several commonly considered components such as materials, transportation and equipment. This paper presents the findings of the literature review and suggests a system boundary model for LCA studies of road infrastructure projects covering the potential environmental components.
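As a data-structure sketch, such a wider system boundary can be written down as life-cycle phases mapped to components, with the often-omitted components listed alongside the usual ones. All names and impact figures below are placeholders of our own (arbitrary units), included only to show how per-phase and whole-of-life totals would be aggregated; they are not values from the paper:

```python
# Hypothetical system-boundary model for a road LCA.  Impact figures
# are illustrative placeholders (arbitrary units), not measured data.
BOUNDARY = {
    "construction": {
        "materials": 40.0,
        "transportation": 10.0,
        "equipment": 8.0,
        "traffic_congestion_during_construction": 12.0,  # often omitted
    },
    "use": {
        "rolling_resistance": 30.0,   # often omitted
        "albedo_effect": 6.0,         # often omitted
        "lighting_and_signals": 9.0,  # often omitted
        "maintenance": 15.0,
    },
    "end_of_life": {
        "demolition_and_disposal": 5.0,
        "recycling_credit": -3.0,     # credit for recovered material
    },
}

def phase_totals(boundary):
    """Aggregate component impacts within each life-cycle phase."""
    return {phase: sum(parts.values()) for phase, parts in boundary.items()}

def life_cycle_total(boundary):
    """Whole-of-life impact across every phase in the boundary."""
    return sum(phase_totals(boundary).values())
```

The point of the structure is that widening the boundary is just adding entries; any component left out of the dictionary is invisibly excluded from the total, which is exactly the criticism the paper makes of narrow boundaries.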

Relevance: 10.00%

Abstract:

Background Quality of life (QOL) measures are an important patient-relevant outcome measure for clinical studies. Currently there is no fully validated cough-specific QOL measure for paediatrics. The objective of this study was to validate a cough-specific QOL questionnaire for paediatric use. Method 43 children (28 males, 15 females; median age 29 months, IQR 20–41 months) newly referred for chronic cough participated. One parent of each child completed the 27-item Parent Cough-Specific QOL questionnaire (PC-QOL), and the generic child (Pediatric QOL Inventory 4.0 (PedsQL)) and parent QOL questionnaires (SF-12) and two cough-related measures (visual analogue score and verbal category descriptive score) on two occasions separated by 2–3 weeks. Cough counts were also objectively measured on both occasions. Results Internal consistency for both the domains and total PC-QOL at both test times was excellent (Cronbach alpha range 0.70–0.97). Evidence for repeatability and criterion validity was established, with significant correlations over time and significant relationships with the cough measures. The PC-QOL was sensitive to change across the test times and these changes were significantly related to changes in cough measures (PC-QOL with: verbal category descriptive score, rs=−0.37, p=0.016; visual analogue score, rs=−0.47, p=0.003). Significant correlations of the difference scores for the social domain of the PC-QOL and the domain and total scores of the PedsQL were also noted (rs=0.46, p=0.034). Conclusion The PC-QOL is a reliable and valid outcome measure that assesses QOL related to childhood cough at a given time point and measures changes in cough-specific QOL over time.
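The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch using a made-up response matrix, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()   # sample variance per item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)

# Made-up 5-respondent, 4-item response matrix, purely to exercise
# the formula; it is NOT data from the PC-QOL study.
responses = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 5, 4],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
], dtype=float)
alpha = cronbach_alpha(responses)
```

For this toy matrix the items rank respondents consistently, so alpha comes out high (about 0.94), in the same "excellent" band the abstract reports for the PC-QOL.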

Relevance: 10.00%

Abstract:

Cough associated with exertion is often used as a surrogate marker of asthma. However, to date there are no studies that have objectively measured cough in association with exercise in children. Our primary aim was to examine whether children with a pre-existing cough have an increase in cough frequency during and after exercise. We hypothesized that children with any coughing illness would have an increase in cough frequency post-exercise regardless of the presence of exercise-induced bronchoconstriction (EIB) or atopy. In addition, we hypothesized that fractional exhaled nitric oxide (FeNO) levels decrease post-exercise regardless of the presence of EIB or atopy. Children with chronic cough and a control group without cough undertook an exercise challenge, FeNO measurements and a skin prick test, and wore a 24-h voice recorder to objectively measure cough frequency. The association between recorded cough frequency, exercise, atopy, and presence of EIB was tested. We also determined whether the change in FeNO post-exercise was related to atopy or EIB. Of the 50 children recruited (35 with cough, 15 controls), 7 had EIB. Children with cough had a significant increase in cough counts post-exercise (median 7.0, interquartile range (IQR) 0.5–24.5) compared to controls (median 2.0, IQR 0–5.0; p = 0.028). Presence of atopy or EIB did not influence cough frequency. FeNO level was significantly lower post-exercise in both groups, but the change was not influenced by atopy or EIB. Cough post-exertion is likely a generic response in children with a current cough. FeNO level decreases post-exercise irrespective of the presence of atopy or EIB. A larger study is necessary to confirm or refute our findings.

Relevance: 10.00%

Abstract:

Motivation: Task analysis for designing modern collaborative work needs a more fine-grained approach, especially in a complex task domain, like collaborative scientific authoring, where there is a single overall goal that can be accomplished only by collaboration between multiple roles, each requiring its own expertise. We analyzed and reconsidered roles, activities, and objects for design in complex collaboration contexts. Our main focus is on a generic approach to designing for multiple roles and subtasks in a domain with a shared overall goal, which requires a detailed approach. Collaborative authoring is our current example. This research is incremental: an existing task analysis approach (GTA) is reconsidered by applying it to a case of complex collaboration. Our analysis shows that designing for collaboration indeed requires a refined approach to task modeling: GTA will in future need to consider tasks at the lowest level that can be delegated or mandated. These tasks need to be analyzed and redesigned in more detail, along with the relevant task object.

Relevance: 10.00%

Abstract:

Conceptual modeling is an important tool for understanding and revealing weaknesses of business processes. Yet, current practice in reengineering projects often treats the as-is process model simply as a brainstorming tool. This approach relies heavily on the intuition of the participants and misses a clear description of the quality requirements. Against this background, we identify four generic categories of business process quality and populate them with quality requirements from related research. We refer to the resulting framework as the Quality of Business Process (QoBP) framework. Furthermore, we present the findings from applying the QoBP framework in a case study with a major Australian bank, showing that it helps to systematically fill the white space between as-is and to-be process modeling.