236 results for MATCH
Abstract:
As organizations reach higher levels of business process management maturity, they often find themselves maintaining very large process model repositories, representing valuable knowledge about their operations. A common practice within these repositories is to create new process models, or extend existing ones, by copying and merging fragments from other models. We contend that if these duplicate fragments, a.k.a. exact clones, can be identified and factored out as shared subprocesses, the repository's maintainability can be greatly improved. With this purpose in mind, we propose an indexing structure to support fast detection of clones in process model repositories. Moreover, we show how this index can be used to efficiently query a process model repository for fragments. This index, called RPSDAG, is based on a novel combination of a method for process model decomposition (namely the Refined Process Structure Tree) with established graph canonization and string matching techniques. We evaluated the RPSDAG with large process model repositories from industrial practice. The experiments show that a significant number of non-trivial clones can be efficiently found in such repositories, and that fragment queries can be handled efficiently.
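To make the indexing idea concrete, the following minimal sketch (an illustration under simplifying assumptions, not the paper's RPSDAG implementation) detects exact clones by mapping each fragment to a canonical string and grouping fragments that share a code. It assumes uniquely labelled nodes; the paper instead combines RPST decomposition with full graph canonization.

```python
from collections import defaultdict

def canonical_code(nodes, edges):
    """Canonical string for a fragment graph. Assumes node labels are
    unique, so sorting labels yields a canonical node order; the paper
    uses proper graph canonization to handle the general case."""
    order = {n: i for i, n in enumerate(sorted(nodes))}
    edge_key = sorted((order[a], order[b]) for a, b in edges)
    return "|".join(sorted(nodes)) + ";" + ",".join(f"{a}->{b}" for a, b in edge_key)

index = defaultdict(list)  # canonical code -> occurrences (model id, fragment id)

def insert_fragment(model_id, frag_id, nodes, edges):
    index[canonical_code(nodes, edges)].append((model_id, frag_id))

def exact_clones():
    """Fragments whose canonical codes collide are exact clones."""
    return {code: occ for code, occ in index.items() if len(occ) > 1}
```

Because lookups are hash-based, inserting a fragment and querying for its duplicates both run in time roughly linear in the fragment's size rather than in the size of the repository.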
Abstract:
The monogeneric family Fergusoninidae consists of gall-forming flies that, together with Fergusobia (Tylenchida: Neotylenchidae) nematodes, form the only known mutualistic association between insects and nematodes. In this study, the entire 16,000 bp mitochondrial genome of Fergusonina taylori Nelson and Yeates was sequenced. The circular genome contains one encoding region including 27 genes and one non-coding A+T-rich region. The arrangement of the protein-coding, ribosomal RNA (rRNA) and transfer RNA (tRNA) genes was the same as that found in the ancestral insect. Nucleotide composition is highly A+T biased. All of the protein initiation codons are ATN, except for nad1, which begins with TTT. All 22 tRNA anticodons of F. taylori match those observed in Drosophila yakuba, and all form the typical cloverleaf structure except for tRNA-Ser (AGN), which lacks a dihydrouridine (DHU) arm. Secondary structural features of the rRNA genes of Fergusonina are similar to those proposed for other insects, with minor modifications. The mitochondrial genome of Fergusonina presented here may prove valuable for resolving the sister group to the Fergusoninidae, and expands the available mtDNA data sources for acalyptrates overall.
Abstract:
Safety culture is a concept that has long been accepted in high-risk industries such as aviation, nuclear and mining; however, considerable research is now also being undertaken within the construction sector. This paper discusses three recent interlocked projects undertaken in the Australian construction industry. The first project examined the development and implementation of a safety competency framework targeted at safety critical positions (SCPs) across first-tier construction organisations. Combining qualitative and quantitative methods, the project: developed a matrix of SCPs (n=11) and safety management tasks (SMTs; n=39); mapped the process steps for their acquisition and development; detailed the knowledge, skills and behaviours required for all SMTs; and outlined potential organisational cultural outcomes from a successful implementation of the framework. The second project extended this research to develop behavioural guidelines for leaders to drive safety culture change down to second-tier companies and to assist them to customise their own competency framework and implementation guidelines to match their aspirations and resources. The third interlocked project explored the use of safety effectiveness indicators (SEIs) as an industry-relevant assessment tool for reducing risk on construction sites. With direct linkages to safety competencies and SMTs, the SEIs are the next step towards an integrated safety culture approach and extend the concept of positive performance indicators (PPIs) by providing a valid, reliable and user-friendly measurement platform. Taken together, the results of the interlocked projects suggest that industry-engaged collaborative safety culture research has many potential benefits for the construction industry.
Abstract:
In this video, a male voice recites a script composed entirely of jokes. Words flash on screen in time with the spoken words. Sometimes the two sets of words match, and sometimes they differ. This work examines processes of signification. It emphasizes disruption and disconnection as fundamental and generative operations in making meaning. Building on post-structural and deconstructionist ideas, this work questions the relationship between written and spoken words. By deliberately confusing the signifying structures of jokes and narratives, it questions the sites and mechanisms of comprehension, humour and signification.
Abstract:
In this video, a male voice recites a teenage love poem. Words flash on screen in time with the spoken words. Sometimes the two sets of words match, and sometimes they differ. This work examines processes of signification. It emphasizes disruption and disconnection as fundamental and generative operations in making meaning. Building on post-structural and deconstructionist ideas, this work questions the relationship between written and spoken words. By actively disrupting the sincerity of a teenage love poem, it questions the sites and mechanisms of comprehension, poetry and signification.
Abstract:
In Harold Pinter's No Man's Land, four Englishmen find themselves trapped in a strange sort of limbo in the library of a well-to-do house, with musty old books, far too many bottles of beer, wine and spirits, dusty lamps and memories that never quite match up.
Abstract:
Transnational Organised Crime (TOC) has become a focal point for a range of private and public stakeholders. While not a new phenomenon, TOC has rapidly expanded its activities and interests, and its increasingly complex structures and ability to maximise opportunity by employing new technologies at a rate impossible for law enforcement to match complicate law enforcement's ability to develop strategies to detect, disrupt, prevent and investigate it. In an age where the role of police has morphed from simplistic response and enforcement activities to one of managing human security risk, it is argued that intelligence can be used to reduce the impact of strategic surprise from evolving criminal threats and environmental change. This review focuses specifically on research that has implications for strategic intelligence and strategy setting in a TOC context. The review findings suggest that the current law enforcement intelligence literature focuses narrowly on the management concept of intelligence-led policing in a tactical, operational setting. As such, the review identifies central issues surrounding strategic intelligence and highlights key questions that future research agendas must address to improve strategic intelligence outcomes, particularly in the fight against TOC.
Abstract:
A practical method for the design of dual-band decoupling and matching networks (DMN) for two closely spaced antennas using discrete components is presented. The DMN reduces the port-to-port coupling and enhances the diversity of the antennas. By applying the DMN, the radiation efficiency can also be improved when one port is fed and the other port is match-terminated. The proposed DMN works at two frequencies simultaneously without the need for any switch. As a proof of concept, a dual-band DMN for a pair of monopoles spaced 0.05λ apart is designed. The measured return loss and port isolation exceed 10 dB from 1.71 GHz to 1.76 GHz and from 2.27 GHz to 2.32 GHz.
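For context, the reported figures can be read through the standard S-parameter definitions; this small sketch (textbook RF formulas, not anything specific to the paper's DMN design) converts measured complex S-parameters into return loss and port isolation in dB:

```python
import numpy as np

def return_loss_db(s11):
    """Return loss in dB from the complex reflection coefficient S11."""
    return -20.0 * np.log10(np.abs(s11))

def isolation_db(s21):
    """Port-to-port isolation in dB from the complex transmission coefficient S21."""
    return -20.0 * np.log10(np.abs(s21))

# The paper's >10 dB criterion corresponds to |S11| < ~0.316 and
# |S21| < ~0.316 across 1.71-1.76 GHz and 2.27-2.32 GHz.
```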
Abstract:
Time-varying bispectra, computed using a classical sliding-window short-time Fourier approach, are analyzed for scalp EEG potentials evoked by an auditory stimulus, and new observations are presented. A single, short-duration tone is presented from the left or the right, the direction unknown to the test subject. The subject responds by moving the eyes in the direction of the sound. EEG epochs sampled at 200 Hz for repeated trials are processed between -70 ms and +1200 ms with reference to the stimulus. It is observed that for an ensemble of correctly recognized cases, the best-matching time-varying bispectra at (8 Hz, 8 Hz) are for PZ-FZ channels, and this is also largely the case for grand averages but not for power spectra at 8 Hz. Out of 11 subjects, the only exception to the time-varying bispectral match was a subject with a family history of Alzheimer's disease, and the difference was in bicoherence, not biphase.
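As a rough illustration of the analysis pipeline, the sketch below (a generic reimplementation of the classical sliding-window estimator, not the authors' code; the window length and step are arbitrary choices) estimates a time-varying bispectrum at a chosen bifrequency from an ensemble of EEG epochs:

```python
import numpy as np

def sliding_bispectrum(epochs, fs, f1, f2, win=64, step=8):
    """Time-varying bispectrum at bifrequency (f1, f2): a Hann-windowed
    FFT slides across each epoch, and the triple product
    X(f1) * X(f2) * conj(X(f1 + f2)) is averaged across trials."""
    n_trials, n_samples = epochs.shape
    window = np.hanning(win)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    i1 = np.argmin(np.abs(freqs - f1))          # nearest FFT bins
    i2 = np.argmin(np.abs(freqs - f2))
    i12 = np.argmin(np.abs(freqs - (f1 + f2)))
    starts = np.arange(0, n_samples - win + 1, step)
    B = np.zeros(len(starts), dtype=complex)
    for k, s in enumerate(starts):
        X = np.fft.rfft(window * epochs[:, s:s + win], axis=1)
        B[k] = np.mean(X[:, i1] * X[:, i2] * np.conj(X[:, i12]))
    return starts / fs, B  # window onset times (s), complex bispectrum

# Example: 200 Hz epochs spanning -70..+1200 ms (254 samples), as in the study
# t, B = sliding_bispectrum(epochs, fs=200, f1=8.0, f2=8.0)
# np.abs(B) gives a bicoherence-like magnitude; np.angle(B) the biphase
```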
Abstract:
The history of political blogging in Australia does not entirely match the development of blogospheres in other countries. Even at its beginning, blogging was not an entirely alternative endeavour – one of the first news or political blogs was Margo Kingston's Webdiary, hosted by the Sydney Morning Herald. In the United States, whose political blogosphere has been examined most comprehensively in the literature (see e.g. Adamic & Glance, 2005; Drezner & Farrell, 2008; Shaw & Benkler, 2012; Tremayne, 2007; Wallsten, 2008), blogging had a clear historical trajectory from alternative to mainstream medium. The Australian blogosphere, by contrast, has seen early and continued involvement from representatives of the mainstream media, blogging both for their employers and independently (Garden, 2010). Coupled with the incorporation of blog-like technologies into news websites, as well as with obvious differences in the size of the available talent pool and potential audience for political blogging in Australia, this recognition of blogging by the mainstream media may be one reason why, in political and news discussions at least, Australian bloggers did not bring about their own local equivalents to the resignations of Dan Rather or Trent Lott in the U.S. – events which were commonly attributed in part to the work of bloggers (Simons, 2007). However, the acceptance of the blogging concept by the mainstream media has been accompanied by a comparative lack of acceptance towards individual bloggers. Analyses and commentary published by bloggers have been attacked by journalists, creating an at times antagonistic relationship between the mainstream media and bloggers (Flew & Wilson, 2010; Young, 2011). In this article, we examine the historical development of blogging in Australia, focussing primarily on political and news blogs. In particular, we review who the bloggers are and how the connections between different blogs and other titles have changed over the past decade. The paper tracks the evolution of individual and group blogs, independent and mainstream media-hosted opinion sites, and the gradual convergence of these platforms and their associated contributing authors. We conclude by examining the current state of the Australian blogosphere and its likely future development, taking into account the rise of social media, and in particular Twitter, as additional spaces for public commentary.
Abstract:
Cities have long held a fascination for people – as they grow and develop, there is a desire to know and understand the intricate interplay of elements that makes cities ‘live’. In part, this is a need for even greater efficiency in urban centres, yet the underlying quest is for a sustainable urban form. In order to make sense of the complex entities that we recognise cities to be, they have been compared to buildings, organisms and, more recently, machines. However, the search for better and more elegant urban centres is hardly new: healthier and more efficient settlements were the aim of Modernism’s rational sub-division of functions, which has been translated into horizontal distribution through zoning, or vertical organisation through high-rise developments. However, both of these approaches have been found to be unsustainable, as too many resources are required to maintain this kind of urbanisation, and the social consequences of either horizontal or vertical isolation must also be considered. From being absolute consumers of resources, of energy and of technology, cities need to change, to become sustainable, in order to be more resilient and more efficient in supporting culture and society as well as economy. Our urban centres need to be re-imagined, re-conceptualised and re-defined to match our changing society. One approach is to re-examine the compartmentalised, mono-functional approach of urban Modernism and to begin to investigate cities as ecologies, where every element supports and incorporates another, fulfilling more than just one function. This manner of seeing the city suggests a framework to guide the re-mixing of urban settlements. Beginning to understand the relationships between supporting elements and the nature of the connecting ‘web’ offers an invitation to investigate the often ignored, remnant spaces of cities. This ‘negative space’ is the residual from which space and place are carved out in the Contemporary city, providing the link between elements of urban settlement. Like all successful ecosystems, cities need to evolve and change over time in order to respond effectively to different lifestyles, developments in culture and society, and environmental challenges. This paper seeks to investigate the role that negative space could have in the reorganisation of the re-mixed city. The space ‘in-between’ is analysed as an opportunity for infill development or re-development which provides the urban settlement with the variety that is a prerequisite for ecosystem resilience. An analysis of the urban form is suggested as an empirical tool to map the opportunities already present in the urban environment, and negative space is evaluated as a key element in achieving a positive development able to distribute diverse environmental and social facilities in the city.
Abstract:
Learning and then recognizing a route, whether travelled during the day or at night, in clear or inclement weather, and in summer or winter, is a challenging task for state-of-the-art algorithms in computer vision and robotics. In this paper, we present a new approach to visual navigation under changing conditions, dubbed SeqSLAM. Instead of calculating the single location most likely given a current image, our approach calculates the best candidate matching location within every local navigation sequence. Localization is then achieved by recognizing coherent sequences of these “local best matches”. This approach removes the need for global matching performance by the vision front-end; instead, it must only pick the best match within any short sequence of images. The approach is applicable over environment changes that render traditional feature-based techniques ineffective. Using two car-mounted camera datasets, we demonstrate the effectiveness of the algorithm and compare it to one of the most successful feature-based SLAM algorithms, FAB-MAP. The perceptual change in the datasets is extreme: repeated traverses through environments during the day and then in the middle of the night, at times separated by months or years and in opposite seasons, and in clear weather and extremely heavy rain. While the feature-based method fails, the sequence-based algorithm is able to match trajectory segments at 100% precision with recall rates of up to 60%.
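A minimal sketch of the core idea follows (simplified from the published algorithm: it assumes matched travel speeds, searches only a single trajectory slope, and omits SeqSLAM's local contrast enhancement of the image-difference matrix):

```python
import numpy as np

def best_sequence_match(diff_matrix, ds=10):
    """Find the database location whose local sequence best matches the
    last ds query frames. diff_matrix[i, j] holds the difference between
    database image i and query image j; lower means more similar."""
    n_db, n_query = diff_matrix.shape
    cols = np.arange(n_query - ds, n_query)      # most recent query frames
    best_loc, best_score = None, np.inf
    for start in range(n_db - ds + 1):
        rows = np.arange(start, start + ds)      # constant-velocity trajectory
        score = diff_matrix[rows, cols].sum()    # sum along the diagonal line
        if score < best_score:
            best_loc, best_score = start + ds - 1, score
    return best_loc, best_score
```

The key design choice the abstract describes is visible here: no single image comparison needs to be globally discriminative; only the summed score over a coherent sequence must beat the alternatives.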
Abstract:
Even though titanium dioxide photocatalysis has been promoted as a leading green technology for water purification, many issues have hindered its application on a large commercial scale. For the materials scientist, the main issues have centred on the synthesis of more efficient materials and the investigation of degradation mechanisms; for the engineer, the main issues have been the development of appropriate models and the evaluation of intrinsic kinetics parameters that allow the scale-up or re-design of efficient large-scale photocatalytic reactors. In order to obtain intrinsic kinetics parameters, the reaction must be analysed and modelled considering the influence of the radiation field, pollutant concentrations and fluid dynamics. In this way, the obtained kinetic parameters are independent of the reactor size and configuration and can subsequently be used for scale-up purposes or for the development of entirely new reactor designs. This work investigates the intrinsic kinetics of phenol degradation over a titania film, owing to the practicality of a fixed-film configuration over a slurry. A flat plate reactor was designed so that reaction parameters including the UV irradiance, flow rates, pollutant concentration and temperature could be controlled. Particular attention was paid to the investigation of the radiation field over the reactive surface and to the issue of mass transfer limited reactions. The ability of different emission models to describe the radiation field was investigated and compared to actinometric measurements. The RAD-LSI model was found to give the best predictions over the conditions tested. Mass transfer issues often limit fixed-film reactors. The influence of this phenomenon was investigated with specifically planned sets of benzoic acid experiments and with the adoption of the stagnant film model. The phenol mass transfer coefficient in the system was calculated to be k_m,phenol = 8.5815 x 10^-7 Re^0.65 (m s^-1). The data obtained from a wide range of experimental conditions, together with an appropriate model of the system, enabled determination of intrinsic kinetic parameters. The experiments were performed at four different irradiation levels (70.7, 57.9, 37.1 and 20.4 W m^-2), combined with three different initial phenol concentrations (20, 40 and 80 ppm), to give a wide range of final pollutant conversions (from 22% to 85%). The simple model adopted was able to fit the wide range of conditions with only four kinetic parameters: two reaction rate constants (one for phenol and one for the family of intermediates) and their corresponding adsorption constants. The intrinsic kinetic parameter values were k_ph = 0.5226 mmol m^-1 s^-1 W^-1, k_I = 0.120 mmol m^-1 s^-1 W^-1, K_ph = 8.5 x 10^-4 m^3 mmol^-1 and K_I = 2.2 x 10^-3 m^3 mmol^-1. The flat plate reactor allowed the reaction to be investigated under two different light configurations: liquid-side and substrate-side illumination. The latter is of particular interest for real-world applications, where light absorption due to turbidity and pollutants contained in the water stream to be treated could represent a significant issue. The two light configurations allowed the investigation of the effects of film thickness and the determination of the optimal catalyst thickness.
The experimental investigation confirmed the predictions of a porous medium model developed to investigate the influence of diffusion, advection and photocatalytic phenomena inside the porous titania film, with the optimal thickness identified at 5 μm. The model used the intrinsic kinetic parameters obtained from the flat plate reactor to predict the influence of thickness and transport phenomena on the final observed phenol conversion without using any correction factor; the excellent match between predictions and experimental results provided further proof of the quality of the parameters obtained with the proposed method.
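The reported mass transfer correlation is straightforward to apply; the sketch below encodes it directly, and also includes a hypothetical Langmuir-Hinshelwood-type rate expression built from the reported parameters. The abstract names the rate and adsorption constants but not the exact rate law, so that functional form is an assumption for illustration only.

```python
def phenol_mass_transfer_coeff(reynolds):
    """Empirical correlation reported for the flat plate reactor:
    k_m,phenol = 8.5815e-7 * Re^0.65  (m/s)."""
    return 8.5815e-7 * reynolds ** 0.65

def phenol_rate(irradiance, c_ph, c_int,
                k_ph=0.5226, K_ph=8.5e-4, K_int=2.2e-3):
    """Hypothetical Langmuir-Hinshelwood-type rate using the reported
    parameters (assumed form, not stated in the abstract):
        r = k_ph * I * K_ph * c_ph / (1 + K_ph*c_ph + K_int*c_int)
    with I in W m^-2 and concentrations in mmol m^-3."""
    return k_ph * irradiance * K_ph * c_ph / (1.0 + K_ph * c_ph + K_int * c_int)

# e.g. at Re = 1000 the correlation gives k_m ≈ 7.7e-5 m/s
```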
Abstract:
We introduce the concept of Revocable Predicate Encryption (RPE), which extends the predicate encryption setting with revocation support: private keys can be used to decrypt an RPE ciphertext only if they match the decryption policy (defined via attributes encoded into the ciphertext and predicates associated with private keys) and were not revoked by the time the ciphertext was created. We formalize the notion of attribute hiding in the presence of revocation and propose an RPE scheme, called AH-RPE, which achieves attribute hiding under the Decision Linear assumption in the standard model. We then present a stronger privacy notion, termed full hiding, which additionally protects the privacy of revoked users. We propose another RPE scheme, called FH-RPE, which adopts the Subset Cover Framework and offers full hiding under the Decision Linear assumption in the standard model. The scheme offers very flexible privacy-preserving access control to encrypted data and can be used in sender-local revocation scenarios.
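Setting the cryptography aside, the decryption policy that both schemes enforce can be sketched as plain logic (a conceptual model only; actual RPE hides the attributes and revocation status cryptographically, under the Decision Linear assumption):

```python
from dataclasses import dataclass
from typing import Callable, FrozenSet

@dataclass(frozen=True)
class Ciphertext:
    attributes: FrozenSet[str]                 # attributes encoded at encryption
    revoked_at_encryption: FrozenSet[str]      # user ids revoked when ct was made
    payload: bytes

@dataclass(frozen=True)
class PrivateKey:
    user_id: str
    predicate: Callable[[FrozenSet[str]], bool]  # predicate bound to the key

def decrypt(ct: Ciphertext, sk: PrivateKey):
    """Decryption succeeds iff the key's predicate accepts the ciphertext's
    attributes AND the key holder was not revoked at encryption time."""
    if sk.user_id in ct.revoked_at_encryption:
        return None
    if not sk.predicate(ct.attributes):
        return None
    return ct.payload
```

In the real schemes a failed decryption reveals nothing about which of the two conditions failed, which is exactly what the attribute-hiding and full-hiding notions formalize.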
Abstract:
Background: Cohort studies can provide valuable evidence of cause-and-effect relationships but are subject to loss of participants over time, limiting the validity of findings. Computerised record linkage offers a passive and ongoing method of obtaining health outcomes from existing routinely collected data sources. However, the quality of record linkage relies upon the availability and accuracy of common identifying variables. We sought to develop and validate a method for linking a cohort study to a state-wide hospital admissions dataset with limited availability of unique identifying variables. Methods: A sample of 2000 participants from a cohort study (n = 41,514) was linked to a state-wide hospitalisations dataset in Victoria, Australia, using the national health insurance (Medicare) number and demographic data as identifying variables. Availability of the health insurance number was limited in both datasets; therefore linkage was undertaken both with and without use of this number, and agreement was tested between both algorithms. Sensitivity was calculated for a sub-sample of 101 participants with a hospital admission confirmed by medical record review. Results: Of the 2000 study participants, 85% were found to have a record in the hospitalisations dataset when the national health insurance number and sex were used as linkage variables, and 92% when demographic details only were used. When agreement between the two methods was tested, the disagreement fraction was 9%, mainly due to "false positive" links when demographic details only were used. A final algorithm that used multiple combinations of identifying variables resulted in a match proportion of 87%. Sensitivity of this final linkage was 95%. Conclusions: High-quality record linkage of cohort data with a hospitalisations dataset that has limited identifiers can be achieved using combinations of a national health insurance number and demographic data as identifying variables.
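A hedged sketch of such a linkage strategy follows; the specific key combinations are illustrative assumptions, not the study's published algorithm, but they show how fallback combinations of an insurance number and demographic variables can be tried in order of reliability:

```python
def link_record(cohort_rec, hospital_index):
    """Deterministic linkage with fallback key combinations.
    cohort_rec: dict of identifying fields for one cohort participant.
    hospital_index: dict mapping (fields, values) keys to hospital record ids,
    assumed to be prebuilt from the hospitalisations dataset per key set."""
    key_sets = [
        ("medicare_no", "sex"),                        # strongest identifiers
        ("medicare_no", "birth_date"),
        ("surname", "birth_date", "sex", "postcode"),  # demographics only
    ]
    for fields in key_sets:
        values = tuple(cohort_rec.get(f) for f in fields)
        if any(v is None for v in values):
            continue  # identifier unavailable; try the next combination
        match = hospital_index.get((fields, values))
        if match is not None:
            return match
    return None  # unlinked: no key combination produced a match
```

Ordering the key sets from most to least specific mirrors the study's finding that demographics-only linkage inflates the match rate with false positives, so stronger identifiers should be exhausted first.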