255 results for invalid match


Relevance: 10.00%

Abstract:

This paper investigates the effect of topic dependent language models (TDLMs) on phonetic spoken term detection (STD) using dynamic match lattice spotting (DMLS). Phonetic STD consists of two steps: indexing and search. The accuracy of indexing audio segments into phone sequences using phone recognition methods directly affects the accuracy of the final STD system. If the topic of a document is known, recognizing the spoken words and indexing them into an intermediate representation is an easier task, and consequently, detecting a search word in the document will be more accurate and robust. In this paper, we propose the use of TDLMs in the indexing stage to improve the accuracy of STD in situations where the topic of the audio document is known in advance. It is shown that using TDLMs instead of the traditional general language model (GLM) improves STD performance according to the figure of merit (FOM) criterion.
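
As a rough illustration of the two-step pipeline the abstract describes, the sketch below re-ranks phone-recognition hypotheses with a topic-dependent bigram language model before indexing, and searches the index with an edit-distance tolerance as a very loose stand-in for dynamic match lattice spotting. The function names, smoothing scheme and toy data are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch: topic-dependent LM for indexing, edit-distance search.
from collections import defaultdict
import math

def train_bigram_lm(phone_sequences, alpha=0.1):
    """Add-alpha smoothed bigram LM over phone symbols (the TDLM or GLM)."""
    counts = defaultdict(lambda: defaultdict(float))
    vocab = set()
    for seq in phone_sequences:
        for a, b in zip(["<s>"] + seq, seq + ["</s>"]):
            counts[a][b] += 1
            vocab.update((a, b))
    V = len(vocab)

    def logprob(seq):
        lp = 0.0
        for a, b in zip(["<s>"] + seq, seq + ["</s>"]):
            total = sum(counts[a].values())
            lp += math.log((counts[a][b] + alpha) / (total + alpha * V))
        return lp

    return logprob

def index_segment(hypotheses, lm):
    """Indexing step: keep the phone-sequence hypothesis the LM prefers."""
    return max(hypotheses, key=lm)

def edit_distance(a, b):
    """Standard Levenshtein distance between two phone sequences."""
    dp = list(range(len(b) + 1))
    for i, pa in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, pb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (pa != pb))
    return dp[-1]

def search(indexed_phones, query_phones, max_cost=1):
    """Search step: report positions where the query nearly matches the index."""
    n = len(query_phones)
    return [i for i in range(len(indexed_phones) - n + 1)
            if edit_distance(indexed_phones[i:i + n], query_phones) <= max_cost]

# Toy usage: the topic LM is trained on phone sequences from documents of the
# known topic and used in place of a general LM during indexing.
tdlm = train_bigram_lm([["k", "ae", "t"], ["k", "ae", "b"]])
indexed = index_segment([["k", "ae", "t"], ["g", "ae", "t"]], tdlm)
print(search(indexed, ["k", "ae", "p"], max_cost=1))
```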

Relevance: 10.00%

Abstract:

Despite rising levels of safe-sex knowledge in Australia, sexually transmitted infection notifications continue to increase. A culture-centred approach suggests that, in attempting to reach a target population, it is useful first to understand their perspective on the issues. Twenty focus groups were conducted with 89 young people between the ages of 14 and 16 years. Key findings suggest that scientific information does not articulate closely with everyday practice, that young people get the message that sex is bad, that they should not be preparing for it, and that it is not appropriate to talk about sex. Understanding how young people think about these issues is particularly important because the focus groups also found that young people disengage from sources of information that do not match their own experiences.

Relevance: 10.00%

Abstract:

Empirical evidence shows that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication arises, for example, when the repository covers multiple variants of the same processes, or as a result of copy-pasting. Previous work has addressed the problem of efficiently retrieving exact clones that can be refactored into shared subprocess models. This article studies the broader problem of approximate clone detection in process models. The article proposes techniques for detecting clusters of approximate clones based on two well-known clustering algorithms: DBSCAN and Hierarchical Agglomerative Clustering (HAC). The article also defines a measure of standardizability of an approximate clone cluster, that is, the potential benefit of replacing the approximate clones with a single standardized subprocess. Experiments show that both techniques, in conjunction with the proposed standardizability measure, accurately retrieve clusters of approximate clones that originate from copy-pasting followed by independent modifications to the copied fragments. Additional experiments show that both techniques produce clusters that match those produced by human subjects and that are perceived to be standardizable.
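
The sketch below illustrates only the clustering step, assuming a pairwise dissimilarity matrix between process-model fragments is already available; a Jaccard distance over activity labels stands in for the graph-edit-distance-style measure used in the article, and the fragment names and 0.5 threshold are illustrative.

```python
# Minimal sketch: cluster approximate clones with DBSCAN and HAC over a
# precomputed dissimilarity matrix (placeholder label-based distance).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform
from sklearn.cluster import DBSCAN

def label_distance(labels_a, labels_b):
    """Placeholder dissimilarity: 1 - Jaccard similarity of activity labels."""
    a, b = set(labels_a), set(labels_b)
    return 1.0 - len(a & b) / len(a | b) if (a | b) else 0.0

fragments = {
    "F1": ["check order", "approve order", "ship goods"],
    "F2": ["check order", "approve order", "send invoice"],   # near-clone of F1
    "F3": ["register claim", "assess claim"],                  # unrelated
}
names = list(fragments)
D = np.array([[label_distance(fragments[x], fragments[y]) for y in names]
              for x in names])

# DBSCAN on the precomputed distances: eps is the maximum dissimilarity
# allowed between approximate clones placed in the same cluster.
dbscan_labels = DBSCAN(eps=0.5, min_samples=2, metric="precomputed").fit(D).labels_

# Hierarchical agglomerative clustering (HAC) on the same matrix, cut at the
# same dissimilarity threshold.
hac_labels = fcluster(linkage(squareform(D), method="average"),
                      t=0.5, criterion="distance")

print(dict(zip(names, dbscan_labels)), dict(zip(names, hac_labels)))
```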

Relevance: 10.00%

Abstract:

Silver nanoparticles with identical plasmonic properties but different surface functionalities are synthesized and tested as chemically selective surface-enhanced resonance Raman (SERR) amplifiers in a two-component protein solution. The surface plasmon resonances of the particles are tuned to 413 nm to match the molecular resonance of protein heme cofactors. Biocompatible functionalization of the nanoparticles with a thin film of chitosan yields selective SERR enhancement of the anionic protein cytochrome b5, whereas functionalization with SiO2 amplifies only the spectra of the cationic protein cytochrome c. As a result, subsequent addition of the two differently functionalized particles yields complementary information on the same mixed protein sample solution. Finally, the applicability of chitosan-coated Ag nanoparticles for protein separation was tested by in situ resonance Raman spectroscopy.

Relevance: 10.00%

Abstract:

Did SBS chief executive Michael Ebeid score a well-timed free kick or an own goal in his attack on the ABC this week? The ABC recently secured the free-to-air television rights for the Asian Cup football tournament to be held in Australia early next year, together with tonight’s match between the Socceroos and Japan. A lower bid by SBS – still in some circles fondly known as the “Soccer Broadcasting Service” – was rejected, dealing a significant blow to the smaller public broadcaster. The ABC was reportedly asked to make a bid by Football Federation Australia. The FFA presumably believes the ABC’s coverage will attract larger audiences to the game. This is despite SBS’s long-term success with the sport. It should not be forgotten, however, that while SBS has largely been defined by its long connection with the world game, the ABC was the home of football from the late 1950s until the 1980s. But the stoush is only partly about football. It was surely no coincidence that it came on the eve of the government’s formal announcement of the size of the cuts to public broadcasting...

Relevance: 10.00%

Abstract:

Many insect clades, especially within the Diptera (true flies), have been considered classically ‘Gondwanan’, with an inference that distributions derive from vicariance of the southern continents. Assessing the role that vicariance has played in the evolution of austral taxa requires testing the location and tempo of diversification and speciation against the well-established predictions of fragmentation of the ancient super-continent. Several early (anecdotal) hypotheses that current austral distributions originate from the breakup of Gondwana derive from studies of taxa within the family Chironomidae (non-biting midges). With the advent of molecular phylogenetics and biogeographic analytical software, these studies have been revisited and expanded to test such conclusions better. Here we studied the midge genus Stictocladius Edwards, from the subfamily Orthocladiinae, which contains austral-distributed clades that match vicariance-based expectations. We resolve several issues of systematic relationships among morphological species and reveal cryptic diversity within many taxa. Time-calibrated phylogenetic relationships among taxa accorded partially with the predicted tempo from geology. For these apparently vagile insects, vicariance-dated patterns persist for South America and Australia. However, as often found, divergence time estimates for New Zealand at c. 50 mya post-date separation of Zealandia from Antarctica and the remainder of Gondwana, but predate the proposed Oligocene ‘drowning’ of these islands. We detail other such ‘anomalous’ dates and suggest a single common explanation rather than stochastic processes. This could involve synchronous establishment following recovery from ‘drowning’ and/or deleterious warming associated with the mid-Eocene climatic optimum (hence ‘waving’, which refers to cycles of drowning events), plus the new availability of topography providing cool running waters, or all these factors in combination. Alternatively, a vicariance explanation remains available, given the uncertain duration of connectivity of Zealandia to Australia–Antarctic–South America via the Lord Howe and Norfolk ridges into the Eocene.

Relevance: 10.00%

Abstract:

As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but also to set out to become dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services in their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, and one other, The Courier Mail (recognising the diversification of types of product in this field by including it as a representative of Newscorp, now a major participant). The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000 as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities among themselves; and finally one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were devised through a preliminary examination of the products over three days in the week before.
That process identified significant elements of media production, such as: variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of top news stories of the day; and the presence of standard news values, once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables. This device, studying the two sets of publications by like standards (essentially production values and news values), has enabled the comparisons to be made. The comparison of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth, and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product for each day. When the sets of comparisons outlined above are noted, the process then becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, does provide evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change, and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore, well themed, and obviously sustained by public commitment and habituation to diversified online and mobile media services. The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes.
It should also be useful as a reference for further inquiries of the same kind that might be made, and for monitoring the situation with regard to newspaper portals online into the future.

Relevance: 10.00%

Abstract:

Objectives: Directly measuring disease incidence in a population is difficult and not feasible to do routinely. We describe the development and application of a new method of estimating, at a population level, the number of incident genital chlamydia infections, and the corresponding incidence rates, by age and sex, using routine surveillance data. Methods: A Bayesian statistical approach was developed to calibrate the parameters of a decision-pathway tree against national data on numbers of notifications and tests conducted (2001-2013). Independent beta probability density functions were adopted as priors on the time-independent parameters; the shape parameters of these beta distributions were chosen to match prior estimates sourced from peer-reviewed literature or expert opinion. To best facilitate the calibration, multivariate Gaussian priors on (the logistic transforms of) the time-dependent parameters were adopted, using the Matérn covariance function to favour smooth changes across consecutive years and adjacent age cohorts. The model outcomes were validated by comparing them with other independent empirical epidemiological measures, i.e. prevalence and incidence as reported by other studies. Results: Model-based estimates suggest that the total number of people acquiring chlamydia per year in Australia has increased by ~120% over 12 years. Nationally, an estimated 356,000 people acquired chlamydia in 2013, which is 4.3 times the number of reported diagnoses. This corresponded to a chlamydia annual incidence estimate of 1.54% in 2013, increased from 0.81% in 2001 (~90% increase). Conclusions: We developed a statistical method which uses routine surveillance (notifications and testing) data to produce estimates of the extent of, and trends in, chlamydia incidence.
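
The sketch below illustrates the prior construction only (not the decision-pathway calibration itself), with assumed parameter names and values: a beta prior matched to an external point estimate for a time-independent parameter, and a multivariate Gaussian prior with a Matérn covariance over years on the logistic transform of a time-dependent parameter.

```python
# Sketch of the two kinds of priors described above; all numbers are
# illustrative assumptions, not values from the paper.
import numpy as np
from scipy.special import expit, logit
from scipy.stats import beta
from sklearn.gaussian_process.kernels import Matern

# Time-independent parameter (e.g. test sensitivity): choose beta shape
# parameters so the prior mean matches an assumed external estimate of 0.9.
prior_mean, prior_strength = 0.9, 50.0          # illustrative values
a, b = prior_mean * prior_strength, (1 - prior_mean) * prior_strength
sensitivity_prior = beta(a, b)

# Time-dependent parameter (e.g. annual testing probability for one age/sex
# cohort): correlated across consecutive years via a Matern covariance.
years = np.arange(2001, 2014, dtype=float).reshape(-1, 1)
K = Matern(length_scale=3.0, nu=1.5)(years) + 1e-9 * np.eye(len(years))
logit_prior_mean = np.full(len(years), logit(0.10))   # assumed ~10% tested
rng = np.random.default_rng(0)
prior_draw = expit(rng.multivariate_normal(logit_prior_mean, K))

print(sensitivity_prior.mean(), prior_draw.round(3))
```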

Relevance: 10.00%

Abstract:

Structural damage detection using measured dynamic data for pattern recognition is a promising approach. These pattern recognition techniques utilize artificial neural networks and genetic algorithms to match pattern features. In this study, an artificial neural network–based damage detection method using frequency response functions is presented, which can effectively detect nonlinear damage for a given level of excitation. The main objective of this article is to present a feasible method for structural vibration–based health monitoring which reduces the dimension of the initial frequency response function data, transforms it into new damage indices, and employs an artificial neural network to detect different levels of nonlinearity using the damage patterns recognized by the proposed algorithm. Experimental data from the three-story bookshelf structure at Los Alamos National Laboratory are used to validate the proposed method. The results showed that the levels of nonlinear damage can be identified precisely by the developed artificial neural networks. Moreover, artificial neural networks trained with summation frequency response functions give more precise damage detection results than artificial neural networks trained with individual frequency response functions. The proposed method is therefore a promising tool for structural assessment of real structures, because it shows reliable results with experimental data for nonlinear damage detection, which renders the frequency response function–based method convenient for structural health monitoring.
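
A minimal sketch of the pipeline the abstract outlines is given below, with synthetic data standing in for the Los Alamos bookshelf measurements and an assumed architecture: the frequency response functions are reduced in dimension to form damage indices, and a small neural network classifies the level of nonlinearity.

```python
# Sketch: FRF dimension reduction into damage indices + ANN classification.
# Data, sizes and network architecture are assumptions for illustration.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_samples, n_freq_bins, n_levels = 300, 512, 4

# Synthetic stand-in for measured FRF magnitudes: each damage level slightly
# distorts a baseline spectrum, plus measurement noise.
levels = rng.integers(0, n_levels, n_samples)
base = np.abs(np.sin(np.linspace(0, 20, n_freq_bins)))
frf = (base + 0.05 * levels[:, None] * base**2
       + 0.02 * rng.standard_normal((n_samples, n_freq_bins)))

# Dimension reduction: principal components of the FRFs serve as damage indices.
indices = PCA(n_components=10).fit_transform(frf)

X_train, X_test, y_train, y_test = train_test_split(indices, levels,
                                                    random_state=0)
net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000,
                    random_state=0).fit(X_train, y_train)
print("held-out accuracy:", net.score(X_test, y_test))
```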

Relevance: 10.00%

Abstract:

We present a rigorous validation of the analytical Amadei solution for the stress concentration around an arbitrarily orientated borehole in general anisotropic elastic media. First, we revisit the theoretical framework of the Amadei solution and present analytical insights showing that the solution does indeed contain all special cases of symmetry, contrary to previous understanding, provided that the reduced strain coefficients β11 and β55 are not equal. It is shown from theoretical considerations and published experimental data that β11 and β55 are not equal for realistic rocks. Second, we develop a 3D finite-element elastic model within a hybrid analytical-numerical workflow that circumvents the need to rebuild and remesh the model for every borehole and material orientation. Third, we show that the borehole stresses computed from the numerical model and the analytical solution match almost perfectly for different borehole orientations (vertical, deviated and horizontal) and for several cases involving isotropic and transversely isotropic symmetries. It is concluded that the analytical Amadei solution is valid with no restrictions on the borehole orientation or elastic anisotropy symmetry.

Relevance: 10.00%

Abstract:

Functional MRI studies commonly refer to activation patterns as being localized in specific Brodmann areas, referring to Brodmann’s divisions of the human cortex based on cytoarchitectonic boundaries [3]. Typically, the Brodmann areas that match regions in the group-averaged functional maps are estimated by eye, leading to inaccurate parcellations and significant error. To avoid this limitation, we developed a method using high-dimensional nonlinear registration to project the Brodmann areas onto individual 3D co-registered structural and functional MRI datasets, using an elastic deformation vector field in the cortical parameter space. Based on a sulcal pattern matching approach [11], an N=27 scan single-subject atlas (the Colin Holmes atlas [15]), with associated Brodmann areas labeled on its surface, was deformed to match 3D cortical surface models generated from individual subjects’ structural MRIs (sMRIs). The deformed Brodmann areas were used to quantify and localize functional MRI (fMRI) BOLD activation during performance of the Tower of London task [7].
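
The sketch below illustrates, under simplifying assumptions, only the final projection step: once a deformation field mapping each subject voxel to atlas space is available, the atlas Brodmann-area labels can be pulled back onto the subject grid with nearest-neighbour interpolation. The sulcal pattern matching that produces the field is not shown, and the array names are hypothetical.

```python
# Sketch: apply a precomputed deformation field to an atlas label volume.
# Labels must use nearest-neighbour (order=0) interpolation, not linear.
import numpy as np
from scipy.ndimage import map_coordinates

def project_labels(atlas_labels, deformation):
    """atlas_labels: 3D integer array of Brodmann areas in atlas space.
    deformation: array of shape (3, X, Y, Z) giving, for each subject voxel,
    the corresponding (i, j, k) coordinate in atlas space."""
    return map_coordinates(atlas_labels, deformation, order=0, mode="nearest")

# Toy example: a 4x4x4 atlas projected through an identity deformation.
atlas = np.random.default_rng(0).integers(1, 48, size=(4, 4, 4))
identity_field = np.indices((4, 4, 4)).astype(float)
subject_labels = project_labels(atlas, identity_field)
print(np.array_equal(subject_labels, atlas))   # True for the identity mapping
```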

Relevance: 10.00%

Abstract:

The development of a protein-mediated, dual-functional affinity adsorption of plasmid DNA is described in this work. The affinity ligand for the plasmid DNA is a fusion protein comprising a zinc finger protein with glutathione-S-transferase (GST) as the fusion partner. The protein ligand is first bound to the adsorbent by affinity interaction between the GST moiety and glutathione that is covalently immobilized on the base matrix. The plasmid binding is then enabled via the zinc finger protein and a specific nucleotide sequence inserted into the DNA. At lower loadings, the binding of the DNA onto the Fractogel, Sepharose, and Streamline matrices was 0.0078 ± 0.0013, 0.0095 ± 0.0016, and 0.0080 ± 0.0006 mg, respectively, per 50 μL of adsorbent. At a higher DNA challenge, the corresponding amounts were 0.0179 ± 0.0043, 0.0219 ± 0.0035, and 0.0190 ± 0.0041 mg, respectively. The relatively constant amounts bound to the three adsorbents indicated that the large DNA molecule was unable to utilize the available zinc finger sites located in the internal pores and that binding was largely a surface adsorption phenomenon. Utilization of the zinc finger binding sites was shown to be highest for the Fractogel adsorbent. The adsorbed material was eluted with reduced glutathione, and the elution efficiency for the DNA was between 23% and 27%. The protein elution profile appeared to match the adsorption profiles, with significantly higher recoveries of bound GST-zinc finger protein.

Relevance: 10.00%

Abstract:

Due to their unobtrusive nature, vision-based approaches to tracking sports players have been preferred over wearable sensors, as they do not require the players to be instrumented for each match. Unfortunately, however, due to heavy occlusion between players and variation in resolution and pose, in addition to fluctuating illumination conditions, tracking players continuously is still an unsolved vision problem. For tasks like clustering and retrieval, having noisy data (i.e., missing and false player detections) is problematic as it generates discontinuities in the input data stream. One method of circumventing this issue is to use an occupancy map, where the field is discretised into a series of zones and a count of player detections in each zone is obtained. A series of frames can then be concatenated to represent a set-play or example of team behaviour. A problem with this approach, though, is that the compressibility is low (i.e., the variability in the feature space is incredibly high). In this paper, we propose the use of a bilinear spatiotemporal basis model with a role representation, which operates in a low-dimensional space, to clean up the noisy detections. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras, applied our method to approximately 200,000 frames of data from a state-of-the-art real-time player detector, and compared the results to manually labeled data.
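
A sketch of the occupancy-map representation described above is shown below: the pitch is discretised into a grid of zones, player detections are counted per zone per frame, and a window of frames is concatenated into a single feature vector. The pitch dimensions, grid size and detections are illustrative assumptions.

```python
# Sketch: occupancy-map features from (possibly noisy) player detections.
import numpy as np

PITCH_LENGTH, PITCH_WIDTH = 91.4, 55.0   # field-hockey pitch in metres
GRID_X, GRID_Y = 10, 6                   # number of zones along each axis

def occupancy_map(detections):
    """detections: iterable of (x, y) positions in pitch coordinates."""
    counts = np.zeros((GRID_X, GRID_Y), dtype=int)
    for x, y in detections:
        i = min(int(x / PITCH_LENGTH * GRID_X), GRID_X - 1)
        j = min(int(y / PITCH_WIDTH * GRID_Y), GRID_Y - 1)
        counts[i, j] += 1
    return counts

def window_feature(frames_of_detections):
    """Concatenate per-frame occupancy maps into one vector for a set-play."""
    return np.concatenate([occupancy_map(f).ravel() for f in frames_of_detections])

# Example: two frames of detections; missing or false detections simply change
# the counts, so the representation has no hard discontinuities.
frames = [[(10.0, 5.0), (45.0, 30.0)],
          [(11.0, 5.5), (46.0, 29.0), (80.0, 50.0)]]
print(window_feature(frames).shape)   # (2 * GRID_X * GRID_Y,) = (120,)
```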

Relevance: 10.00%

Abstract:

There have been substantial advances in small field dosimetry techniques and technologies over the last decade, which have dramatically improved the achievable accuracy of small field dose measurements. This educational note aims to help radiation oncology medical physicists to apply some of these advances in clinical practice. The evaluation of a set of small field output factors (total scatter factors) is used to exemplify a detailed measurement and simulation procedure, and as a basis for discussing the possible effects of simplifying that procedure. Field output factors were measured with an unshielded diode and a micro-ionisation chamber at the centre of a set of square fields defined by a micro-multileaf collimator. Nominal field sizes investigated ranged from 6×6 to 98×98 mm². Diode measurements in fields smaller than 30 mm across were corrected using response factors calculated from Monte Carlo simulations of the full diode geometry and daisy-chained to match micro-chamber measurements at intermediate field sizes. Diode measurements in fields smaller than 15 mm across were repeated twelve times over three separate measurement sessions, to evaluate the reproducibility of the radiation field size and its correspondence with the nominal field size. The five readings that contributed to each measurement on each day varied by up to 0.26% for the “very small” fields smaller than 15 mm, and 0.18% for the fields larger than 15 mm. The diode response factors calculated for the unshielded diode agreed with previously published results within 1.6%. The measured dimensions of the very small fields differed by up to 0.3 mm across the different measurement sessions, contributing an uncertainty of up to 1.2% to the very small field output factors. The overall uncertainties in the field output factors were 1.8% for the very small fields and 1.1% for the fields larger than 15 mm across. Recommended steps for acquiring small field output factor measurements for use in radiotherapy treatment planning system beam configuration data are provided.
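
The daisy-chaining step can be written compactly, as sketched below with purely illustrative readings: diode readings in the small fields are corrected with the Monte Carlo-derived response factors and tied to micro-chamber measurements at an intermediate field size, so that the chamber defines the output-factor scale relative to the reference field.

```python
# Sketch of daisy-chained small field output factors; readings and the
# correction factor are illustrative, not values from the study.
def daisy_chained_output_factor(diode_small, diode_intermediate,
                                chamber_intermediate, chamber_reference,
                                diode_correction=1.0):
    """Field output factor for a small field relative to the reference field.

    diode_* and chamber_* are detector readings (e.g. charge per MU); the
    correction factor accounts for the diode's small-field over-response.
    """
    return ((diode_small / diode_intermediate) * diode_correction
            * (chamber_intermediate / chamber_reference))

# Illustrative numbers: a 6 mm field daisy-chained through a 30 mm
# intermediate field to a 98 mm reference field.
of_6mm = daisy_chained_output_factor(
    diode_small=0.62, diode_intermediate=0.88,
    chamber_intermediate=0.90, chamber_reference=1.00,
    diode_correction=0.96)
print(round(of_6mm, 3))
```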