973 results for match
Abstract:
Many insect clades, especially within the Diptera (true flies), have been considered classically ‘Gondwanan’, with an inference that distributions derive from vicariance of the southern continents. Assessing the role that vicariance has played in the evolution of austral taxa requires testing the location and tempo of diversification and speciation against the well-established predictions of fragmentation of the ancient super-continent. Several early (anecdotal) hypotheses that current austral distributions originate from the breakup of Gondwana derive from studies of taxa within the family Chironomidae (non-biting midges). With the advent of molecular phylogenetics and biogeographic analytical software, these studies have been revisited and expanded to test such conclusions better. Here we studied the midge genus Stictocladius Edwards, from the subfamily Orthocladiinae, which contains austral-distributed clades that match vicariance-based expectations. We resolve several issues of systematic relationships among morphological species and reveal cryptic diversity within many taxa. Time-calibrated phylogenetic relationships among taxa accorded partially with the predicted tempo from geology. For these apparently vagile insects, vicariance-dated patterns persist for South America and Australia. However, as often found, divergence time estimates for New Zealand at c. 50 mya post-date separation of Zealandia from Antarctica and the remainder of Gondwana, but predate the proposed Oligocene ‘drowning’ of these islands. We detail other such ‘anomalous’ dates and suggest a single common explanation rather than stochastic processes. This could involve synchronous establishment following recovery from ‘drowning’ and/or deleterious warming associated with the mid-Eocene climatic optimum (hence ‘waving’, which refers to cycles of drowning events), plus the new availability of topography providing cool running waters, or all these factors in combination.
Alternatively, a vicariance explanation remains available, given the uncertain duration of connectivity of Zealandia to Australia–Antarctica–South America via the Lord Howe and Norfolk ridges into the Eocene.
Abstract:
As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but also to become dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services within their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, and one other, The Courier Mail (recognising the diversification of types of product in this field, by including it as a representative of Newscorp, now a major participant).
The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000 as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally it compares one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications, over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were devised through a preliminary examination of the products over three days in the week before. That process identified significant elements of media production, such as variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of top news stories of the day; and the presence of standard news values, once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables.
This device, to study the two sets of publications on like standards (essentially production values and news values), has enabled the comparisons to be made. This comparing of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth, and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product, for each day. When the sets of comparisons as outlined above are noted, the process then becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, does provide evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change, and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore, well themed, and obviously sustained by public commitment and habituation to diversified online and mobile media services.
The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes. It should also be useful as a reference for further inquiries of the same kind that might be made, and for monitoring of the situation in regard to newspaper portals online, into the future.
Abstract:
Objectives Directly measuring disease incidence in a population is difficult and not feasible to do routinely. We describe the development and application of a new method of estimating, at a population level, the number of incident genital chlamydia infections, and the corresponding incidence rates, by age and sex, using routine surveillance data. Methods A Bayesian statistical approach was developed to calibrate the parameters of a decision-pathway tree against national data on numbers of notifications and tests conducted (2001-2013). Independent beta probability density functions were adopted as priors on the time-independent parameters; the shape parameters of these beta distributions were chosen to match prior estimates sourced from peer-reviewed literature or expert opinion. To best facilitate the calibration, multivariate Gaussian priors on (the logistic transforms of) the time-dependent parameters were adopted, using the Matérn covariance function to favour smooth changes over consecutive years and across adjacent age cohorts. The model outcomes were validated by comparing them with other independent empirical epidemiological measures, i.e. prevalence and incidence as reported by other studies. Results Model-based estimates suggest that the total number of people acquiring chlamydia per year in Australia has increased by ~120% over 12 years. Nationally, an estimated 356,000 people acquired chlamydia in 2013, which is 4.3 times the number of reported diagnoses. This corresponded to an annual chlamydia incidence estimate of 1.54% in 2013, up from 0.81% in 2001 (a ~90% increase). Conclusions We developed a statistical method which uses routine surveillance (notifications and testing) data to produce estimates of the extent of, and trends in, chlamydia incidence.
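The abstract does not give the exact prior construction, but the idea of a multivariate Gaussian prior whose Matérn covariance favours similar parameter values in consecutive years and adjacent age cohorts can be sketched as follows. The ν=3/2 kernel, the separable year×age product, the length scales and the example grid are all illustrative assumptions, not taken from the paper:

```python
import numpy as np

def matern32(d, length_scale=1.0):
    """Matérn nu=3/2 correlation as a function of distance d."""
    a = np.sqrt(3.0) * d / length_scale
    return (1.0 + a) * np.exp(-a)

def smooth_prior_cov(years, ages, ls_year=2.0, ls_age=5.0, sigma=1.0):
    """Covariance for logit-transformed parameters indexed by (year, age),
    favouring similar values in consecutive years and adjacent age cohorts."""
    grid = np.array([(y, a) for y in years for a in ages], dtype=float)
    dy = np.abs(grid[:, None, 0] - grid[None, :, 0])  # year separations
    da = np.abs(grid[:, None, 1] - grid[None, :, 1])  # age separations
    # separable product of Matérn correlations in each dimension
    return sigma**2 * matern32(dy, ls_year) * matern32(da, ls_age)

# 13 years (2001-2013) x 4 illustrative age-cohort midpoints
K = smooth_prior_cov(range(2001, 2014), [17, 22, 27, 32])
```

A multivariate Gaussian with covariance `K` then places high prior probability on parameter trajectories that vary gradually across both time and age.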
Abstract:
Structural damage detection using measured dynamic data for pattern recognition is a promising approach. These pattern recognition techniques utilize artificial neural networks and genetic algorithms to match pattern features. In this study, an artificial neural network–based damage detection method using frequency response functions is presented, which can effectively detect nonlinear damage for a given level of excitation. The main objective of this article is to present a feasible method for structural vibration–based health monitoring, which reduces the dimension of the initial frequency response function data, transforms it into new damage indices, and employs an artificial neural network for detecting different levels of nonlinearity using damage patterns recognized by the proposed algorithm. Experimental data from the three-story bookshelf structure at Los Alamos National Laboratory are used to validate the proposed method. Results showed that the levels of nonlinear damage can be identified precisely by the developed artificial neural networks. Moreover, artificial neural networks trained with summation frequency response functions give more precise damage detection results than artificial neural networks trained with individual frequency response functions. The proposed method is therefore a promising tool for structural assessment of real structures, because it shows reliable results with experimental data for nonlinear damage detection, which renders the frequency response function–based method convenient for structural health monitoring.
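As a rough illustration of the kind of dimensionality reduction described, individual FRFs can be summed across measurement points and compared against a baseline spectrum to form a per-frequency feature vector. The relative-change index below is a generic sketch of this idea, not the paper's exact formulation:

```python
import numpy as np

def summation_frf(frfs):
    """Sum individual FRF magnitudes across sensors into one spectrum.
    frfs: complex array of shape (n_sensors, n_freq_lines)."""
    return np.abs(frfs).sum(axis=0)

def damage_index(frf_damaged, frf_baseline):
    """Per-frequency relative change of the summation FRF: a simple
    reduced-dimension feature that could feed an ANN classifier."""
    s_d = summation_frf(frf_damaged)
    s_b = summation_frf(frf_baseline)
    return (s_d - s_b) / np.maximum(s_b, 1e-12)
```

Each excitation level would yield one such index vector, giving the neural network a fixed-length input regardless of how many sensors contributed.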
Abstract:
We present a rigorous validation of the analytical Amadei solution for the stress concentration around arbitrarily oriented boreholes in general anisotropic elastic media. First, we revisit the theoretical framework of the Amadei solution and present analytical insights showing that the solution does indeed contain all special cases of symmetry, contrary to previous understanding, provided that the reduced strain coefficients β11 and β55 are not equal. It is shown from theoretical considerations and published experimental data that β11 and β55 are not equal for realistic rocks. Second, we develop a 3D finite-element elastic model within a hybrid analytical-numerical workflow that circumvents the need to rebuild and remesh the model for every borehole and material orientation. Third, we show that the borehole stresses computed from the numerical model and the analytical solution match almost perfectly for different borehole orientations (vertical, deviated and horizontal) and for several cases involving isotropic and transversely isotropic symmetries. It is concluded that the analytical Amadei solution is valid with no restrictions on the borehole orientation or elastic anisotropy symmetry.
Abstract:
Functional MRI studies commonly refer to activation patterns as being localized in specific Brodmann areas, referring to Brodmann’s divisions of the human cortex based on cytoarchitectonic boundaries [3]. Typically, the Brodmann areas that match regions in group-averaged functional maps are estimated by eye, leading to inaccurate parcellations and significant error. To avoid this limitation, we developed a method using high-dimensional nonlinear registration to project the Brodmann areas onto individual 3D co-registered structural and functional MRI datasets, using an elastic deformation vector field in the cortical parameter space. Based on a sulcal pattern matching approach [11], an N=27 scan single-subject atlas (the Colin Holmes atlas [15]), with associated Brodmann areas labeled on its surface, was deformed to match 3D cortical surface models generated from individual subjects’ structural MRIs (sMRIs). The deformed Brodmann areas were used to quantify and localize functional MRI (fMRI) BOLD activation during the performance of the Tower of London task [7].
Abstract:
The development of a protein-mediated, dual-function affinity adsorption system for plasmid DNA is described in this work. The affinity ligand for the plasmid DNA comprises a fusion protein with glutathione-S-transferase (GST) as the fusion partner with a zinc finger protein. The protein ligand is first bound to the adsorbent by affinity interaction between the GST moiety and glutathione that is covalently immobilized to the base matrix. The plasmid binding is then enabled via the zinc finger protein and a specific nucleotide sequence inserted into the DNA. At lower loadings, the binding of the DNA onto the Fractogel, Sepharose, and Streamline matrices was 0.0078 ± 0.0013, 0.0095 ± 0.0016, and 0.0080 ± 0.0006 mg, respectively, to 50 μL of adsorbent. At a higher DNA challenge, the corresponding amounts were 0.0179 ± 0.0043, 0.0219 ± 0.0035, and 0.0190 ± 0.0041 mg, respectively. The relatively constant amounts bound to the three adsorbents indicated that the large DNA molecule was unable to utilize the available zinc finger sites located in the internal pores, and that binding was largely a surface adsorption phenomenon. Utilization of the zinc finger binding sites was shown to be highest for the Fractogel adsorbent. The adsorbed material was eluted with reduced glutathione, and the elution efficiency for the DNA was between 23% and 27%. The protein elution profile appeared to match the adsorption profiles, with significantly higher recoveries of bound GST-zinc finger protein.
Abstract:
Due to their unobtrusive nature, vision-based approaches to tracking sports players have been preferred over wearable sensors, as they do not require the players to be instrumented for each match. Unfortunately, due to the heavy occlusion between players, variation in resolution and pose, and fluctuating illumination conditions, tracking players continuously is still an unsolved vision problem. For tasks like clustering and retrieval, having noisy data (i.e. missing and false player detections) is problematic, as it generates discontinuities in the input data stream. One method of circumventing this issue is to use an occupancy map, where the field is discretised into a series of zones and a count of player detections in each zone is obtained. A series of frames can then be concatenated to represent a set-play or an example of team behaviour. A problem with this approach, though, is that the compressibility is low (i.e. the variability in the feature space is incredibly high). In this paper, we propose the use of a bilinear spatiotemporal basis model with a role representation, which operates in a low-dimensional space, to clean up the noisy detections. To evaluate our approach, we used a fully instrumented field-hockey pitch with 8 fixed high-definition (HD) cameras, evaluated our approach on approximately 200,000 frames of data from a state-of-the-art real-time player detector, and compared it to manually labeled data.
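The occupancy-map representation mentioned above can be sketched in a few lines: discretise the pitch into zones and count detections per zone for each frame. The 10×6 grid and the standard 91.4 m × 55.0 m hockey-pitch dimensions are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def occupancy_map(detections, field_size=(91.4, 55.0), grid=(10, 6)):
    """Discretise the pitch into grid zones and count player detections
    per zone for one frame.  detections: iterable of (x, y) positions in
    metres, with the origin at one corner of the pitch."""
    nx, ny = grid
    counts = np.zeros((nx, ny), dtype=int)
    for x, y in detections:
        # clamp to the last zone so boundary detections are not dropped
        i = min(int(x / field_size[0] * nx), nx - 1)
        j = min(int(y / field_size[1] * ny), ny - 1)
        counts[i, j] += 1
    return counts
```

Concatenating the flattened maps over a window of frames then gives a fixed-length descriptor of a set-play, robust to identity switches but sensitive to missed and false detections, which is the noise the basis model is meant to clean up.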
Abstract:
There have been substantial advances in small field dosimetry techniques and technologies over the last decade, which have dramatically improved the achievable accuracy of small field dose measurements. This educational note aims to help radiation oncology medical physicists to apply some of these advances in clinical practice. The evaluation of a set of small field output factors (total scatter factors) is used to exemplify a detailed measurement and simulation procedure, and as a basis for discussing the possible effects of simplifying that procedure. Field output factors were measured with an unshielded diode and a micro-ionisation chamber, at the centre of a set of square fields defined by a micro-multileaf collimator. Nominal field sizes investigated ranged from 6×6 to 98×98 mm². Diode measurements in fields smaller than 30 mm across were corrected using response factors calculated using Monte Carlo simulations of the full diode geometry, and daisy-chained to match micro-chamber measurements at intermediate field sizes. Diode measurements in fields smaller than 15 mm across were repeated twelve times over three separate measurement sessions, to evaluate the reproducibility of the radiation field size and its correspondence with the nominal field size. The five readings that contributed to each measurement on each day varied by up to 0.26% for the “very small” fields smaller than 15 mm, and 0.18% for the fields larger than 15 mm. The diode response factors calculated for the unshielded diode agreed with previously published results, within 1.6%. The measured dimensions of the very small fields differed by up to 0.3 mm across the different measurement sessions, contributing an uncertainty of up to 1.2% to the very small field output factors. The overall uncertainties in the field output factors were 1.8% for the very small fields and 1.1% for the fields larger than 15 mm across.
Recommended steps for acquiring small field output factor measurements for use in radiotherapy treatment planning system beam configuration data are provided.
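Daisy-chaining, as used above, ties the diode's small-field readings to the micro-chamber's output factor at an intermediate field size where both detectors are reliable. A minimal sketch of the arithmetic follows; the function and argument names are ours, and `k_corr` stands in for the Monte Carlo-calculated diode response correction:

```python
def daisy_chained_output_factor(m_diode_small, m_diode_inter,
                                of_chamber_inter, k_corr=1.0):
    """Output factor for a small field, daisy-chained through an
    intermediate field size.

    m_diode_small:    mean diode reading in the small field
    m_diode_inter:    mean diode reading in the intermediate field
    of_chamber_inter: chamber-measured output factor at the
                      intermediate field size
    k_corr:           Monte Carlo diode response correction for the
                      small field (1.0 = no correction)
    """
    return (m_diode_small / m_diode_inter) * of_chamber_inter * k_corr
```

The diode-to-diode ratio cancels the diode's absolute calibration, while the chamber anchors the chain to a field size free of small-field perturbation effects.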
Abstract:
Alignment-free methods, in which shared properties of sub-sequences (e.g. identity or match length) are extracted and used to compute a distance matrix, have recently been explored for phylogenetic inference. However, the scalability and robustness of these methods to key evolutionary processes remain to be investigated. Here, using simulated sequence sets of various sizes, in both nucleotides and amino acids, we systematically assess the accuracy of phylogenetic inference using an alignment-free approach based on D2 statistics, under different evolutionary scenarios. We find that, compared to a multiple sequence alignment approach, D2 methods are more robust against among-site rate heterogeneity, compositional biases, genetic rearrangements and insertions/deletions, but are more sensitive to recent sequence divergence and sequence truncation. Across diverse empirical datasets, the alignment-free methods perform well for sequences sharing low divergence, at greater computational speed. Our findings provide strong evidence for the scalability and the potential use of alignment-free methods in large-scale phylogenomics.
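The raw D2 statistic underlying these methods is simply the inner product of k-mer count vectors; a minimal sketch is below. The choice of k = 3 and the cosine-style normalisation are illustrative; published D2 variants such as D2S and D2* normalise differently:

```python
import math
from collections import Counter

def kmer_counts(seq, k=3):
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=3):
    """Raw D2 statistic: inner product of k-mer count vectors."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca if w in cb)

def d2_distance(seq_a, seq_b, k=3):
    """Cosine-style dissimilarity in [0, 1] derived from D2: 0 for
    identical k-mer profiles, 1 for profiles sharing no k-mers."""
    num = d2(seq_a, seq_b, k)
    den = math.sqrt(d2(seq_a, seq_a, k) * d2(seq_b, seq_b, k))
    return 1.0 - num / den if den else 1.0
```

Computing this pairwise over a sequence set yields the distance matrix from which a tree can be built, with no alignment step.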
Abstract:
The primary purpose of this paper is to overview a selection of advanced water treatment technology systems that are suited for application in towns and settlements in remote and very remote regions of Australia, and in vulnerable and lagging rural regions in Sri Lanka. This recognises that sanitation and water treatment are inextricably linked, and that both are needed to reduce risks to the environment and population health from contaminated water sources. For both Australia and Sri Lanka, only a small fraction of the settlements in rural and remote regions are connected to water treatment facilities and town water supplies. In Australia’s remote/very remote regions, raw water is drawn from underground sources and rainwater capture. Most settlements in rural Sri Lanka rely on rivers, reservoirs, wells, springs or carted water. Furthermore, Sri Lanka has more than 25,000 hand-pumped tube wells, which saved the communities during recent droughts. Decentralised water supply systems offer the opportunity to provide safe drinking water to these remote/very remote and rural regions, where centralised systems are not feasible for socio-cultural, economic, political and technological reasons. These systems reduce health risks from contaminated water supplies. In remote areas, centralised systems fail due to low population density and low affordability. Globally, a new generation of advanced water treatment technologies is positioned to make a major impact on the provision of safe potable water in remote/very remote regions in Australia and rural regions in Sri Lanka. Some of these systems were developed for higher-income countries. However, with careful selection and further research they can be tailored to match local socio-economic conditions and technical capacity. As such, they can equally be used to provide decentralised water supply to communities in developed and developing countries such as Australia and Sri Lanka.
Abstract:
There is a concern that high densities of elephants in southern Africa could lead to the overall reduction of other forms of biodiversity. We present a grid-based model of elephant-savanna dynamics, which differs from previous elephant-vegetation models by accounting for woody plant demographics, tree-grass interactions, stochastic environmental variables (fire and rainfall), and spatial contagion of fire and tree recruitment. The model projects changes in height structure and spatial pattern of trees over periods of centuries. The vegetation component of the model produces long-term tree-grass coexistence, and the emergent fire frequencies match those reported for southern African savannas. Including elephants in the savanna model had the expected effect of reducing woody plant cover, mainly via increased adult tree mortality, although at an elephant density of 1.0 elephant/km², woody plants still persisted for over a century. We tested three different scenarios in addition to our default assumptions. (1) Reducing mortality of adult trees after elephant use, mimicking a more browsing-tolerant tree species, mitigated the detrimental effect of elephants on the woody population. (2) Coupling germination success (increased seedling recruitment) to elephant browsing further increased tree persistence, and (3) a faster-growing woody component allowed some woody plant persistence for at least a century at a density of 3 elephants/km². Quantitative models of the kind presented here provide a valuable tool for exploring the consequences of management decisions involving the manipulation of elephant population densities. © 2005 by the Ecological Society of America.
Abstract:
PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). This approach was motivated by a need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought so that simulations can be set up with ease, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller to create ABMs. RESULTS: A description of the dynamic agent composition is given in this paper, as well as details about its implementation within MODAM (MODular Agent-based Model), a software framework which is applied to the planning of the electricity distribution network. Illustrations of the implementation of the dynamic agent composition are consequently given for that domain throughout the paper. It is however expected that this approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built, for the users, the developers, as well as for agent-based modelling as a scientific approach. Developers can extend the model without the need to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix-and-match already implemented components to form large-scale ABMs, allowing them to quickly set up simulations and easily compare scenarios without the need to program.
The dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation; verification and validation of models is also facilitated by the ability to quickly set up alternative simulations.
Abstract:
Viewer interest, evoked by video content, can potentially identify the highlights of the video. This paper explores the use of the facial expressions (FE) and heart rate (HR) of viewers, captured using a camera and a non-strapped sensor, for identifying interesting video segments. The data from ten subjects watching three videos showed that these signals are viewer-dependent and not synchronized with the video contents. To address this issue, new algorithms are proposed to effectively combine FE and HR signals to identify the times when viewer interest is potentially high. The results show that, compared with subjective annotation and match report highlights, ‘non-neutral’ FE and ‘relatively higher and faster’ HR are able to capture 60%-80% of goal, foul, and shot-on-goal soccer video events. FE is found to be more indicative of viewers’ interest than HR, but the fusion of these two modalities outperforms each of them.
Abstract:
This study set out to determine which values are represented in the National Framework for Values Education in Australian Schools, by matching Schwartz's ten values constructs to the Nine Values for Australian Schooling and examining the values orientations of contemporary young people, specifically Grade 8 girls from one State High School in South East Queensland. This was achieved by using the Schwartz Portrait Values Questionnaire (PVQ) as well as thematic analysis. Key findings were that there was a match between the Grade 8 girls' values and some of the Nine Values, but not others; that not all of Schwartz's values are represented in the Nine Values for Australian Schooling; and that certain values could be said to be omitted from the Framework while certain others are privileged.