891 results for Vector Mk Landscapes
Abstract:
Calls from 14 species of bat were classified to genus and species using discriminant function analysis (DFA), support vector machines (SVM) and ensembles of neural networks (ENN). Both SVMs and ENNs outperformed DFA for every species, while ENNs (mean identification rate 97%) consistently outperformed SVMs (mean identification rate 87%). Correct classification rates produced by the ENNs varied from 91% to 100%; calls from six species were correctly identified with 100% accuracy. Calls from the five species of Myotis, a genus whose species are considered difficult to distinguish acoustically, had correct identification rates that varied from 91% to 100%. Five parameters were most important for classifying calls correctly, while seven others contributed little to classification performance.
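As a point of reference only, the following is a minimal sketch (not the authors' pipeline) of how an SVM and a small ensemble of neural networks could be compared on species-level call classification with scikit-learn. The file names and feature layout are hypothetical; the study's acoustic parameters and validation design are not reproduced here.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.ensemble import VotingClassifier
from sklearn.neural_network import MLPClassifier

# Hypothetical inputs: one row of acoustic call parameters per call, one species label per call.
X = np.loadtxt("call_parameters.csv", delimiter=",", skiprows=1)
y = np.loadtxt("species_labels.csv", dtype=str)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))

# "Ensemble of neural networks": several small MLPs with different seeds, soft-voted.
enn = make_pipeline(
    StandardScaler(),
    VotingClassifier(
        estimators=[(f"mlp{i}", MLPClassifier(hidden_layer_sizes=(20,),
                                              max_iter=2000, random_state=i))
                    for i in range(5)],
        voting="soft",
    ),
)

for name, model in [("SVM", svm), ("ENN", enn)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean correct classification rate = {scores.mean():.2%}")
```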
Abstract:
This paper proposes a highly reliable fault diagnosis approach for low-speed bearings. The proposed approach first extracts wavelet-based fault features that represent diverse symptoms of multiple low-speed bearing defects. The most useful fault features for diagnosis are then selected by utilizing a genetic algorithm (GA)-based kernel discriminative feature analysis cooperating with one-against-all multicategory support vector machines (OAA MCSVMs). Finally, each support vector machine is individually trained with its own feature vector that includes the most discriminative fault features, offering the highest classification performance. In this study, the effectiveness of the proposed GA-based kernel discriminative feature analysis and the classification ability of individually trained OAA MCSVMs are addressed in terms of average classification accuracy. In addition, the proposed GA-based kernel discriminative feature analysis is compared with four other state-of-the-art feature analysis approaches. Experimental results indicate that the proposed approach is superior to other feature analysis methodologies, yielding an average classification accuracy of 98.06% and 94.49% under rotational speeds of 50 revolutions per minute (RPM) and 80 RPM, respectively. Furthermore, the individually trained MCSVMs with their own optimal fault features based on the proposed GA-based kernel discriminative feature analysis outperform the standard OAA MCSVMs, showing an average accuracy of 98.66% and 95.01% for bearings under rotational speeds of 50 RPM and 80 RPM, respectively.
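For a concrete picture of the idea, here is a minimal sketch, under stated assumptions, of a genetic-algorithm wrapper that selects a subset of fault features for a one-against-all multiclass SVM. It is not the paper's implementation; the feature matrix X and fault labels y are assumed to be computed elsewhere (for example, from wavelet-based features), and the GA operators and parameters are generic choices.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def fitness(mask, X, y):
    """Cross-validated accuracy of a one-against-all SVM on the selected features."""
    if not mask.any():
        return 0.0
    clf = OneVsRestClassifier(SVC(kernel="rbf", gamma="scale"))
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def ga_select(X, y, pop_size=20, generations=30, p_mut=0.05):
    n_features = X.shape[1]
    pop = rng.random((pop_size, n_features)) < 0.5              # random boolean feature masks
    for _ in range(generations):
        scores = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_features)
            child = np.concatenate([a[:cut], b[cut:]])            # one-point crossover
            child ^= rng.random(n_features) < p_mut               # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    scores = np.array([fitness(ind, X, y) for ind in pop])
    return pop[scores.argmax()]

# best_mask = ga_select(X, y); a final OAA SVM is then trained on X[:, best_mask]
```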
Abstract:
Underwater wireless sensor networks (UWSNs) have become the seat of researchers' attention recently due to their proficiency to explore underwater areas and design different applications for marine discovery and oceanic surveillance. One of the main objectives of each deployed underwater network is discovering the optimized path over sensor nodes to transmit the monitored data to onshore station. The process of transmitting data consumes energy of each node, while energy is limited in UWSNs. So energy efficiency is a challenge in underwater wireless sensor network. Dual sinks vector based forwarding (DS-VBF) takes both residual energy and location information into consideration as priority factors to discover an optimized routing path to save energy in underwater networks. The modified routing protocol employs dual sinks on the water surface which improves network lifetime. According to deployment of dual sinks, packet delivery ratio and the average end to end delay are enhanced. Based on our simulation results in comparison with VBF, average end to end delay reduced more than 80%, remaining energy increased 10%, and the increment of packet reception ratio was about 70%.
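To illustrate the kind of decision DS-VBF makes, the following is a minimal sketch under assumptions of our own (a hypothetical weighting, a simple energy model, and a fixed routing-pipe radius): a candidate forwarder is scored by its residual energy and by its closeness to the routing vector toward the nearer of the two surface sinks. It is not the protocol specification.

```python
import math
from dataclasses import dataclass

@dataclass
class Node:
    x: float
    y: float
    z: float
    energy: float            # residual energy (J)

def dist(a, b):
    return math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))

def dist_to_vector(p, a, b):
    """Distance from node p to the routing vector (segment from a to b)."""
    vx, vy, vz = b.x - a.x, b.y - a.y, b.z - a.z
    wx, wy, wz = p.x - a.x, p.y - a.y, p.z - a.z
    t = max(0.0, min(1.0, (wx*vx + wy*vy + wz*vz) / (vx*vx + vy*vy + vz*vz)))
    return math.dist((p.x, p.y, p.z), (a.x + t*vx, a.y + t*vy, a.z + t*vz))

def next_hop(source, neighbours, sinks, pipe_radius, initial_energy, alpha=0.5):
    """Pick the neighbour with the best combined energy/position priority (hypothetical weighting)."""
    sink = min(sinks, key=lambda s: dist(source, s))       # forward toward the nearer surface sink
    best, best_score = None, -1.0
    for n in neighbours:
        d = dist_to_vector(n, source, sink)
        if d > pipe_radius:                                # outside the routing pipe
            continue
        score = alpha * (n.energy / initial_energy) + (1 - alpha) * (1 - d / pipe_radius)
        if score > best_score:
            best, best_score = n, score
    return best
```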
Abstract:
Historic landscapes today are changing gradually or abruptly, and the abrupt changes have caused the loss of much historic information. How to identify and protect the significant evidence of dynamic landscapes is a question that must be answered by each cultural community. This paper establishes a decipherment process, an operational guide for landscape assessment in China. The methodology integrates European methods with traditional Chinese ways of landscape appreciation, providing an effective approach for translating the cultural landscape framework into a conservation inventory. Using Slender West Lake as a case study, the decipherment process expands existing landscape investigation theory by using the factor of artistic conception to integrate intangible values into the assessment process. It also establishes a unit-based method to classify and represent historic landscapes.
Abstract:
Histories of past communities are embedded in landscapes around the world, but many are suffering from material change or neglect of their fabric. This study aimed to discover and represent the authentic intangible experience of two historic landscapes for conservation purposes. A 2500-year-old site in Yangzhou, China, and a 2000-year-old site on St Helena Island in Moreton Bay were found to be managed under two culturally different regimes of authenticity. This research contributes to challenging the notion that there is only one way to conserve authenticity in historic landscapes of the Asia Pacific.
Abstract:
The efficient computation of matrix function vector products has become an important area of research in recent times, driven in particular by two important applications: the numerical solution of fractional partial differential equations and the integration of large systems of ordinary differential equations. In this work we consider a problem that combines these two applications, in the form of a numerical solution algorithm for fractional reaction–diffusion equations that, after spatial discretisation, is advanced in time using the exponential Euler method. We focus on the efficient implementation of the algorithm on Graphics Processing Units (GPUs), as we wish to make use of the increased computational power available with this hardware. We compute the matrix function vector products using the contour integration method in [N. Hale, N. Higham, and L. Trefethen. Computing A^α, log(A), and related matrix functions by contour integrals. SIAM J. Numer. Anal., 46(5):2505–2523, 2008]. Multiple levels of preconditioning are applied to reduce the GPU memory footprint and to further accelerate convergence. We also derive an error bound for the convergence of the contour integral method that allows us to pre-determine the appropriate number of quadrature points. Results are presented that demonstrate the effectiveness of the method for large two-dimensional problems, showing a speedup of more than an order of magnitude compared to a CPU-only implementation.
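As a simplified illustration of the contour-integral idea only (not the conformally mapped, preconditioned GPU implementation described above), the sketch below approximates f(A)b by applying the trapezoidal rule to a circular contour enclosing the spectrum of A; the small test matrix and the choice of circle are hypothetical.

```python
import numpy as np

def contour_matfun_vec(f, A, b, centre, radius, n_quad=64):
    """Approximate f(A) b = (1 / (2*pi*i)) * contour integral of f(z) (zI - A)^{-1} b dz
    using the trapezoidal rule on a circle of given centre and radius."""
    n = A.shape[0]
    theta = 2 * np.pi * (np.arange(n_quad) + 0.5) / n_quad
    z = centre + radius * np.exp(1j * theta)       # quadrature nodes on the circle
    dz = 1j * radius * np.exp(1j * theta)          # dz/dtheta at each node
    result = np.zeros(n, dtype=complex)
    for zk, dzk in zip(z, dz):
        result += f(zk) * np.linalg.solve(zk * np.eye(n) - A, b) * dzk
    return (result * (2 * np.pi / n_quad) / (2j * np.pi)).real

# Example: sqrt(A) b for a small SPD matrix whose spectrum lies inside the chosen circle.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
approx = contour_matfun_vec(np.sqrt, A, b, centre=3.5, radius=3.0)
```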
Abstract:
Husserl reminded us of the imperative to return to the Lebenswelt, or life-world. He was preoccupied with the crisis of Western science, which alienated the experiencing self from the world of immediate experience. Immediate experience provides a foundation for what it means to be human. Heidegger, building upon these ideas, foresaw a threat to human nature in the face of ‘technicity’. He argued for a return to a relationship between the ‘authentic self’ and nature predicated upon the notion of ‘letting be’, in which humans are open to the mystery of being. Self and nature are not conceived as alienated entities but as aspects of a single entity. In modern times, the separation between self and the world is further evidenced by scientific rational modes of being, exemplified through consumerism and the incessant use of screen-based technology that dominate human experience. In contrast, extreme sports provide an opportunity for people to return to the life-world by living in relation to the natural world. Engagement in extreme sports enables a return to authenticity as we rediscover the self as part of nature.
Abstract:
Invasive non-native plants have negatively impacted on biodiversity and ecosystem functions world-wide. Because of the large number of species, their wide distributions and varying degrees of impact, we need a more effective method for prioritizing control strategies for cost-effective investment across heterogeneous landscapes. Here, we develop a prioritization framework that synthesizes scientific data, elicits knowledge from experts and stakeholders to identify control strategies, and appraises the cost-effectiveness of strategies. Our objective was to identify the most cost-effective strategies for reducing the total area dominated by high-impact non-native plants in the Lake Eyre Basin (LEB). We use a case study of the ~120 million ha Lake Eyre Basin, which comprises some of the most distinctive Australian landscapes, including Uluru-Kata Tjuta National Park. More than 240 non-native plant species are recorded in the Lake Eyre Basin, with many predicted to spread, but there are insufficient resources to control all species. Lake Eyre Basin experts identified 12 strategies to control, contain or eradicate non-native species over the next 50 years. The total cost of the proposed Lake Eyre Basin strategies was estimated at AU$1.7 billion, an average of AU$34 million annually. Implementation of these strategies is estimated to reduce non-native plant dominance by 17 million ha; there would be a 32% reduction in the likely area dominated by non-native plants within 50 years if these strategies were implemented. The three most cost-effective strategies were controlling Parkinsonia aculeata, Ziziphus mauritiana and Prosopis spp. These three strategies combined were estimated to cost only 0.01% of the total cost of all the strategies, but would provide 20% of the total benefits. Over 50 years, cost-effective spending of AU$2.3 million could eradicate all non-native plant species from the only threatened ecological community within the Lake Eyre Basin, the Great Artesian Basin discharge springs. Synthesis and applications. Our framework, based on a case study of the ~120 million ha Lake Eyre Basin in Australia, provides a rationale for financially efficient investment in non-native plant management and reveals combinations of strategies that are optimal for different budgets. It also highlights knowledge gaps and incidental findings that could improve effective management of non-native plants, for example addressing the reliability of species distribution data and the prevalence of information sharing across states and regions.
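The basic appraisal logic behind such a framework can be sketched in a few lines. The numbers below are purely hypothetical illustrations, not the study's estimates: each strategy is scored by its expected area of dominance avoided per dollar spent and the strategies are ranked accordingly.

```python
# Hypothetical costs (AU$) and benefits (ha of dominance avoided) for illustration only.
strategies = {
    "Parkinsonia aculeata control": (60_000, 1_200_000),
    "Prosopis spp. control": (90_000, 900_000),
    "Ziziphus mauritiana control": (50_000, 400_000),
    "Basin-wide program for all other species": (400_000_000, 3_000_000),
}

ranked = sorted(strategies.items(),
                key=lambda kv: kv[1][1] / kv[1][0],   # ha avoided per dollar spent
                reverse=True)
for name, (cost, benefit) in ranked:
    print(f"{name}: {benefit / cost:.2f} ha avoided per AU$")
```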
Abstract:
The visual characteristics of urban environments have been changing dramatically with the growth of cities around the world. The protection and enhancement of landscape character in urban environments is one of the challenges for policy makers in addressing sustainable urban growth. Visual openness and enclosure in urban environments are important attributes in the perception of visual space; they affect human interaction with physical space and can often be modified by new developments. Measuring visual openness in urban areas results in a more accurate, reliable, and systematic approach to managing and controlling visual qualities in growing cities. Recent advances in geographic information systems (GIS) and survey techniques make it feasible to measure and quantify this attribute with a high degree of realism and precision, but previous studies in this field do not take full advantage of these improvements. This paper proposes a method to measure visual openness and enclosure in a changing urban landscape in Australia, on the Gold Coast, by using improved functionality in GIS. Using this method, visual openness is calculated and described for all publicly accessible areas in the selected study area. A final map is produced which shows the areas with the highest visual openness and visibility to natural landscape resources. The output of this research can be used by planners and decision-makers in managing and controlling views in complex urban landscapes. Depending on the availability of GIS data, the method can also be applied to any region, including non-urban landscapes, to help planners and policy-makers manage views and visual qualities.
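One simple way such an openness metric could be summarised from GIS output is sketched below, assuming a boolean viewshed raster (True = visible cell) has already been generated for an observation point by a GIS viewshed tool. The function, its inputs and the radius are hypothetical and are not the paper's method.

```python
import numpy as np

def visual_openness(viewshed, cell_size_m, observer_rc, radius_m):
    """Fraction of the area within radius_m of the observer that is visible."""
    rows, cols = np.indices(viewshed.shape)
    r0, c0 = observer_rc
    dist = np.hypot(rows - r0, cols - c0) * cell_size_m
    within = dist <= radius_m
    visible_area = np.count_nonzero(viewshed & within) * cell_size_m**2
    total_area = np.count_nonzero(within) * cell_size_m**2
    return visible_area / total_area

# Example call (hypothetical raster, 5 m cells, observer at row 120 / column 80, 500 m radius):
# openness = visual_openness(viewshed, cell_size_m=5.0, observer_rc=(120, 80), radius_m=500)
```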
Abstract:
Imagined Landscapes teams geocritical analysis with digital visualization techniques to map and interrogate films, novels, and plays in which space and place figure prominently. Drawing upon A Cultural Atlas of Australia, a database-driven interactive digital map that can be used to identify patterns of representation in Australia’s cultural landscape, the book presents an integrated perspective on the translation of space across narrative forms and pioneers new ways of seeing and understanding landscape. It offers fresh insights on cultural topography and spatial history by examining the technical and conceptual challenges of georeferencing fictional and fictionalized places in narratives. Among the items discussed are Wake in Fright, a novel by Kenneth Cook, adapted iconically to the screen and recently onto the stage; the Australian North as a mythic space; spatial and temporal narrative shifts in retellings of the story of Alexander Pearce, a convict who gained notoriety for resorting to cannibalism after escaping from a remote Tasmanian penal colony; travel narratives and road movies set in Western Australia; and the challenges and spatial politics of mapping spaces for which there are no coordinates.
Abstract:
Background: In 2011, a variant of West Nile virus Kunjin strain (WNVKUN) caused an unprecedented epidemic of neurological disease in horses in southeast Australia, resulting in almost 1,000 cases and a 9% fatality rate. We investigated whether increased fitness of the virus in the primary vector, Culex annulirostris, and another potential vector, Culex australicus, contributed to the widespread nature of the outbreak. Methods: Mosquitoes were exposed to infectious blood meals containing either the virus strain responsible for the outbreak, designated WNVKUN2011, or WNVKUN2009, a strain of low virulence that is typical of historical strains of this virus. WNVKUN infection in mosquito samples was detected using a fixed cell culture enzyme immunoassay and a WNVKUN-specific monoclonal antibody. Probit analysis was used to determine mosquito susceptibility to infection. Infection, dissemination and transmission rates for selected days post-exposure were compared using Fisher’s exact test. Virus titers in bodies and saliva expectorates were compared using t-tests. Results: There were few significant differences between the two virus strains in the susceptibility of Cx. annulirostris to infection, the kinetics of virus replication and the ability of this mosquito species to transmit either strain. Both strains were transmitted by Cx. annulirostris for the first time on day 5 post-exposure. The highest transmission rates (proportion of mosquitoes with virus detected in saliva) observed were 68% for WNVKUN2011 on day 12 and 72% for WNVKUN2009 on day 14. On days 12 and 14 post-exposure, significantly more WNVKUN2011 than WNVKUN2009 was expectorated by infected mosquitoes. Infection, dissemination and transmission rates of the two strains were not significantly different in Culex australicus. However, transmission rates and the amount of virus expectorated were significantly lower in Cx. australicus than Cx. annulirostris. Conclusions: The higher amount of WNVKUN2011 expectorated by infected mosquitoes may be an indication that this virus strain is transmitted more efficiently by Cx. annulirostris compared to other WNVKUN strains. Combined with other factors, such as a convergence of abundant mosquito and wading bird populations, and mammalian and avian feeding behaviour by Cx. annulirostris, this may have contributed to the scale of the 2011 equine epidemic.
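For readers unfamiliar with the statistical comparisons named above, the sketch below shows how transmission rates could be compared with Fisher's exact test and expectorated titres with a t-test. The counts and titre values are purely illustrative, not the study's raw data.

```python
from scipy import stats

# Mosquitoes with / without virus detected in saliva on a given day (hypothetical counts).
wnv2011 = [15, 10]
wnv2009 = [9, 16]
odds_ratio, p = stats.fisher_exact([wnv2011, wnv2009])
print(f"Fisher's exact test on transmission rates: p = {p:.3f}")

# log10 titres of expectorated virus (hypothetical values).
titres_2011 = [3.1, 2.8, 3.4, 3.0, 2.9]
titres_2009 = [2.2, 2.5, 2.0, 2.4, 2.3]
t, p = stats.ttest_ind(titres_2011, titres_2009)
print(f"t-test on expectorated titres: p = {p:.3f}")
```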
Abstract:
Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern and it has been criticized for its local optimization. More accurate current software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. It is therefore imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
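The second stage relies on the Kimura two-parameter (K2P) distance, which is a standard formula and can be sketched as follows. This is not VIP Barcoding's code: sequences are assumed to be pre-aligned and of equal length, and the alignment-free composition-vector screening of the first stage is not reproduced here.

```python
import math

PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned nucleotide sequences."""
    pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
             if a in "ACGT" and b in "ACGT"]
    n = len(pairs)
    transitions = sum(1 for a, b in pairs if a != b and
                      ({a, b} <= PURINES or {a, b} <= PYRIMIDINES))
    transversions = sum(1 for a, b in pairs if a != b) - transitions
    p, q = transitions / n, transversions / n
    return -0.5 * math.log((1 - 2 * p - q) * math.sqrt(1 - 2 * q))

def nearest_neighbour(query, candidates):
    """candidates: dict mapping species name -> aligned reference sequence."""
    return min(candidates, key=lambda sp: k2p_distance(query, candidates[sp]))
```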
Abstract:
Background: Although lentiviral vectors have been widely used for in vitro and in vivo gene therapy research, few studies have systematically examined the various conditions that may affect the determination of the number of viable vector particles in a vector preparation and the use of multiplicity of infection (MOI) as a parameter for predicting gene transfer events. Methods: Lentiviral vectors encoding a marker gene were packaged and the supernatants concentrated. The number of viable vector particles was determined by in vitro transduction followed by fluorescence microscopy and FACS analyses. Various factors that may affect the transduction process were studied, such as vector inoculum volume, target cell number and type, vector decay, and variable vector-target cell contact and adsorption periods. MOIs between 0 and 32 were assessed on commonly used cell lines as well as a new cell line. Results: We demonstrated that the resulting values of lentiviral vector titre varied with changes of conditions in the transduction process, including the inoculum volume of the vector, the type and number of target cells, vector stability and the length of the period of vector adsorption to target cells. Vector inoculum and the number of target cells determine the frequency of gene transfer events, although not proportionally. Vector exposure time to target cells also influenced transduction results. Varying these parameters resulted in greater than 50-fold differences in the vector titre from the same vector stock. Commonly used cell lines in vector titration were less sensitive to lentiviral vector-mediated gene transfer than a new cell line, FRL 19. Within the range of MOI 0-32 used to transduce four different cell lines, the higher the MOI applied, the higher the efficiency of gene transfer obtained. Conclusion: Several variables in the transduction process affected in vitro vector titration and resulted in vastly different values from the same vector stock, thus complicating the use of MOI for predicting gene transfer events. Commonly used target cell lines underestimated vector titre. However, within a certain range of MOI, if strictly controlled conditions are observed in the vector titration process, including the use of a sensitive cell line such as FRL 19, lentivector-mediated gene transfer events can be predicted. © 2004 Zhang et al; licensee BioMed Central Ltd.
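As background to the quantities discussed above, the sketch below shows the standard calculations for a functional titre estimated from the fraction of marker-positive cells after transduction, and for the MOI of a planned experiment. It is not the paper's protocol, and the input values are hypothetical.

```python
def functional_titre(cells_at_transduction, fraction_positive, vector_volume_ml,
                     dilution_factor=1.0):
    """Transducing units per mL; reasonable only when fraction_positive is low enough
    (roughly < 0.3) that multiple integrations per cell are rare."""
    return cells_at_transduction * fraction_positive * dilution_factor / vector_volume_ml

def moi(titre_tu_per_ml, vector_volume_ml, target_cell_number):
    """Multiplicity of infection: transducing units applied per target cell."""
    return titre_tu_per_ml * vector_volume_ml / target_cell_number

# Hypothetical example: 1e5 cells, 12% marker-positive, 10 uL of a 1:100 dilution.
titre = functional_titre(1e5, 0.12, 0.01, dilution_factor=100)
print(f"titre = {titre:.2e} TU/mL, MOI = {moi(titre, 1e-4, 2e5):.1f}")
```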
Abstract:
Polymer nanocomposites offer the potential to create a new type of hybrid material with unique thermal, optical, or electrical properties. Understanding their structure, phase behavior, and dynamics is crucial for realizing this potential. In this work we provide experimental insight into the dynamics of such composites as a function of temperature, wave vector, and nanoparticle volume fraction, using multispeckle synchrotron x-ray photon correlation spectroscopy measurements on gold nanoparticles embedded in polymethylmethacrylate. Detailed analysis of the intermediate scattering functions reveals the possible existence of an intrinsic length scale for dynamic heterogeneity in polymer nanocomposites, similar to that seen in other soft materials such as colloidal gels and glasses.
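As a generic illustration of how intermediate scattering functions from such a measurement are commonly analysed (not the authors' analysis), the sketch below fits the intensity autocorrelation g2(q, t) with a stretched-exponential (KWW) form, f(q, t) = exp(-(t / tau)^beta), to extract a relaxation time tau and exponent beta at one wave vector. The data here are synthetic and hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def g2_model(t, contrast, tau, beta, baseline=1.0):
    """Siegert relation with a KWW intermediate scattering function: g2 = baseline + contrast * |f|^2."""
    return baseline + contrast * np.exp(-2.0 * (t / tau) ** beta)

# Synthetic, hypothetical correlation data at one wave vector.
t = np.logspace(-1, 3, 60)   # delay times (s)
g2 = g2_model(t, 0.2, 50.0, 0.8) + 0.005 * np.random.default_rng(1).normal(size=t.size)

popt, _ = curve_fit(g2_model, t, g2, p0=(0.1, 10.0, 1.0))
contrast, tau, beta = popt
print(f"tau = {tau:.1f} s, beta = {beta:.2f}")
```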