87 results for Large-scale Distribution

in Deakin Research Online - Australia


Relevance: 100.00%

Abstract:

In this study, we have demonstrated that a rotating metal wire coil can be used as a nozzle to electrospin nanofibers on a large scale. Without using any needles, the rotating wire coil, partially immersed in a polymer solution reservoir, can pick up a thin layer of charged polymer solution and generate a large number of nanofibers from the wire surface simultaneously. This arrangement significantly increases nanofiber productivity. The fiber productivity was found to be determined by the coil dimensions, applied voltage, and polymer concentration. The dependency of fiber diameter on polymer concentration showed a trend similar to that of a conventional electrospinning system using a syringe needle nozzle, but the coil-electrospun fibers were thinner, with a narrower diameter distribution. The profiles of electric field strength in coil electrospinning were calculated and showed a concentrated electric field intensity on the wire surface.
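
As a rough, back-of-the-envelope illustration of why the field concentrates at the wire surface (this is not the paper's actual field calculation, and the charge density and wire radius are assumed values), the idealised line-charge formula E(r) = λ/(2πε₀r) already shows the sharp rise in field strength near the wire:

```python
# Idealised line-charge field, E(r) = lambda / (2 * pi * eps0 * r); the charge
# density and wire radius below are illustrative assumptions only.
import math

EPS0 = 8.854e-12          # vacuum permittivity, F/m
LINE_CHARGE = 1e-7        # assumed linear charge density, C/m
WIRE_RADIUS = 0.25e-3     # assumed wire radius, m

def field_strength(r):
    """Electric field (V/m) of an infinite line charge at radial distance r (m)."""
    return LINE_CHARGE / (2 * math.pi * EPS0 * r)

for r in (WIRE_RADIUS, 1e-3, 5e-3, 2e-2):
    print(f"r = {r * 1e3:5.2f} mm  ->  E = {field_strength(r):.3e} V/m")
```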

Relevance: 100.00%

Abstract:

Determining the biological and environmental factors that limit the distribution and abundance of organisms is central to our understanding of the niche concept and crucial for predicting how species may respond to large-scale environmental change, such as global warming. However, detailed ecological information for the majority of species has been collected only at a local scale, and insufficient consideration has been given to geographical variation in intraspecific niche requirements. To evaluate the influence of environmental and biological factors on patterns of species distribution and abundance, we conducted a detailed, broadscale study across the tropical savannas of northern Australia on the ecology of three large, sympatric marsupial herbivores (family Macropodidae): the antilopine wallaroo (Macropus antilopinus), common wallaroo (M. robustus), and eastern grey kangaroo (M. giganteus). Using information on species abundance, climate, fire history, habitat, and resource availability, we constructed species' habitat models at scales varying from the complete distribution down to smaller regional areas. Multiple factors affected macropod abundance, and the importance of these factors depended on the spatial scale of analysis. Fire regimes, water availability, geology, soil type, and climate were most important at the large scale, whereas aspects of habitat structure and the abundance of sympatric species were important at smaller scales. The distribution and abundance of eastern grey kangaroos and common wallaroos were strongly influenced by climate. Our results suggest that interspecific competition between antilopine wallaroos and eastern grey kangaroos may occur. The antilopine wallaroo and eastern grey kangaroo (grazers) preferred more nutrient-rich soils than the common wallaroo (grazer/browser), which we relate to differences in feeding modes. The abundance of antilopine wallaroos was higher on sites that were burned, whereas the abundance of common wallaroos was higher on unburned sites. Future climate change predicted for Australia has the capacity to seriously affect the abundance and conservation of macropod species in tropical savannas. The results of our models suggest that, in particular, the effects of changing climatic conditions on fire regimes, habitat structure, and water availability may lead to species declines and marked changes in macropod communities.


Relevance: 100.00%

Abstract:

We propose a novel framework for large-scale scene understanding in static camera surveillance. Our techniques combine fast rank-1 constrained robust PCA to compute the foreground with non-parametric Bayesian models for inference. Clusters are extracted in foreground patterns using a joint multinomial+Gaussian Dirichlet process model (DPM). Since the multinomial distribution is normalized, the Gaussian mixture distinguishes between patterns that are spatially similar but have different activity levels (e.g., car vs. bike). We propose a modification of the decayed MCMC technique for incremental inference, providing the ability to discover a theoretically unlimited number of patterns in unbounded video streams. A promising by-product of our framework is online abnormal activity detection. A benchmark video and two surveillance videos, the longest of which is 140 hours, are used in our experiments. The patterns discovered are as informative as those of existing scene understanding algorithms. However, unlike existing work, we achieve near real-time execution and encouraging performance in abnormal activity detection.
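
For readers unfamiliar with the foreground-extraction step, the sketch below shows a generic rank-1 robust-PCA-style decomposition in Python/NumPy: the background is the best rank-1 fit, and the soft-thresholded residual is treated as foreground. This is not the paper's fast algorithm (and the Dirichlet process clustering stage is omitted); the function name, toy data, and threshold `lam` are assumptions made for illustration.

```python
# Generic rank-1 robust-PCA sketch: alternate a rank-1 background fit with
# soft-thresholding of the residual to obtain a sparse foreground.
import numpy as np

def rank1_robust_pca(frames, lam=0.05, n_iter=30):
    """frames: (n_frames, n_pixels) matrix of vectorised grayscale frames."""
    X = frames.astype(float)
    S = np.zeros_like(X)                      # sparse foreground component
    for _ in range(n_iter):
        # best rank-1 fit of the background to the residual X - S
        U, s, Vt = np.linalg.svd(X - S, full_matrices=False)
        L = s[0] * np.outer(U[:, 0], Vt[0])
        # soft-threshold the remaining residual to get a sparse foreground
        R = X - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)
    return L, S

# toy usage: 20 "frames" of 100 pixels with a bright blob moving to the right
rng = np.random.default_rng(0)
background = rng.uniform(0.2, 0.4, size=100)
frames = np.tile(background, (20, 1)) + 0.01 * rng.standard_normal((20, 100))
for t in range(20):
    frames[t, 5 * t : 5 * t + 5] += 1.0       # the moving foreground object
L, S = rank1_robust_pca(frames)
print("foreground pixels per frame:", (np.abs(S) > 0.5).sum(axis=1))
```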

Relevance: 100.00%

Abstract:

At-sea distributions of large scyphozoan jellyfish across the Irish Sea were studied using visual surface counts from ships of opportunity. Thirty-seven surveys were conducted along two >100 km long transects between Ireland and the UK from April to September in 2009 and 2010. Five species were recorded, but only Aurelia aurita and Cyanea capillata were frequently observed. The first formal description of the seasonal changes in the abundances and distributions of these two species in the study area is provided. The highest densities of these species were more likely to be found ~30 km offshore, but large aggregations were present in both coastal and offshore waters. Evidence for aggregations of medusae along physical discontinuities was provided by coupling jellyfish observations with simultaneous records of environmental parameters. The value of surveys from ships of opportunity as cost-effective, semi-quantitative tools for developing local knowledge on jellyfish abundance, distribution, and phenology is discussed.

Relevance: 100.00%

Abstract:

Jellyfish (medusae) are sometimes the most noticeable and abundant members of coastal planktonic communities, yet ironically, this high conspicuousness is not reflected in our overall understanding of their spatial distributions across large expanses of water. Here, we set out to elucidate the spatial (and temporal) patterns for five jellyfish species (Phylum Cnidaria, Orders Rhizostomeae and Semaeostomeae) across the Irish & Celtic Seas, an extensive shelf-sea area at Europe's northwesterly margin encompassing several thousand square kilometers. Data were gathered using two independent methods: (1) surface counts of jellyfish from ships of opportunity, and (2) regular shoreline surveys for stranding events over three consecutive years. Jellyfish species displayed distinct species-specific distributions, with an apparent segregation of some species. Furthermore, a different species composition was noticeable between the northern and southern parts of the study area. Most importantly, our data suggest that jellyfish distributions broadly reflect the major hydrographic regimes (and associated physical discontinuities) of the study area, with mixed water masses possibly acting as a trophic barrier or unfavourable environment for the successful growth and reproduction of jellyfish species.

Relevance: 100.00%

Abstract:

Climate change is expected to have a number of impacts on biological communities, including range extensions and contractions. Recent analyses of multidecadal data sets have shown such monotonic shifts in the distribution of plankton communities and various fish species, both groups for which there is a large amount of historical data on distribution. However, establishing the implications of climate change for the range of endangered species is problematic, as historical data are often lacking. We therefore used a different approach to predict the implications of climate change for the range of the critically endangered planktivorous leatherback turtle (Dermochelys coriacea). We used long-term satellite telemetry to define the habitat utilization of this species. We show that the northerly distribution limit of this species can essentially be encapsulated by the position of the 15°C isotherm and that the summer position of this isotherm has moved north by 330 km in the North Atlantic in the last 17 years. Consequently, conservation measures will need to operate over ever-widening areas to accommodate this range extension.

Relevance: 100.00%

Abstract:

This paper proposes a practical and cost-effective approach to constructing a fully distributed roadside communication infrastructure that facilitates localized content dissemination to vehicles in urban areas. The proposed infrastructure is composed of distributed, lightweight, low-cost devices called roadside buffers (RSBs), where each RSB has limited buffer storage and can wirelessly transmit cached contents to fast-moving vehicles. To enable the distributed RSBs to work toward globally optimal performance (e.g., minimal average file download delay), we propose a fully distributed algorithm to optimally determine the content replication strategy at RSBs. Specifically, we first develop a generic analytical model to evaluate the download delay of files, given the probability density of file distribution at RSBs. Then, we formulate the RSB content replication process as an optimization problem and accordingly devise a fully distributed content replication scheme that enables vehicles to intelligently recommend desirable content files to RSBs. The proposed infrastructure is designed to optimize the global network utility, which accounts for the integrated download experience of users and the download demands of files. Using extensive simulations, we validate the effectiveness of the proposed infrastructure and show that the proposed distributed protocol approaches the optimal performance and significantly outperforms traditional heuristics.
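
To make the delay/replication trade-off concrete, here is a much-simplified sketch under an assumed geometric encounter model (a vehicle passes RSBs spaced `tau` seconds apart, and each RSB caches file f with probability p_f, so the expected delay for f is roughly tau / p_f). This is not the paper's analytical model or its distributed protocol; minimising the popularity-weighted delay under a per-RSB cache budget gives a square-root replication rule.

```python
# Simplified delay model: expected delay for file f is tau / p_f, where p_f is
# the per-RSB caching probability. Minimising sum_f w_f * tau / p_f subject to
# sum_f p_f = C yields p_f proportional to sqrt(w_f). All parameters assumed.
import numpy as np

def optimal_replication(popularity, cache_budget):
    """Square-root replication rule for the simplified delay model above."""
    w = np.asarray(popularity, dtype=float)
    p = np.sqrt(w)
    return cache_budget * p / p.sum()

def expected_delay(popularity, p, tau=10.0):
    """Popularity-weighted mean download delay under the geometric model."""
    w = np.asarray(popularity, dtype=float)
    w = w / w.sum()
    return tau * np.sum(w / p)

# Zipf-like request popularity over 50 files; each RSB can cache 5 of them
popularity = 1.0 / np.arange(1, 51) ** 0.8
uniform = np.full(50, 5 / 50)                      # naive uniform replication
optimal = optimal_replication(popularity, cache_budget=5.0)
print("uniform replication delay  :", round(expected_delay(popularity, uniform), 1))
print("sqrt-rule replication delay:", round(expected_delay(popularity, optimal), 1))
```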

Relevance: 100.00%

Abstract:

Predation and fire shape the structure and function of ecosystems globally. However, studies exploring interactions between these two processes are rare, especially at large spatial scales. This knowledge gap is significant not only for ecological theory, but also in an applied context, because it limits the ability of landscape managers to predict the outcomes of manipulating fire and predators. We examined the influence of fire on the occurrence of an introduced and widespread mesopredator, the red fox (Vulpes vulpes), in semi-arid Australia. We used two extensive and complementary datasets collected at two spatial scales. At the landscape scale, we surveyed red foxes using sand plots within 28 study landscapes - which incorporated variation in the diversity and proportional extent of fire-age classes - located across a 104,000 km² study area. At the site scale, we surveyed red foxes using camera traps at 108 sites stratified along a century-long post-fire chronosequence (0-105 years) within a 6630 km² study area. Red foxes were widespread at both the landscape and site scales. Fire did not influence fox distribution at either spatial scale, nor did other environmental variables that we measured. Our results show that red foxes exploit a broad range of environmental conditions within semi-arid Australia. The presence of red foxes throughout much of the landscape is likely to have significant implications for native fauna, particularly in recently burnt habitats where reduced cover may increase the predation risk for prey species.

Relevance: 100.00%

Abstract:

Malware is pervasive in networks and poses a critical threat to network security. However, we have very limited understanding of malware behavior in networks to date. In this paper, we investigate how malware propagates in networks from a global perspective. We formulate the problem and establish a rigorous two-layer epidemic model for malware propagation from network to network. Based on the proposed model, our analysis indicates that the distribution of a given malware follows an exponential distribution at its early stage, a power-law distribution with a short exponential tail at its late stage, and a power-law distribution at its final stage. Extensive experiments have been performed on two real-world, global-scale malware data sets, and the results confirm our theoretical findings.
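
As a qualitative illustration only (a crude discrete-time simulation with assumed parameters, not the paper's two-layer analytical model), the sketch below lets malware grow logistically inside each infected network while infected hosts occasionally seed other networks; inspecting the per-network counts at the end gives a feel for how skewed the cross-network distribution becomes.

```python
# Crude two-layer SI sketch: logistic growth inside networks (lower layer),
# occasional seeding of other networks (upper layer). All parameters are
# illustrative assumptions.
import random

random.seed(1)
N_NETWORKS = 100     # networks in the upper layer
HOSTS = 500          # hosts per network
BETA_IN = 0.3        # within-network growth rate per step
BETA_OUT = 0.01      # per infected host, chance per step of seeding another network
STEPS = 35

infected = [0.0] * N_NETWORKS
infected[0] = 1.0    # patient zero in network 0

for _ in range(STEPS):
    nxt = list(infected)
    for i, inf in enumerate(infected):
        if inf <= 0:
            continue
        # lower layer: logistic SI growth inside network i
        nxt[i] = min(HOSTS, inf + BETA_IN * inf * (1 - inf / HOSTS))
        # upper layer: each infected host may seed a still-clean network
        seeds = sum(random.random() < BETA_OUT for _ in range(int(inf)))
        for _ in range(seeds):
            j = random.randrange(N_NETWORKS)
            if nxt[j] == 0:
                nxt[j] = 1.0
    infected = nxt

sizes = sorted((round(x) for x in infected if x > 0), reverse=True)
print(f"{len(sizes)} of {N_NETWORKS} networks infected after {STEPS} steps")
print("ten largest per-network counts:", sizes[:10])
```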

Relevance: 100.00%

Abstract:

Much of the research that has been carried out into outsourcing is based on relatively successful case studies. Yet drawing inferences from case studies when those with largely negative outcomes rarely see the light of day represents a significant problem. When negative cases are systematically unrepresented, there is less opportunity to subject theory to scrutiny. This chapter goes some way towards redressing this trend, by reporting on a large scale “selective” outsourcing arrangement that has been publicly described as a failure — the Australian Federal Government’s “whole of government” IT infrastructure outsourcing initiative. This initiative, originally promoted as likely to lead to a billion dollar saving, was abandoned early in 2001, after a damning public report by the Australian Auditor General. However, a detailed study of the initiative suggests that the “failure” occurred despite the project adhering to many of the recommended guidelines for successful outsourcing that had been derived from earlier case analysis. The findings have important implications for decision makers confronted with outsourcing choices. The study suggests that the risks of outsourcing are often downplayed, or ignored in the rush to reap the expected benefits. The study also suggests that expectations of savings from outsourcing IT are often substantially higher than those that have been empirically confirmed in the field. Decision makers are advised that key assumptions about costs, savings, managerial effort, and the effects of outsourcing on operational performance might be incorrect, and to plan for their outsourcing activity accordingly. They should pay particular attention to coordination and transaction costs, as these tend to be overlooked in the business case. These costs will be magnified if “best in breed” multiple-vendor outsourcing is chosen, and if contracts are kept short. Decision-makers are also warned of the difficulties they are likely to have at the end of an outsourcing contract if there is not a large and robust pool of alternative vendors willing to bid against the incumbent.

Relevance: 100.00%

Abstract:

Large-scale sequence assembly and alignment are fundamental parts of biological computing. However, most large-scale sequence assembly and alignment tasks require intensive computing power and normally take a very long time to complete. To speed up the assembly and alignment process, this paper parallelizes the Euler sequence assembly and pair-wise/multiple sequence assembly, two important sequence assembly methods, and takes advantage of the Computing Grid, which has a colossal computing capacity to meet the large-scale biological computing demand.
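
The parallel structure being exploited is essentially "many independent sequence comparisons at once". As a tiny, hedged illustration (plain edit distance over a handful of toy reads and Python's multiprocessing pool, rather than the paper's alignment methods or a real grid scheduler):

```python
# Distribute all pairwise sequence comparisons across worker processes; the
# scoring function (edit distance) and toy reads are assumptions for the sketch.
from itertools import combinations
from multiprocessing import Pool

def edit_distance(pair):
    """Dynamic-programming edit distance between one pair of sequences."""
    a, b = pair
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return a, b, prev[-1]

if __name__ == "__main__":
    reads = ["ACGTACGT", "ACGTTCGT", "TTGACCGA", "ACGAACGT"]
    with Pool() as pool:                       # one task per sequence pair
        for a, b, d in pool.map(edit_distance, combinations(reads, 2)):
            print(f"{a} vs {b}: edit distance {d}")
```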

Relevance: 100.00%

Abstract:

In this paper, I will argue that it is possible to use data from large-scale international and national mathematics assessment programmes, whose attention is on summative achievement, to provide formative information that informs teachers about the effects of their classroom practice. However, to have an impact on, and be useful for, classroom practitioners, these achievement data need to be reworked and re-presented in ways that are plausible, provide a basis for inferences about practice, and are appropriate for the intended audience. This paper examines achievement-focused assessment programmes in terms of their aims and approaches, and develops the argument that formative assessment possibilities are present within these programmes, although usually hidden. Examples are drawn from several sources to support this argument and demonstrate a variety of approaches that have been taken in the past. Suggestions for further action are made.

Relevance: 100.00%

Abstract:

Biological sequence assembly is an essential step in sequencing the genomes of organisms. Sequence assembly is very computing-intensive, especially for large-scale sequence assembly. Parallel computing is an effective way to reduce the computing time and to support the assembly of large numbers of biological fragments. The Euler sequence assembly algorithm is an innovative algorithm proposed recently. Its advantages are that its computing complexity is polynomial and that it provides a better solution to the notorious "repeat" problem. This paper introduces the parallelization of the Euler sequence assembly algorithm. All the genome fragments generated by whole-genome shotgun (WGS) sequencing are assembled as a whole, rather than being divided into groups, which may incur errors due to inaccurate group partitioning. The implemented system can be run on supercomputers, networks of workstations, or even networks of PCs. The experimental results demonstrate the performance of our system.
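
For context, the core Euler-assembly idea that the paper parallelises can be sketched sequentially in a few lines: decompose reads into k-mers, treat each k-mer as an edge between its prefix and suffix (k-1)-mers, and read the assembly off an Eulerian path. The toy reads, the choice of k, and the assumption of a single clean Eulerian path are simplifications; the parallel, WGS-scale machinery of the paper is not shown.

```python
# Minimal de Bruijn graph construction plus Hierholzer's Eulerian-path walk.
from collections import defaultdict

def build_de_bruijn(reads, k):
    """Map each unique k-mer to an edge from its (k-1)-mer prefix to its suffix."""
    kmers = {read[i:i + k] for read in reads for i in range(len(read) - k + 1)}
    graph = defaultdict(list)
    for kmer in kmers:
        graph[kmer[:-1]].append(kmer[1:])
    return graph

def eulerian_path(graph):
    """Hierholzer's algorithm; start at a node with out-degree > in-degree."""
    out_deg = {u: len(vs) for u, vs in graph.items()}
    in_deg = defaultdict(int)
    for vs in graph.values():
        for v in vs:
            in_deg[v] += 1
    start = next((u for u in graph if out_deg[u] > in_deg[u]), next(iter(graph)))
    stack, path = [start], []
    while stack:
        u = stack[-1]
        if graph[u]:
            stack.append(graph[u].pop())
        else:
            path.append(stack.pop())
    return path[::-1]

reads = ["ACGTAC", "GTACGG", "TACGGA"]          # toy overlapping reads
path = eulerian_path(build_de_bruijn(reads, k=4))
assembly = path[0] + "".join(node[-1] for node in path[1:])
print("assembled:", assembly)                    # expected: ACGTACGGA
```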

Relevance: 100.00%

Abstract:

In this paper, we present a new approach, called Flexible Deterministic Packet Marking (FDPM), to perform large-scale IP traceback to defend against Distributed Denial of Service (DDoS) attacks. In a DDoS attack, the victim host or network is usually attacked by a large number of spoofed IP packets coming from multiple sources. IP traceback is the ability to trace IP packets to their sources without relying on the source address field of the IP header. FDPM provides many flexible features to trace IP packets and can obtain better tracing capability than current IP traceback mechanisms, such as Probabilistic Packet Marking (PPM) and Deterministic Packet Marking (DPM). FDPM is flexible in two ways: it can adjust the length of the marking field according to the network protocols deployed, and it can adjust the marking rate according to the load of participating routers. The implementation and evaluation demonstrate that FDPM needs only a moderately small number of packets to complete the traceback process and can successfully perform large-scale IP traceback, for example, tracing up to 110,000 sources in a single incident response. It has a built-in overload prevention mechanism; therefore, the scheme can still perform traceback well even when it is heavily loaded.
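
To illustrate the underlying deterministic packet marking idea that FDPM builds on (a generic, fixed-length toy version only; FDPM's variable-length marks, flow-based reassembly, and load-adaptive marking rate are not modelled, and the class and function names are made up for the example): the ingress edge router embeds one labelled byte of its own address in each outgoing packet, and the victim reassembles the ingress address from a handful of packets.

```python
# Toy deterministic packet marking: the ingress router round-robins one byte
# of its IP per packet; the victim rebuilds the address from the marks.
from itertools import count
import ipaddress

class EdgeRouter:
    """Toy ingress router that marks each packet with (byte position, byte value)."""
    def __init__(self, ip: str):
        self.octets = ipaddress.IPv4Address(ip).packed
        self.counter = count()

    def mark_packet(self):
        idx = next(self.counter) % 4
        return idx, self.octets[idx]

def reconstruct(marks):
    """Victim side: rebuild the ingress IP once all four byte positions are seen."""
    segments = {idx: value for idx, value in marks}
    if len(segments) < 4:
        return None
    return str(ipaddress.IPv4Address(bytes(segments[i] for i in range(4))))

router = EdgeRouter("203.0.113.45")                 # spoofed traffic enters here
marks = [router.mark_packet() for _ in range(8)]    # eight attack packets observed
print("recovered ingress router:", reconstruct(marks))
```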