991 results for ECOLOGICAL NETWORKS


Relevance:

20.00%

Publisher:

Abstract:

A comprehensive voltage imbalance sensitivity analysis and stochastic evaluation based on the rating and location of single-phase grid-connected rooftop photovoltaic cells (PVs) in a residential low voltage distribution network are presented. The voltage imbalance at different locations along a feeder is investigated. In addition, a sensitivity analysis is performed for voltage imbalance in one feeder when PVs are installed in other feeders of the network. A stochastic evaluation based on the Monte Carlo method is carried out to investigate the risk index of non-standard voltage imbalance in the network in the presence of PVs. The network voltage imbalance characteristic is generalized for different criteria of PV rating and location and for different network conditions. Improvement methods are proposed for voltage imbalance reduction, and their efficacy is verified by comparing their risk indices using Monte Carlo simulations.
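
The abstract does not spell out the simulation, but the Monte Carlo risk-index idea can be sketched as follows: randomly assign PV ratings and connection phases along the feeder, evaluate a voltage-unbalance measure, and count how often the limit is exceeded. This is a minimal sketch under stated assumptions: approx_vuf, the candidate ratings and the 2% limit are illustrative placeholders, not values from the paper, and a real study would replace the surrogate with a three-phase load flow.

```python
import random

def approx_vuf(phase_power_kw):
    """Crude surrogate for the voltage unbalance factor (VUF, %): proportional
    to the spread of aggregate PV power across the three phases. A real study
    would replace this with a three-phase load flow of the LV feeder."""
    mean_p = sum(phase_power_kw.values()) / 3.0
    spread = max(abs(p - mean_p) for p in phase_power_kw.values())
    return 0.5 * spread  # arbitrary scaling, for illustration only

def monte_carlo_risk_index(n_houses=30, n_trials=10_000, vuf_limit=2.0):
    """Estimate P(VUF > limit) when PV ratings and connection phases are
    assigned at random along the feeder (the risk index)."""
    violations = 0
    for _ in range(n_trials):
        phase_power = {"a": 0.0, "b": 0.0, "c": 0.0}
        for _ in range(n_houses):
            rating_kw = random.choice([0.0, 0.0, 2.0, 3.0, 5.0])  # 0.0 = no PV installed
            phase_power[random.choice("abc")] += rating_kw
        if approx_vuf(phase_power) > vuf_limit:
            violations += 1
    return violations / n_trials

if __name__ == "__main__":
    print(f"Estimated risk index: {monte_carlo_risk_index():.3f}")
```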

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the placement and sizing of Distributed Generators (DG) in distribution networks are determined optimally. The objective is to minimize losses and improve reliability. The constraints are the bus voltages, feeder currents and the reactive power flowing back to the source side. The placement and size of DGs are optimized using a combination of Discrete Particle Swarm Optimization (DPSO) and a Genetic Algorithm (GA). This hybridisation increases the diversity of the optimization variables in DPSO so that the search does not become trapped in local minima. To evaluate the proposed algorithm, the semi-urban 37-bus distribution system connected at bus 2 of the Roy Billinton Test System (RBTS), located at the secondary side of a 33/11 kV distribution substation, is used. The results illustrate the efficiency of the proposed method.
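
As a rough illustration of how a DPSO/GA hybrid for DG siting and sizing might be organised, the sketch below encodes each candidate plan as (bus, size) pairs, nudges particles toward the global best, and injects GA crossover and mutation to keep the swarm diverse. The 37-bus count is taken from the abstract, but fitness is a toy proxy; the real objective would come from a load flow with penalties for the bus-voltage, feeder-current and reverse reactive-power constraints.

```python
import random

N_BUSES, SIZES_KW = 37, [0, 250, 500, 750, 1000]   # candidate buses and DG sizes

def fitness(plan):
    """Toy loss proxy: a real study would run a load flow and add penalty
    terms for the voltage, current and reverse-power constraints."""
    return sum((bus - 19) ** 2 * size for bus, size in plan) / 1e6

def random_plan(n_dg=3):
    return [(random.randrange(1, N_BUSES + 1), random.choice(SIZES_KW)) for _ in range(n_dg)]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(plan, rate=0.1):
    return [(random.randrange(1, N_BUSES + 1), random.choice(SIZES_KW))
            if random.random() < rate else gene for gene in plan]

def dpso_ga(n_particles=30, n_iter=100):
    swarm = [random_plan() for _ in range(n_particles)]
    best = min(swarm, key=fitness)
    for _ in range(n_iter):
        for i, particle in enumerate(swarm):
            # Discrete "velocity" step: move genes toward the global best with some probability.
            moved = [bg if random.random() < 0.5 else pg for pg, bg in zip(particle, best)]
            # GA operators keep diversity so the swarm does not collapse into a local minimum.
            child = mutate(crossover(moved, random.choice(swarm)))
            if fitness(child) < fitness(particle):
                swarm[i] = child
        best = min(swarm + [best], key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    plan, loss = dpso_ga()
    print("best (bus, kW) plan:", plan, "loss proxy:", round(loss, 4))
```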

Relevance:

20.00%

Publisher:

Abstract:

Traditional media are under assault from digital technologies. Online advertising is eroding the financial basis of newspapers and television, demarcations between different forms of media are fading, and audiences are fragmenting. We can podcast our favourite radio show, data accompanies television programs, and we catch up with newspaper stories on our laptops. Yet mainstream media remain enormously powerful. The Media and Communications in Australia offers a systematic introduction to this dynamic field. Fully updated and revised to take account of recent developments, this third edition outlines the key media industries and explains how communications technologies are impacting on them. It provides a thorough overview of the main approaches taken in studying the media, and includes new chapters on social media, gaming, telecommunications, sport and cultural diversity. With contributions from some of Australia's best researchers and teachers in the field, The Media and Communications in Australia is the most comprehensive and reliable introduction to media and communications available. It is an ideal student text, and a reference for teachers of media and anyone interested in this influential industry.

Relevance:

20.00%

Publisher:

Abstract:

Rodenticide use in agriculture can lead to the secondary poisoning of avian predators. Currently the Australian sugarcane industry has two rodenticides, Racumin® and Rattoff®, available for in-crop use but, like many agricultural industries, it lacks an ecologically-based method of determining the potential secondary poisoning risk that the use of these rodenticides poses to avian predators. The material presented in this thesis addresses this by: a. determining where predator/prey interactions take place in sugar-producing districts; b. quantifying the amount of rodenticide available to avian predators and the probability of encounter; and c. developing a stochastic model that allows secondary poisoning risk under various rodenticide application scenarios to be investigated. Results demonstrate that predator/prey interactions are highly constrained by environmental structure. Rodents used crops that provided high levels of canopy cover, and therefore protection from predators, and made little use of open-canopy areas. In contrast, raptors over-utilised areas with low canopy cover and low rodent densities, but which provided high accessibility to prey. Given this pattern of habitat use, and given that industry baiting protocols preclude rodenticide application in open-canopy crops, these results indicate that secondary poisoning can only occur if poisoned rodents leave closed-canopy crops and become available for predation in open-canopy areas. Results further demonstrate that after in-crop rodenticide application, only a small proportion of the rodents available in open areas are poisoned and that these rodents carry low levels of toxicant. Coupled with the low level of rodenticide use in the sugar industry, the high toxic threshold raptors have to these toxicants and the low probability of encountering poisoned rodents, these results indicate that the risk of secondary poisoning events occurring is minimal. A stochastic model was developed to investigate the effect of manipulating factors that might influence secondary poisoning hazard in a sugarcane agro-ecosystem. These simulations further suggest that in all but extreme scenarios, the risk of secondary poisoning is also minimal. Collectively, these studies demonstrate that the risk of secondary poisoning of avian predators associated with the use of the currently available rodenticides in Australian sugar-producing districts is minimal. Further, the ecologically-based method of assessing secondary poisoning risk developed in this thesis has broader applications in other agricultural systems where rodenticide use may pose risks to avian predators.
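
A hedged sketch of the kind of stochastic exposure model the thesis describes: raptors encounter prey at random, a small fraction of prey in open areas carry residue, and the risk index is the probability that a bird's accumulated dose exceeds its toxic threshold. All parameter values below (encounter rates, residue load, threshold) are invented for illustration and are not taken from the thesis.

```python
import random

def simulate_raptor_exposure(days=30, prey_per_day=3,
                             p_prey_poisoned=0.02,      # fraction of open-area rodents carrying bait (assumed)
                             residue_mg_per_prey=0.4,   # toxicant load per poisoned rodent (assumed)
                             lethal_threshold_mg=20.0,  # assumed raptor toxic threshold
                             n_birds=10_000):
    """Estimate the probability a raptor exceeds its toxic threshold within the
    simulation window, given daily random encounters with prey in open areas."""
    deaths = 0
    for _ in range(n_birds):
        burden = 0.0
        for _ in range(days * prey_per_day):
            if random.random() < p_prey_poisoned:
                burden += residue_mg_per_prey
        if burden >= lethal_threshold_mg:
            deaths += 1
    return deaths / n_birds

if __name__ == "__main__":
    print("Estimated secondary-poisoning risk:", simulate_raptor_exposure())
```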

Relevance:

20.00%

Publisher:

Abstract:

The main objective of this PhD was to further develop Bayesian spatio-temporal models, specifically the Conditional Autoregressive (CAR) class of models, for the analysis of sparse disease outcomes such as birth defects. The motivation for the thesis arose from problems encountered when analyzing a large birth defect registry in New South Wales. The specific components and related research objectives of the thesis were developed from gaps in the literature on current formulations of the CAR model, and from health service planning requirements. Data from a large probabilistically-linked database from 1990 to 2004, consisting of fields from two separate registries, the Birth Defect Registry (BDR) and the Midwives Data Collection (MDC), were used in the analyses in this thesis.

The main objective was split into smaller goals. The first goal was to determine how the specification of the neighbourhood weight matrix would affect the smoothing properties of the CAR model, and this is the focus of chapter 6. Secondly, I hoped to evaluate the usefulness of incorporating a zero-inflated Poisson (ZIP) component as well as a shared-component model for modeling a sparse outcome, and this is carried out in chapter 7. The third goal was to identify optimal sampling and sample size schemes designed to select individual-level data for a hybrid ecological spatial model, and this is done in chapter 8. Finally, I wanted to bring together the earlier improvements to the CAR model and, along with demographic projections, provide forecasts for birth defects at the Statistical Local Area (SLA) level. Chapter 9 describes how this is done.

For the first objective, I examined a series of neighbourhood weight matrices, and showed how smoothing the relative risk estimates according to similarity in an important covariate (i.e. maternal age) helped improve the model's ability to recover the underlying risk, as compared to the traditional adjacency (specifically the Queen) method of applying weights. Next, to address the sparseness and excess zeros commonly encountered in the analysis of rare outcomes such as birth defects, I compared several models, including an extension of the usual Poisson model to encompass excess zeros in the data. This was achieved via a mixture model, which also encompassed the shared-component model to improve the estimation of sparse counts by borrowing strength across a shared component (e.g. latent risk factors) with the referent outcome (caesarean section was used in this example). Using the Deviance Information Criterion (DIC), I showed how the proposed model performed better than the usual models, but only when both outcomes shared a strong spatial correlation.

The next objective involved identifying the optimal sampling and sample size strategy for incorporating individual-level data with areal covariates in a hybrid study design. I performed extensive simulation studies, evaluating thirteen different sampling schemes along with variations in sample size. This was done in the context of an ecological regression model that incorporated spatial correlation in the outcomes, as well as accommodating both individual and areal measures of covariates. Using the Average Mean Squared Error (AMSE), I showed how a simple random sample of 20% of the SLAs, followed by selecting all cases in the SLAs chosen along with an equal number of controls, provided the lowest AMSE.

The final objective involved combining the improved spatio-temporal CAR model with population (i.e. women) forecasts, to provide 30-year annual estimates of birth defects at the SLA level in New South Wales, Australia. The projections were illustrated using sixteen different SLAs, representing the various areal measures of socio-economic status and remoteness. A sensitivity analysis of the assumptions used in the projection was also undertaken.

By the end of the thesis, I will show how challenges in the spatial analysis of rare diseases such as birth defects can be addressed by specifically formulating the neighbourhood weight matrix to smooth according to a key covariate (i.e. maternal age), incorporating a ZIP component to model excess zeros in outcomes, and borrowing strength from a referent outcome (i.e. caesarean counts). An efficient strategy for sampling individual-level data, and sample size considerations for rare diseases, will also be presented. Finally, projections of birth defect categories at the SLA level will be made.
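
To make the first objective concrete, the sketch below contrasts a conventional Queen-style binary adjacency matrix with a weight matrix built from similarity in a covariate such as mean maternal age, so that areas with similar covariate values are smoothed toward each other. The Gaussian kernel, bandwidth and example values are assumptions for illustration; the thesis may specify the covariate-based weights differently.

```python
import numpy as np

def queen_adjacency(neighbours):
    """Binary (Queen-contiguity style) weight matrix from {area: [adjacent areas]}."""
    n = len(neighbours)
    W = np.zeros((n, n))
    for i, adjacent in neighbours.items():
        for j in adjacent:
            W[i, j] = 1.0
    return W

def covariate_similarity_weights(x, bandwidth=2.0):
    """Alternative weights: areas with similar covariate values (e.g. mean
    maternal age) are smoothed toward each other, regardless of adjacency."""
    x = np.asarray(x, dtype=float)
    diff = np.abs(x[:, None] - x[None, :])
    W = np.exp(-(diff / bandwidth) ** 2)
    np.fill_diagonal(W, 0.0)
    return W

# Toy example: four small areas with assumed mean maternal ages.
ages = [27.5, 33.1, 28.0, 34.0]
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(queen_adjacency(adj))
print(covariate_similarity_weights(ages).round(2))
```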

Relevance:

20.00%

Publisher:

Abstract:

Alzaid et al. proposed a forward- and backward-secure key management scheme for wireless sensor networks in Process Control Systems (PCSs) or Supervisory Control and Data Acquisition (SCADA) systems. The scheme, however, is still vulnerable to an attack called the sandwich attack, which can be launched when the adversary captures two sensor nodes at times t1 and t2 and then reveals all the group keys used between t1 and t2. In this paper, a fix to the scheme is proposed that limits the vulnerable time duration to an arbitrarily chosen time span while keeping the forward and backward secrecy of the scheme untouched. A performance analysis of our proposal, Alzaid et al.'s scheme, and Nilsson et al.'s scheme is then given.
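
The paper's actual construction is not reproduced here, but the general idea of bounding the exposure window can be sketched with a toy key schedule: within a span, keys evolve by a one-way function, and at each span boundary fresh randomness is injected, so a captured key exposes at most the remaining epochs of its own span. The span length, the hash-based update and the GroupKeyChain class are illustrative assumptions, not Alzaid et al.'s scheme or the proposed fix.

```python
import hashlib
import os

EPOCHS_PER_SPAN = 24   # the arbitrarily chosen time span that bounds exposure (assumed)

def forward(key: bytes) -> bytes:
    """One-way update: a later key cannot be rolled back to recover earlier keys."""
    return hashlib.sha256(b"fwd" + key).digest()

class GroupKeyChain:
    """Toy group-key schedule. Within a span, keys evolve by the one-way
    function above; at each span boundary, fresh randomness is injected (e.g.
    distributed by the key server), so a key captured at some epoch exposes at
    most the rest of that span rather than every key up to the next capture."""
    def __init__(self):
        self.key = os.urandom(32)
        self.epoch = 0

    def next_epoch(self) -> bytes:
        self.epoch += 1
        if self.epoch % EPOCHS_PER_SPAN == 0:
            self.key = os.urandom(32)      # fresh re-key at the span boundary
        else:
            self.key = forward(self.key)
        return self.key

chain = GroupKeyChain()
print([chain.next_epoch().hex()[:8] for _ in range(3)])
```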

Relevance:

20.00%

Publisher:

Abstract:

Principal Topic: "In less than ten years music labels will not exist anymore." Michael Smelli, former Global COO Sony/BMG, MCA/QUT IMP Business Lab Digital Music Think Tank, 9 May 2009, Brisbane.

Big music labels such as EMI, Sony BMG and UMG have been responsible for promoting and producing a myriad of stars in the music industry over the last decades. However, the industry structure is under enormous threat from the emergence of a new, innovative era of digital music. Recent years have seen a dramatic shift in industry power with the emergence of Napster and other file-sharing sites, iTunes and other online stores, and the iPod and MP3 revolution. Myspace.com and other social networking sites are connecting entrepreneurial artists with fans and creating online music communities independent of music labels. In 2008 the digital music business grew internationally by around 25% to US$3.7 billion. Digital platforms now account for around 20% of recorded music sales, up from 15% in 2007 (IFPI Digital Music Report 2009). CD sales have fallen by 40% since their peak levels. Global digital music sales totalled an estimated US$3 billion in 2007, an increase of 40% on 2006 figures. Digital sales account for an estimated 15% of the global market, up from 11% in 2006 and zero in 2003. The music industry is more advanced in terms of digital revenues than any other creative or entertainment industry (except games). Its digital share is more than twice that of newspapers (7%), films (35) or books (2%). All these shifts present new possibilities for music entrepreneurs to act entrepreneurially and promote their music independently of the major music labels.

Diffusion of innovations has a long tradition in both sociology (e.g. Rogers 1962, 2003) and marketing (Bass 1969; Mahajan et al., 1990). The context of the current project is theoretically interesting in two respects. First, the role of online social networks replaces traditional face-to-face word-of-mouth communications. Second, music is a hedonistic product, which strongly influences the nature of interpersonal communications and their diffusion patterns. Both of these have received very little attention in the diffusion literature to date, and no studies have investigated the influence of both simultaneously. This research project is concerned with the role of social networks in this new music industry landscape, and with how it may be leveraged by musicians willing to act entrepreneurially. The key research question we intend to address is: How do online social network communities affect the nature, pattern and speed with which music diffuses?

Methodology/Key Propositions: We expect the nature of diffusion of popular, generic music genres to differ from that of specialized, niche music. To date, only Moe & Fader (2002) and Lee et al. (2003) have investigated diffusion patterns of music, and these focus on forecasting weekly sales of music CDs from advance purchase orders before launch, rather than taking a detailed look at diffusion patterns. Consequently, our first research questions are concerned with understanding the nature of online communications within the context of the diffusion of music and artists. Hence, we have the following research questions: RQ1: What is the nature of fan-to-fan "word of mouth" online communications for music? Do these vary by type of artist and genre of music? RQ2: What is the nature of artist-to-fan online communications for music? Do these vary by type of artist and genre of music?
What types of communication are effective? Two findings from social network theory are particularly relevant to understanding how music might diffuse through social networks. Weak-tie theory (Granovetter, 1973) argues that casual or infrequent contacts within a social network (weak ties) act as a link to unique information that is not normally contained within an entrepreneur's inner-circle (strong-tie) network. A related argument, structural hole theory (Burt, 1992), posits that it is the absence of direct links (structural holes) between members of a social network which offers similar informational benefits. Although these two theories argue for the informational benefits of casual linkages and diversity within a social network, others acknowledge that a balanced network consisting of a mix of strong and weak ties is perhaps more important overall (Uzzi, 1996). It is anticipated that the network structure of the fan base for different types of artists and genres of music will vary considerably. This leads to our third research question: RQ3: How does the network structure of online social network communities affect the pattern and speed with which music diffuses?

The current paper is best described as theory elaboration. It reports the first, exploratory phase designed to develop and elaborate relevant theory (the second phase will be a quantitative study of network structure and diffusion). We intend to develop specific research propositions or hypotheses from the above research questions. To do so, we will conduct three focus group discussions with independent musicians and three focus group discussions with fans active in online music communication on social network sites. We will also conduct five case studies of bands that have successfully built fan bases through social networking sites (e.g. myspace.com, facebook.com). The idea is to identify which communication channels they employ and the characteristics of fan interactions for different genres of music. We intend to interview each of the artists and analyse their online interaction with their fans.

Results and Implications: At the current stage, we have just begun to conduct focus group discussions. An analysis of the themes from these focus groups will enable us to refine our research questions into testable hypotheses. Ultimately, our research will provide a better understanding of how social networks promote the diffusion of music, and how this varies for different genres of music. Some music entrepreneurs will therefore be able to promote their music more effectively. The results may be further generalised to other industries where online peer-to-peer communication is common, such as other forms of entertainment and consumer technologies.
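
Since the project builds on diffusion-of-innovations work such as Bass (1969), a minimal discrete-time Bass model is sketched below as a point of reference for what a diffusion curve looks like when external influence (p) and word-of-mouth imitation (q) drive adoption. The parameter values are conventional textbook magnitudes, not estimates for music or for any artist studied here.

```python
def bass_adoption(p=0.03, q=0.38, market=100_000, periods=52):
    """Discrete-time Bass (1969) diffusion: p is the innovation (external
    influence) coefficient, q the imitation (word-of-mouth) coefficient."""
    adopters, cumulative = [], 0.0
    for _ in range(periods):
        new = (p + q * cumulative / market) * (market - cumulative)
        cumulative += new
        adopters.append(new)
    return adopters

weekly = bass_adoption()
print(f"peak adoption in week {weekly.index(max(weekly)) + 1}")
```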

Relevance:

20.00%

Publisher:

Abstract:

Poly(L-lactide-co-succinic anhydride) networks were synthesised via the carbodiimide-mediated coupling of poly(L-lactide) (PLLA) star polymers. When 4-(dimethylamino)pyridine (DMAP) alone was used as the catalyst, gelation did not occur. However, when 4-(dimethylamino)pyridinium p-toluenesulfonate (DPTS), the salt of DMAP and p-toluenesulfonic acid (PTSA), was the catalyst, the networks obtained had gel fractions comparable to those reported for networks synthesised by conventional methods. Greater gel fractions and conversion of the prepolymer terminal hydroxyl groups were observed when the hydroxyl-terminated star prepolymers were reacted with succinic anhydride in a one-pot procedure than when they were reacted with presynthesised succinic-terminated star prepolymers. The thermal properties of the networks (glass transition temperature, Tg; melting temperature, Tm; and crystallinity, Xc) were all strongly influenced by the average molecular weight between crosslinks (Mc). The network with the smallest Mc (1400 g/mol) was amorphous and had a Tg of 59 °C, while the network with the largest Mc (7800 g/mol) was 15% crystalline and had a Tg of 56 °C.

Relevance:

20.00%

Publisher:

Abstract:

We present algorithms, systems, and experimental results for underwater data muling. In data muling a mobile agent interacts with static agents to upload, download, or transport data to a different physical location. We consider a system comprising an Autonomous Underwater Vehicle (AUV) and many static Underwater Sensor Nodes (USN) networked together optically and acoustically. The AUV can locate the static nodes using vision and hover above the static nodes for data upload. We describe the hardware and software architecture of this underwater system, as well as experimental data. © 2006 IEEE.

Relevance:

20.00%

Publisher:

Abstract:

While sensor networks have now become very popular on land, the underwater environment still poses some difficult problems. Communication is one of the difficult challenges under water. There are two options: optical and acoustic. We have designed an optical communication board that allows the Flecks to communicate optically. We have tested the resulting underwater sensor nodes in two different applications.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the authors propose a new structure for the decoupling of circulant symmetric arrays of more than four elements. Network element values are again obtained through a process of repeated eigenmode decoupling, here by solving sets of nonlinear equations. However, the resulting circuit is much simpler and can be implemented on a single layer. The corresponding circuit topology for the 6-element array is presented, and the procedure is illustrated with several examples.
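
As background on why eigenmode decoupling is natural for circulant symmetric arrays, the sketch below checks the standard result that such a coupling matrix is diagonalised by the DFT matrix, so its eigenmodes are known in closed form. The 6-element coupling values are made up for the example and are not taken from the paper.

```python
import numpy as np

def circulant(first_row):
    """Build a circulant matrix whose rows are cyclic shifts of the first row."""
    return np.array([np.roll(first_row, k) for k in range(len(first_row))])

# Illustrative mutual-coupling matrix of a 6-element circulant symmetric array
# (values invented for the example).
first_row = [1.0, 0.35, 0.12, 0.05, 0.12, 0.35]
C = circulant(first_row)

# Circulant matrices are diagonalised by the (unitary) DFT matrix, so the
# eigenmodes are fixed in closed form and the eigenvalues follow from an FFT
# of the first row -- this is what eigenmode decoupling exploits.
eigenvalues = np.fft.fft(first_row)
F = np.fft.fft(np.eye(len(first_row)), axis=0) / np.sqrt(len(first_row))
print(np.allclose(F.conj().T @ C @ F, np.diag(eigenvalues)))  # True
```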

Relevance:

20.00%

Publisher:

Abstract:

Secret-sharing schemes describe methods to securely share a secret among a group of participants. A properly constructed secret-sharing scheme guarantees that the share belonging to one participant reveals nothing about the shares of others, or about the secret itself. Besides their obvious use for distributing a secret, secret-sharing schemes have also been used in secure multi-party computation and in redundant residue number systems for error-correcting codes. In this paper, we propose that a secret-sharing scheme be used as a primitive in a Network-based Intrusion Detection System (NIDS) to detect attacks in encrypted networks. Encrypted networks such as Virtual Private Networks (VPNs) fully encrypt network traffic, which can include both malicious and non-malicious traffic. Traditional NIDS cannot monitor encrypted traffic. Our work uses a combination of Shamir's secret-sharing scheme and randomised network proxies to enable a traditional NIDS to function normally in a VPN environment. We introduce a novel protocol that utilises a secret-sharing scheme to detect attacks in encrypted networks.
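
For readers unfamiliar with the primitive, here is a textbook sketch of Shamir's (t, n) secret sharing over a prime field; it shows only the split and reconstruct steps and is not the NIDS protocol proposed in the paper. The prime and the example secret are arbitrary choices for illustration.

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime large enough for toy secrets

def split(secret, n_shares, threshold):
    """Shamir (t, n) sharing: sample a random degree-(t-1) polynomial with the
    secret as the constant term and hand out points on it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from >= t shares."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % PRIME
                den = den * (xj - xm) % PRIME
        secret = (secret + yj * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = split(secret=123456789, n_shares=5, threshold=3)
print(reconstruct(shares[:3]) == 123456789)   # any 3 of the 5 shares suffice
```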

Relevance:

20.00%

Publisher:

Abstract:

Ad hoc networks are vulnerable to attacks due to their distributed nature and lack of infrastructure. Intrusion detection systems (IDS) provide audit and monitoring capabilities that offer local security to a node and help to assess the trust level of other nodes. Clustering protocols can be taken as an additional advantage in these processing-constrained networks to collaboratively detect intrusions with less power usage and minimal overhead. Existing clustering protocols are not suitable for intrusion detection purposes because they are tied to the routes: route establishment and route renewal affect the clusters and, as a consequence, the processing and traffic overhead increases due to the instability of clusters. Ad hoc networks are battery- and power-constrained, and therefore a trusted monitoring node should be available to detect and respond to intrusions in time. This can be achieved only if the clusters are stable for a long period of time. If the clusters change regularly with the routes, intrusion detection will not be effective. Therefore, a generalized clustering algorithm is proposed that can run on top of any routing protocol and can monitor for intrusions constantly, irrespective of the routes. The proposed simplified clustering scheme has been used to detect intrusions, resulting in high detection rates and low processing and memory overhead irrespective of the routes, connections, traffic types and mobility of nodes in the network. Clustering is also useful for detecting intrusions collaboratively, since an individual node can neither detect a malicious node alone nor take action against that node on its own.
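
The paper's algorithm is not reproduced here, but a route-independent clustering step can be sketched as follows: each node elects a head from its one-hop neighbourhood using only local state (residual battery, node id), so cluster membership does not move when routes change. The election rule, the topology and the battery values are illustrative assumptions; a real protocol would add join and maintenance phases so every member attaches to an actual head.

```python
def elect_cluster_heads(neighbours, battery):
    """Route-independent clustering sketch: a node becomes a cluster head if it
    has the highest residual battery in its one-hop neighbourhood (ties broken
    by lower node id). Membership depends only on the neighbour graph, not on
    any active routes, so route changes do not reshuffle the clusters."""
    def preferred_head(node):
        candidates = [node] + list(neighbours[node])
        return max(candidates, key=lambda n: (battery[n], -n))

    heads = {node for node in neighbours if preferred_head(node) == node}
    membership = {node: preferred_head(node) for node in neighbours}
    return heads, membership

# Toy topology (adjacency lists) and assumed residual battery levels.
topology = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}
battery = {0: 0.9, 1: 0.7, 2: 0.4, 3: 0.8, 4: 0.6}
print(elect_cluster_heads(topology, battery))
```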

Relevance:

20.00%

Publisher:

Abstract:

Mobile ad hoc networks (MANETs) are temporary wireless networks useful in emergency rescue services, battlefield operations, mobile conferencing and a variety of other applications. Due to their dynamic nature and lack of centralized monitoring points, these networks are highly vulnerable to attacks. Intrusion detection systems (IDS) provide audit and monitoring capabilities that offer local security to a node and help to assess the trust level of other nodes. We take advantage of the clustering concept in MANETs for effective communication between nodes, where each cluster involves a number of member nodes and is managed by a cluster-head. Clustering can also be exploited in these battery- and memory-constrained networks for the purpose of intrusion detection, by separating tasks between the head and member nodes while providing an opportunity for a collaborative detection approach. Clustering schemes are generally used for routing purposes to enhance route efficiency; however, a change of cluster tends to change the route, which degrades performance. This paper presents a low-overhead clustering algorithm aimed at detecting intrusions rather than at efficient routing. It also discusses intrusion detection techniques built on this simplified clustering scheme.