60 results for Router ottico, Click, Reti ottiche, linux


Relevance: 10.00%

Abstract:

Aided by the development of information technology, the balance of power in the marketplace is rapidly shifting from marketers towards consumers, and nowhere is this more obvious than in the online environment (Denegri-Knott, Zwick, & Schroeder, 2006; Moynagh & Worsley, 2002; Newcomer, 2000; Samli, 2001). Since the inception and continuous development of the Internet, consumers have become more empowered. They can choose what to click on, shop and transact payments, watch and download video, and chat with others, whether friends or total strangers. In online communities especially, like-minded consumers share and exchange information, ideas and opinions. One form of online community is the online brand community, which gathers lovers of a specific brand. As in any social unit, people assume different roles in the community and exert different effects on each other. Their online interaction can greatly influence the brand and its marketers. A comprehensive understanding of how this special group form operates is essential to advancing marketing thought and practice (Kozinets, 1999). While online communities have strongly shifted the balance of power from marketers to consumers, the current marketing literature is sparse on power theory (Merlo, Whitwell, & Lukas, 2004). Some studies have been conducted from an economic point of view (Smith, 1987); however, their application to marketing has been limited. Denegri-Knott (2006) explored power through the struggle between consumers and marketers online and identified forms of consumer power such as control over the relationship, information, aggregation and participation. Her study has built a foundation for future power studies in the online environment. This research project bridges the limited marketing literature on power theory with the growing recognition of online communities among marketing academics and practitioners.
Specifically, this study extends and redefines consumer power by exploring the concept of power in online brand communities, in order to better understand power structure and distribution in this context. It investigates the applicability of the factors of consumer power identified by Denegri-Knott (2006) to the online brand community. In addition, acknowledging the model proposed by McAlexander, Schouten, & Koenig (2002), which emphasised that community study should focus on the role of consumers and identify the multiple relationships within the community, this research further explores how changes in member roles affect power relationships as well as consumers' liking of the brand. As a further extension to the literature, this study also considers cultural differences and their effect on community member roles and power structure. Based on the work of Hofstede (1980), Australia and China were chosen as two distinct samples representing differences in two cultural dimensions, namely individualism versus collectivism and high versus low power distance. This contribution also helps address the research gap identified by Muñiz Jr & O'Guinn (2001), who pointed out the lack of cross-cultural studies within the online brand community context. This research adopts a case study methodology to investigate the issues identified above. The case study is an appropriate research strategy for answering "how" and "why" questions about a contemporary phenomenon in a real-life context (Yin, 2003). The online brand communities of "Haloforum.net" in Australia and "NGA.cn" in China were selected as the two cases. In-depth interviews were the primary data collection method. Owing to geographical dispersion and the preferences of a number of participants, online synchronous interviews via MSN Messenger were used alongside face-to-face interviews.
As a supplementary approach, online observation was carried out over two months, covering a two-week period prior to the interviews and a six-week period following them. Triangulation techniques were used to strengthen the credibility and validity of the research findings (Yin, 2003). The findings suggest a new definition of power in an online brand community. The research also redefines the consumer power types and broadens the brand community model developed by McAlexander et al. (2002) in an online context by extending the various relationships between brand and members. This presents a more complete picture of how perceived power relationships are structured in the online brand community. A new member role is discovered in the Australian online brand community in addition to the four member roles identified by Kozinets (1999); by contrast, not all four roles exist in the Chinese online brand community. The research proposes a model linking the defined power types and the identified member roles. Furthermore, because the cross-cultural comparison between Australia and China revealed certain discrepancies, the research suggests that power studies in the online brand community should be country-specific. This research contributes to the body of knowledge on online consumer power by applying it to the context of an online brand community and by considering factors such as cross-cultural difference. Importantly, it provides insights for marketing practitioners on how best to leverage consumer power to serve brand objectives in online brand communities. This, in turn, should lead to more cost-effective and successful communication strategies. Finally, the study proposes future research directions.
The research should be extended to communities of different sizes, to different degrees of marketer control over the community, to the connection between online and offline activities within the brand community, and (given the cross-cultural findings) to different countries. In addition, further research in this area is recommended to determine the generalizability of this study.

Relevance: 10.00%

Abstract:

Key topics: Since the birth of the Open Source movement in the mid-1980s, open source software has become more and more widespread. Amongst others, the Linux operating system, the Apache web server and the Firefox web browser have taken substantial market share from their proprietary competitors. Open source software is governed by particular types of licenses. Whereas proprietary licenses only allow the software's use in exchange for a fee, open source licenses grant users more rights, such as free use, free copying, free modification and free distribution of the software, as well as free access to the source code. This new phenomenon has raised many managerial questions: organizational issues related to the system of governance underlying such open source communities (Raymond, 1999a; Lerner and Tirole, 2002; Lee and Cole, 2003; Mockus et al., 2000; Tuomi, 2000; Demil and Lecocq, 2006; O'Mahony and Ferraro, 2007; Fleming and Waguespack, 2007); collaborative innovation issues (Von Hippel, 2003; Von Krogh et al., 2003; Von Hippel and Von Krogh, 2003; Dahlander, 2005; Osterloh, 2007; David, 2008); issues related to the nature and motivations of developers (Lerner and Tirole, 2002; Hertel, 2003; Dahlander and McKelvey, 2005; Jeppesen and Frederiksen, 2006); public policy and innovation issues (Jullien and Zimmermann, 2005; Lee, 2006); technological competition issues related to standard battles between proprietary and open source software (Bonaccorsi and Rossi, 2003; Bonaccorsi et al., 2004; Economides and Katsamakas, 2005; Chen, 2007); and intellectual property rights and licensing issues (Laat, 2005; Lerner and Tirole, 2005; Gambardella, 2006; Determann et al., 2007). A major unresolved issue concerns open source business models and revenue capture, given that open source licenses imply no fee for users.
On this topic, articles show that commercial activity based on open source software is possible, and they describe different possible ways of doing business around open source (Raymond, 1999; Dahlander, 2004; Daffara, 2007; Bonaccorsi and Merito, 2007). These studies usually look at open source-based companies, which encompass a wide range of firms with different categories of activity: providers of packaged open source solutions, IT Services & Software Engineering firms, and open source software publishers. However, the business model implications differ for each of these categories: the activities of providers of packaged solutions and IT Services & Software Engineering firms are based on software developed outside their boundaries, whereas commercial software publishers sponsor the development of the open source software. This paper focuses on open source software publishers' business models, as the issue is even more crucial for this category of firms, which take the risk of investing in the development of the software. The literature to date identifies and depicts only two generic types of business models for open source software publishers: the ''bundling'' business model (Pal and Madanmohan, 2002; Dahlander, 2004) and the dual licensing business model (Välimäki, 2003; Comino and Manenti, 2007). Nevertheless, these business models are not applicable in all circumstances. Methodology: The objectives of this paper are (1) to explore the contexts in which the two generic business models described in the literature can be implemented successfully, and (2) to depict an additional business model for open source software publishers that can be used in a different context. To do so, this paper draws upon an exploratory case study of IdealX, a French open source security software publisher. The case study consists of a series of three interviews conducted between February 2005 and April 2006 with the co-founder and the business manager.
It aims to depict the process of IdealX's search for an appropriate business model between its creation in 2000 and 2006. This software publisher tried both generic types of open source software publishers' business models before designing its own. Consequently, through IdealX's trials and errors, I investigate the conditions under which such generic business models can be effective. Moreover, this study describes the business model finally designed and adopted by IdealX: an additional open source software publisher's business model based on the principle of ''mutualisation'', which is applicable in a different context. Results and implications: Finally, this article contributes to ongoing empirical work within entrepreneurship and strategic management on open source software publishers' business models: it provides the characteristics of three generic business models (the bundling business model, the dual licensing business model and the mutualisation business model) as well as the conditions under which they can be successfully implemented (regarding the type of product developed and the competencies of the firm). This paper also goes beyond the traditional concept of business model used by scholars in the open source literature. In this article, a business model is considered not only as a way of generating income (a ''revenue model'' (Amit and Zott, 2001)), but rather as the necessary conjunction of value creation and value capture, in line with the recent literature on business models (Amit and Zott, 2001; Chesbrough and Rosenblum, 2002; Teece, 2007). Consequently, this paper analyses the business models from the standpoint of these two components.

Relevance: 10.00%

Abstract:

In this paper, we use time series analysis to evaluate predictive scenarios using search engine transactional logs. Our goal is to develop models for the analysis of searchers' behaviors over time and to investigate whether time series analysis is a valid method for predicting relationships between searcher actions. Time series analysis is a method often used to understand the underlying characteristics of temporal data in order to make forecasts. In this study, we used a Web search engine transactional log and time series analysis to investigate users' actions. We conducted our analysis in two phases. In the initial phase, we employed a basic analysis and found that 10% of searchers clicked on sponsored links. However, from 22:00 to 24:00, searchers almost exclusively clicked on the organic links, with almost no clicks on sponsored links. In the second and more extensive phase, we used a one-step prediction time series analysis method along with a transfer function method. The time period rarely affects navigational queries, while rates for transactional queries vary across periods. Our results show that the average length of a searcher session is approximately 2.9 interactions and that this average is consistent across time periods. Most importantly, our findings show that searchers who submit the shortest queries (in number of terms) click on the highest-ranked results. We discuss implications, including predictive value, and future research.
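One-step-ahead prediction of the kind described above can be illustrated with a simple autoregressive model. The sketch below fits an AR(1) model to a toy series of hourly sponsored-link click-through rates by ordinary least squares; the data and the model order are illustrative assumptions, not the paper's actual transfer-function setup.

```python
def ar1_one_step(series):
    """Fit y_t = a + b * y_{t-1} by ordinary least squares and
    return the one-step-ahead forecast for the next observation."""
    x, y = series[:-1], series[1:]          # lagged and current values
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # OLS slope and intercept of y_t against y_{t-1}
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a + b * series[-1]

# Hypothetical hourly sponsored-link click-through rates
ctr = [0.10, 0.11, 0.10, 0.12, 0.11, 0.13, 0.12]
forecast = ar1_one_step(ctr)
```

A production analysis would use a richer model (e.g. ARIMA with exogenous transfer-function inputs), but the one-step logic is the same: fit on the observed history, then extrapolate a single step.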

Relevance: 10.00%

Abstract:

Introduction: The Google Online Marketing Challenge is a global competition in which student teams run advertising campaigns for small and medium-sized businesses (SMEs) using AdWords, Google's text-based advertisements. In 2008, its inaugural year, over 8,000 students and 300 instructors from 47 countries, representing over 200 schools, participated. The Challenge ran in undergraduate and graduate classes in disciplines such as marketing, tourism, advertising, communication and information systems. Combining advertising and education, the Challenge gives students hands-on experience in the increasingly important field of online marketing, engages them with local businesses and motivates them through the thrill of a global competition. Student teams receive US$200 in credit for AdWords, Google's premier advertising product offering cost-per-click advertisements. The teams then recruit and work with a local business to devise an effective online marketing campaign. Students first outline a strategy, run a series of campaigns, and provide their business with recommendations to improve its online marketing. Teams submit two written reports for judging by 14 academics in eight countries. In addition, Google AdWords experts judge teams on campaign statistics such as success metrics and account management. Rather than a marketing simulation against a computer or hypothetical marketing plans for hypothetical businesses, the Challenge has student teams develop and manage real online advertising campaigns for their clients and compete against peers globally.

Relevance: 10.00%

Abstract:

Recommender systems are one of the effective tools for dealing with the information overload issue. Like explicit ratings and other implicit behaviours such as purchases, click streams and browsing history, tagging information reflects users' personal interests and preferences, which can be used to recommend personalized items to users. This paper explores how to utilize tagging information to make personalized recommendations. Based on the distinctive three-dimensional relationships among users, tags and items, a new user profiling and similarity measure method is proposed. The experiments suggest that the proposed approach outperforms traditional collaborative filtering recommender systems that use only rating data.
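The user-tag-item relationship can be sketched with a toy tag-based neighbourhood method: each user's profile is a vector of tag frequencies, similarity is the cosine of these vectors, and items liked by the most similar users are recommended. All names and data below are hypothetical, and the paper's actual profiling and similarity measure may differ; this only illustrates the general idea of tag-based profiling.

```python
from collections import Counter
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two sparse tag-count vectors."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    norm = sqrt(sum(c * c for c in u.values())) * sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

def recommend(target, profiles, liked_items, k=1):
    """Recommend items liked by the k most tag-similar users that
    the target user has not interacted with yet."""
    sims = sorted(((cosine(profiles[target], profiles[u]), u)
                   for u in profiles if u != target), reverse=True)
    seen, recs = liked_items[target], []
    for _, u in sims[:k]:
        recs.extend(i for i in liked_items[u] if i not in seen and i not in recs)
    return recs

# Hypothetical user -> tag-frequency profiles and liked item sets
profiles = {
    "alice": Counter({"linux": 3, "router": 2}),
    "bob":   Counter({"linux": 2, "router": 1, "optics": 1}),
    "carol": Counter({"cooking": 4}),
}
liked_items = {"alice": {"itemA"}, "bob": {"itemA", "itemB"}, "carol": {"itemC"}}
```

Here "alice" is most similar to "bob" through their shared tags, so she would be recommended the item bob liked that she has not yet seen.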

Relevance: 10.00%

Abstract:

Buffer overflow vulnerabilities continue to prevail, and the sophistication of attacks targeting these vulnerabilities is continuously increasing. As a successful attack of this type has the potential to completely compromise the integrity of the targeted host, early detection is vital. This thesis examines generic approaches for detecting executable payload attacks, without prior knowledge of the implementation of the attack, in such a way that new and previously unseen attacks are detectable. Executable payloads are analysed in detail for attacks targeting the Linux and Windows operating systems executing on the Intel IA-32 architecture. The execution flow of attack payloads is analysed and a generic model of execution is examined. A novel classification scheme for executable attack payloads is presented which allows for the characterisation of executable payloads and facilitates vulnerability and threat assessments, as well as intrusion detection capability assessments for intrusion detection systems. An intrusion detection capability assessment may be utilised to determine whether a deployed system is able to detect a specific attack and to identify requirements for intrusion detection functionality in the development of new detection methods. Two novel detection methods are presented that are capable of detecting new and previously unseen executable attack payloads. The detection methods identify and enumerate the executable payload's interactions with the operating system on the targeted host at the time of compromise. The detection methods are further validated using real-world data, including executable payload attacks.

Relevance: 10.00%

Abstract:

This research used the Queensland Police Service, Australia, as a major case study. Information on the principles, techniques and processes used, and the reasons for the recording, storing and release of audit information for evidentiary purposes, is reported. It is shown that law enforcement agencies have a two-fold interest in, and legal obligation pertaining to, audit trails. The first interest relates to the situation where audit trails are actually used by criminals in the commission of crime, and the second to where audit trails are generated by the information systems used by the police themselves in support of the recording and investigation of crime. Eleven court cases involving Queensland Police Service audit trails used in evidence in Queensland courts were selected for further analysis. It is shown that, of the cases studied, none of the evidence presented was rejected or seriously challenged from a technical perspective. These results were further analysed and related to normal requirements for the trusted maintenance of audit trail information in sensitive environments, with discussion of the ability and/or willingness of courts to fully challenge, assess or value audit evidence presented. Managerial and technical frameworks are proposed for, firstly, what may be considered an environment in which a computer system operates "properly" and, secondly, what education, training, qualifications, expertise and the like may be considered appropriate for the persons responsible within that environment. Analysis was undertaken to determine whether audit and control of information in a high-security environment, such as law enforcement, could be judged to have improved, or not, in the transition from manual to electronic processes.
Information collection, control of processing and audit in the manual processes used by the Queensland Police Service, Australia, in the period 1940 to 1980 were assessed against the current electronic systems introduced to policing in the 1980s and 1990s. Results show that electronic systems do provide faster communications, with centrally controlled and updated information readily available to large numbers of users connected across significant geographical distances. However, it is clearly evident that the price paid for this is a lack of ability, and/or a reluctance, to provide improved audit and control processes. To compare the information systems audit and control arrangements of the Queensland Police Service with those of other government departments and agencies, an Australia-wide survey was conducted. The results were contrasted with those of a survey conducted by the Australian Commonwealth Privacy Commission four years prior to this survey, which showed that security in relation to the recording of activity against access to information held on Australian government computer systems had been poor and a cause for concern. However, within this four-year period there is evidence to suggest that government organisations have become increasingly inclined to generate audit trails. An attack on the overall security of audit trails in computer operating systems was initiated to further investigate findings reported in relation to the government systems survey. The survey showed that information systems audit trails in Microsoft Corporation's "Windows" operating system environments are relied on quite heavily. An audit of the security of audit trails generated, stored and managed in the Microsoft "Windows 2000" operating system environment was undertaken and compared and contrasted with similar audit trail schemes in the "UNIX" and "Linux" operating systems.
The strength of passwords and the exploitation of security problems in access control were targeted using software tools freely available in the public domain. Results showed that such security for the "Windows 2000" system is seriously flawed and that the integrity of audit trails stored within these environments cannot be relied upon. A framework and set of guidelines for use by expert witnesses in the information technology (IT) profession are proposed. This is achieved by examining the current rules and guidelines related to the provision of expert evidence in a court environment, by analysing the rationale for the separation of distinct disciplines and corresponding bodies of knowledge used by the medical profession and forensic science, and then by analysing the bodies of knowledge within the discipline of IT itself. It is demonstrated that the accepted processes and procedures relevant to expert witnessing in a court environment are transferable to the IT sector. However, unlike some discipline areas, this analysis has clearly identified two distinct aspects of the matter which appear particularly relevant to IT: expertise gained through the application of IT to information needs in a particular public or private enterprise; and expertise gained through accepted and verifiable education, training and experience in fundamental IT products and systems.

Relevance: 10.00%

Abstract:

Landscape and road infrastructure form a very strong pairing: the former embeds the concept of accessibility, since it cannot exist without an observer; the road, in turn, finds its defining features in its relationship with the morphology on which it rests. Road infrastructures are a structural and structuring element not only of a territory, but also of a landscape. Today's mobility needs lead to rethinking and upgrading many road infrastructures: where possible, existing structures are reinforced; in several cases new alignments or route variants are adopted. Addressing the problem of preserving routes that bear witness to the material and economic culture of a society entails complex considerations that go beyond the limits of the roadbed: a route is a more complex organism than a simple transport line, since it involves a whole series of structures supporting mobility and, above all, the infrastructural corridor it generates and characterises, that is, a variable portion of territory defined both by the alignment and by the morphology of the context. The evolution of production and economic models, which today leads ever larger shares of the population to spend ever less time in their own homes, makes reflection on disused or downgraded road infrastructure an opportunity to design spaces for collective living set in landscape contexts, both urban and rural, through networks of routes conceived to absorb specific and distinctive segments of mobility. Starting from these reflections, the thesis is organised as follows. Identification of the theoretical and practical context: the study highlights how the question of road infrastructure and its relationship with the landscape involves intersecting reflections at different levels and across different disciplines.
The definition of the physical space of the road in fact passes through the construction of an itinerary, a journey that rests on physical as well as symbolic elements. The route is a complex organism that goes beyond its own roadbed to involve a broad portion of territory, a corridor that varies and articulates according to the landscape it crosses. The study proposes several keys of interpretation, highlighting the possible declinations of the theme according to the transport mode, the relationship with the context, the legal regime, and the urban-planning and social implications. Soft mobility is identified as a possible way of reusing, protecting and recovering the diffuse heritage constituted by the various road networks. Anthology of case studies: the main body of the study is based on the collection, analysis and study of the state of the art in the field. The examples collected are presented in two sections: the first is devoted to the most significant and articulated experiences, which address the recovery of road infrastructure at several levels and in an advanced way, not concentrating solely on converting the roadbed but proposing a project involving the entire corridor crossed by the infrastructure; the second part illustrates current practice in the various national contexts, highlighting similarities and differences among the approaches.

Relevance: 10.00%

Abstract:

Forces of demand and supply are changing the dynamics of the higher education market. The transformation of institutions of higher learning into competitive enterprises is underway. Higher education institutions are seemingly under intense pressure to create value and to focus their efforts and scarce funds on activities that drive up value for their respective customers and other stakeholders. Porter's generic 'value chain' model for creating value requires that the activities of an organization be segregated into discrete components for value chain analysis to be performed. Recent trends in higher education make such segregation possible. It is therefore proposed that the academic process can be unbundled into discrete components with well-developed measures. A reconfigured value chain for higher education, with its own value drivers and critical internal linkages, is also proposed in this paper.

Relevance: 10.00%

Abstract:

In this paper, an enriched radial point interpolation method (e-RPIM) is developed for the determination of crack tip fields. In e-RPIM, the conventional RBF interpolation is augmented with suitable trigonometric basis functions to reflect the properties of the stresses in the crack tip fields. The performance of the enriched RBF meshfree shape functions is first investigated by fitting different surfaces. The surface fitting results prove that, compared with the conventional RBF shape function, the enriched RBF shape function has: (1) similar accuracy in fitting a polynomial surface; (2) much better accuracy in fitting a trigonometric surface; and (3) similar interpolation stability, without increasing the condition number of the RBF interpolation matrix. The enriched RBF shape function therefore not only possesses all the advantages of the conventional RBF shape function, but can also accurately reflect the properties of the stresses in the crack tip fields. The system of equations for the crack analysis is then derived based on the enriched RBF meshfree shape function and the meshfree weak form. Several problems of linear fracture mechanics are simulated using the newly developed e-RPIM method. It is demonstrated that the present e-RPIM is very accurate and stable, and that it has good potential for development into a practical simulation tool for fracture mechanics problems.
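The idea of enriching an RBF basis with trigonometric terms can be sketched in one dimension: interpolate nodal values with Gaussian RBFs, optionally swapping one RBF for a sin(x) enrichment function, so that a trigonometric target is captured exactly. This is a toy illustration of the enrichment principle only; the paper's e-RPIM uses crack-tip-specific trigonometric terms and a meshfree weak form, and all choices below (Gaussian kernel, shape parameter, node layout) are assumptions.

```python
from math import exp, sin

def solve(A, b):
    """Solve the square system A x = b by Gaussian elimination
    with partial pivoting (no external libraries)."""
    n = len(A)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            M[r] = [a_rc - f * a_pc for a_rc, a_pc in zip(M[r], M[col])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit(xs, ys, enrich=False, c=1.0):
    """Interpolate (xs, ys) with Gaussian RBFs centred at the nodes;
    if enrich is True, the last RBF is replaced by a sin(x) term,
    so a target with that character is captured exactly."""
    def basis(x):
        phis = [exp(-((x - xc) / c) ** 2) for xc in xs]
        if enrich:
            phis[-1] = sin(x)
        return phis
    w = solve([basis(x) for x in xs], ys)
    return lambda x: sum(wi * p for wi, p in zip(w, basis(x)))

# Samples of sin(x): the enriched basis reproduces it exactly between nodes
xs = [0.0, 1.0, 2.0, 3.0]
ys = [sin(x) for x in xs]
f_enriched = fit(xs, ys, enrich=True)
```

Because sin(x) lies in the enriched basis, the square interpolation system has the exact weight vector (0, 0, 0, 1) as its unique solution, so the enriched interpolant matches sin(x) everywhere, mirroring point (2) of the surface-fitting results.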

Relevance: 10.00%

Abstract:

This paper presents a multiscale study using the coupled meshless technique/molecular dynamics method (M2) to explore the deformation mechanism of mono-crystalline metal (with a focus on copper) under uniaxial tension. In M2, an advanced transition algorithm using transition particles is employed to ensure the compatibility of both displacements and their gradients, and an effective local quasi-continuum approach is applied to obtain the equivalent continuum strain energy density based on the atomistic potentials and the Cauchy-Born rule. The key parameters used in M2 are first investigated using a benchmark problem. M2 is then applied to the multiscale simulation of a mono-crystalline copper bar. It is found that mono-crystalline copper has very good elongation properties, and that its ultimate strength and Young's modulus are much higher than those obtained at the macro-scale.
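The Cauchy-Born rule mentioned above maps a uniform continuum deformation onto the atomic lattice and evaluates the strain energy density directly from the atomistic potential. The sketch below does this for a 1D chain with a Lennard-Jones pair potential; the potential, its unit parameters and the 1D setting are illustrative assumptions only (the paper models copper, which in practice would use an EAM-type potential in 3D).

```python
def lj(r, eps=1.0, sigma=1.0):
    """Lennard-Jones pair potential (illustrative unit parameters)."""
    s6 = (sigma / r) ** 6
    return 4.0 * eps * (s6 * s6 - s6)

def cauchy_born_energy_density(stretch, a=1.0, neighbours=5):
    """Strain energy density of a 1D atomic chain under a uniform
    stretch F: by the Cauchy-Born rule, every reference neighbour
    distance n*a deforms affinely to F*n*a, and the resulting site
    energy is divided by the reference cell length a."""
    site_energy = sum(lj(stretch * n * a) for n in range(1, neighbours + 1))
    return site_energy / a
```

With these parameters the chain's energy density has its minimum near a stretch of about 1.12 (the Lennard-Jones equilibrium spacing), so the density rises on either side of that stretch, which is the continuum response a quasi-continuum region would feed back to the coupled model.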

Relevance: 10.00%

Abstract:

A bioassay technique, based on surface-enhanced Raman scattering (SERS)-tagged gold nanoparticles encapsulated with a biotin-functionalised polymer, has been demonstrated through the spectroscopic detection of a streptavidin binding event. A methodical series of steps preceded these results: synthesis of nanoparticles found to give a reproducible SERS signal; and design and synthesis of polymers with RAFT-functional end groups able to encapsulate the gold nanoparticle. The polymer also enabled the attachment of a biotin molecule functionalised so that it could be attached to the hybrid nanoparticle through a modular process. Finally, a positive bioassay was demonstrated for this model construct using streptavidin/biotin binding. The synthesis of silver and gold nanoparticles was performed using trisodium citrate as the reducing agent. The shape of the silver nanoparticles was quite difficult to control. Gold nanoparticles could be prepared in more regular (spherical) shapes and therefore gave a more consistent and reproducible SERS signal. The synthesis of gold nanoparticles with a diameter of 30 nm was the most reproducible, and these were also stable over the longest periods of time. From the SERS results, the optimal size of the gold nanoparticles was found to be approximately 30 nm. Obtaining a consistent SERS signal with nanoparticles smaller than this was particularly difficult. Nanoparticles more than 50 nm in diameter were too large to remain suspended for longer than a day or two and formed a precipitate, rendering the solutions useless for our desired application. Gold nanoparticles dispersed in water could be stabilised by the addition of as-synthesised polymers dissolved in a water-miscible solvent. Polymer-stabilised AuNPs could not be formed from polymers synthesised by conventional free radical polymerization, i.e. polymers that did not possess a sulphur-containing end-group.
This indicated that the sulphur-containing functionality present within the polymers was essential for the self-assembly process to occur. Polymer stabilisation of the gold colloid was evidenced by a range of techniques including visible spectroscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, thermogravimetric analysis and Raman spectroscopy. After treatment of the hybrid nanoparticles with a series of SERS tags, focussing on 2-quinolinethiol, the SERS signals were found to have intensity comparable to that of the citrate-stabilised gold nanoparticles. This finding illustrates that the stabilisation process does not interfere with the ability of the gold nanoparticles to act as substrates for the SERS effect. Incorporation of a biotin moiety into the hybrid nanoparticles was achieved through a 'click' reaction between an alkyne-functionalised polymer and an azido-functionalised biotin analogue. This functionalised biotin was prepared through a four-step synthesis from biotin. Upon exposure of the surface-bound streptavidin to biotin-functionalised polymer hybrid gold nanoparticles, followed by washing, a SERS signal was obtained from the 2-quinolinethiol attached to the gold nanoparticles (positive assay). After exposure to functionalised polymer hybrid gold nanoparticles without biotin present, followed by washing, no SERS signal was obtained, as the nanoparticles did not bind to the streptavidin (negative assay). These results illustrate the applicability of SERS-active, functional-polymer-encapsulated gold nanoparticles for bioassay applications.

Relevance: 10.00%

Abstract:

The aim of this work is to develop a Demand-Side Response (DSR) model which assists electricity end-users to engage in mitigating peak demands on the electricity network in eastern and southern Australia. The proposed model will comprise a technical set-up of a programmable internet relay, a router and solid-state switches, in addition to suitable software to control electricity demand at the user's premises. The software, delivered on an appropriate multimedia tool (CD-ROM), will curtail or shift electric loads to the most appropriate time of day, following the implemented economic model, which is designed to maximise financial benefits to electricity consumers. Additionally, the model aims to spread the national electrical load evenly throughout the year in order to achieve the best economic performance for electricity generation, transmission and distribution. The model is applicable in the region managed by the Australian Energy Market Operator (AEMO), covering the eastern and southern Australian states and Tasmania.
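The load curtailing/shifting logic described above can be sketched as a simple greedy scheduler: each deferrable load is moved to the hour of lowest demand, flattening the profile without exceeding network capacity. The data, the one-hour-block simplification and the greedy strategy are assumptions for illustration; the actual model would optimise against tariff and market signals rather than demand alone.

```python
def schedule_loads(base_demand, deferrable, capacity):
    """Greedy sketch of demand-side response: place each deferrable
    load (kW, modelled as a one-hour block) into the hour with the
    lowest current demand, without exceeding the network capacity."""
    demand = list(base_demand)
    plan = []
    for load in sorted(deferrable, reverse=True):  # largest loads first
        hour = min(range(len(demand)), key=lambda h: demand[h])
        if demand[hour] + load > capacity:
            raise ValueError("no hour can absorb this load")
        demand[hour] += load
        plan.append((hour, load))
    return demand, plan

# Hypothetical 4-hour base profile (kW) and two deferrable loads
demand, plan = schedule_loads([5.0, 9.0, 4.0, 8.0], [3.0, 2.0], capacity=10.0)
```

In this toy run, both deferrable loads land in off-peak hours, so the peak of the profile is unchanged while the valleys are filled, which is the even-spreading behaviour the model targets.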

Relevance: 10.00%

Abstract:

Stimulated by the efficacy of the copper(I)-catalysed Huisgen-type 1,3-dipolar cycloaddition of terminal alkynes and organic azides to generate 1,4-disubstituted 1,2,3-triazole derivatives, the importance of 'click' chemistry in the synthesis of organic and biological molecular systems is ever increasing.[1] The mild reaction conditions have also led to this reaction gaining favour in the construction of interlocked molecular architectures.[2-4] In the majority of cases, however, the triazole group simply serves as a covalent linkage with no function in the resulting organic molecular framework. More recently, a renewed interest has been shown in the transition metal coordination chemistry of triazole ligands.[3, 5, 6] In addition, novel aryl macrocyclic and acyclic triazole-based oligomers have been shown to recognise halide anions via cooperative triazole C5–H···anion hydrogen bonds.[7] In light of this, it is surprising that the potential anion-binding affinity of the positively charged triazolium motif has not, with one notable exception,[8] been investigated. With the objective of manipulating the unique topological cavities of mechanically bonded molecules for anion recognition purposes, we have developed general methods of using anions to template the formation of interpenetrated and interlocked structures.[9-13] Herein we report the first examples of exploiting the 1,2,3-triazolium group in the anion-templated formation of pseudorotaxane and rotaxane assemblies. In an unprecedented discovery, the bromide anion is shown to be a superior templating reagent to chloride in the synthesis of a novel [2]rotaxane containing a triazolium axle. Furthermore, the resulting interlocked rotaxane host system exhibits a rare selectivity preference for bromide over chloride...