65 results for cache-based mechanism

at Queensland University of Technology - ePrints Archive


Relevance:

80.00%

Publisher:

Abstract:

According to a study conducted by the International Maritime Organization (IMO), the shipping sector is responsible for 3.3% of global Greenhouse Gas (GHG) emissions. The 1997 Kyoto Protocol calls upon states to pursue limitation or reduction of GHG emissions from marine bunker fuels working through the IMO. In 2011, 14 years after the adoption of the Kyoto Protocol, the Marine Environment Protection Committee (MEPC) of the IMO adopted mandatory energy efficiency measures for international shipping, which can be regarded as the first mandatory global GHG reduction instrument for an international industry. The MEPC approved an amendment to Annex VI of the 1973 International Convention for the Prevention of Pollution from Ships (MARPOL 73/78) to introduce a mandatory Energy Efficiency Design Index (EEDI) for new ships and a Ship Energy Efficiency Management Plan (SEEMP) for all ships. Considering the growth projections for the human population and world trade, these technical and operational measures may not be able to reduce GHG emissions from international shipping to a satisfactory level. The IMO is therefore considering the introduction of market-based mechanisms that may serve two purposes: providing a fiscal incentive for the maritime industry to invest in a more energy-efficient manner, and offsetting growing ship emissions. Some leading developing countries have already voiced serious reservations about the newly adopted IMO regulations, stating that by imposing the same obligation on all countries, irrespective of their economic status, the amendment rejects the Principle of Common but Differentiated Responsibility (the CBDR Principle), which has always been a cornerstone of international climate change law discourses. They also claim that negotiations for a market-based mechanism should not continue without a clear commitment from the developed countries to promote technical co-operation and the transfer of technology for improving the energy efficiency of ships. Against this backdrop, this article explores the challenges facing developing countries in implementing the already adopted technical and operational measures.
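As a rough illustration of the kind of technical measure involved, the attained EEDI is essentially grams of CO2 emitted per tonne-nautical-mile of transport work. The sketch below is a simplification (the real MEPC formula adds auxiliary engines, innovative-technology terms and many correction factors), and the ship figures are illustrative, not taken from the article:

```python
# Simplified attained-EEDI estimate (illustrative only; the real MEPC
# formula includes auxiliary engines and many correction factors).
def attained_eedi(main_power_kw, sfc_g_per_kwh, cf_co2, capacity_t, speed_knots):
    """Grams of CO2 per tonne-nautical-mile of transport work."""
    fuel_per_hour = main_power_kw * sfc_g_per_kwh      # g fuel / h
    co2_per_hour = fuel_per_hour * cf_co2              # g CO2 / h
    work_per_hour = capacity_t * speed_knots           # tonne-nm / h
    return co2_per_hour / work_per_hour

# Illustrative ship: 10 MW main engine, 190 g/kWh fuel consumption,
# heavy fuel oil carbon factor 3.114, 50,000 t capacity at 14 knots.
print(round(attained_eedi(10000, 190, 3.114, 50000, 14), 2))  # 8.45
```

Regulation then caps this attained value against a reference line that tightens in phases, which is what makes the index an emissions-reduction instrument rather than a mere reporting metric.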

Relevance:

80.00%

Publisher:

Abstract:

Many software applications extend their functionality by dynamically loading libraries into their allocated address space. However, shared libraries are often of unknown provenance and quality and may contain accidental bugs or, in some cases, deliberately malicious code. Most sandboxing techniques that address these issues require recompilation of the libraries using custom tool chains, require significant modifications to the libraries, do not retain the benefits of single address-space programming, do not completely isolate guest code, or incur substantial performance overheads. In this paper we present LibVM, a sandboxing architecture for isolating libraries within a host application without requiring any modifications to the shared libraries themselves, while still retaining the benefits of a single address space and introducing a system call interpositioning layer that allows complete arbitration over a shared library’s functionality. We show how contemporary hardware virtualization support can be utilized to this end with reasonable performance overheads; in the absence of such hardware support, our model can also be implemented using a software-based mechanism. We ensure that our implementation conforms as closely as possible to existing shared library manipulation functions, minimizing the effort needed to apply such isolation to existing programs. Our experimental results show that it is easy to gain immediate benefits in scenarios where the goal is to guard the host application against unintentional programming errors when using shared libraries, as well as in more complex scenarios where a shared library is suspected of being actively hostile. In both cases, no changes are required to the shared libraries themselves.
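As a loose software analogy of LibVM’s arbitration layer (not the paper’s implementation, which uses hardware virtualization and operates on native code), one can route every call into a dynamically loaded library through a policy check. All names and the whitelist below are illustrative:

```python
import ctypes, ctypes.util

class MediatedLibrary:
    """Illustrative call-arbitration wrapper (a software analogy, not
    LibVM): every call into the guest library passes a policy check."""

    def __init__(self, name, allowed):
        # find_library may return None, in which case CDLL(None) loads
        # the running process's own namespace (POSIX behaviour).
        self._lib = ctypes.CDLL(ctypes.util.find_library(name))
        self._allowed = set(allowed)

    def call(self, func, *args):
        if func not in self._allowed:
            raise PermissionError(f"call to {func!r} blocked by policy")
        return getattr(self._lib, func)(*args)

# Only abs() from libc is whitelisted; anything else is refused.
libc = MediatedLibrary("c", allowed={"abs"})
print(libc.call("abs", -7))  # 7
```

The real system enforces this boundary below the language runtime, so even a hostile native library cannot bypass the check the way Python code could here.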

Relevance:

40.00%

Publisher:

Abstract:

Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP) conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept for developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, due to the consideration of our knowledge about dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least in cases in which the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by application to a simple hydrological model.
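For contrast with the proposed state-space approach, a GASP-style emulator in its simplest form conditions a Gaussian process prior on a design data set of simulator runs and then interpolates. A minimal one-dimensional sketch, with a toy kernel and a toy "simulator" standing in for the real model:

```python
import numpy as np

def gp_emulate(X, y, Xs, ell=1.0, sf=1.0, nugget=1e-8):
    """Minimal GASP-style emulator: condition a squared-exponential GP
    prior on design runs (X, y) and predict outputs at new inputs Xs.
    Toy 1-D version; kernel and hyperparameters are illustrative."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return sf ** 2 * np.exp(-0.5 * (d / ell) ** 2)
    K = k(X, X) + nugget * np.eye(len(X))    # nugget for numerical stability
    return k(Xs, X) @ np.linalg.solve(K, y)  # GP posterior mean

# Design data set from a toy "simulator" f(x) = sin(x)
X = np.linspace(0.0, 6.0, 15)
pred = gp_emulate(X, np.sin(X), np.array([1.5]))
```

The closely spaced time points of a dynamic model make the matrix `K` ill-conditioned, which is exactly disadvantage (ii) that the Kalman-smoothing construction avoids.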

Relevance:

40.00%

Publisher:

Abstract:

Potent and specific enzyme inhibition is a key goal in the development of therapeutic inhibitors targeting proteolytic activity. The backbone-cyclized peptide, Sunflower Trypsin Inhibitor (SFTI-1), affords a scaffold that can be engineered to achieve both of these aims. SFTI-1's mechanism of inhibition is unusual in that it shows fast-on/slow-off kinetics driven by cleavage and religation of a scissile bond. This phenomenon was used to select a nanomolar inhibitor of kallikrein-related peptidase 7 (KLK7) from a versatile library of SFTI variants with diversity tailored to exploit distinctive surfaces present in the active site of serine proteases. Inhibitor selection was achieved by using size exclusion chromatography to separate protease/inhibitor complexes from unbound inhibitors, followed by inhibitor identification according to molecular mass as ascertained by mass spectrometry. This approach identified a single dominant inhibitor species with a molecular weight of 1562.4 Da, consistent with the SFTI variant SFTI-WCTF. Once synthesized individually, this inhibitor showed an IC50 of 173.9 ± 7.6 nM against chromogenic substrates and could block protein proteolysis. Molecular modeling analysis suggested that the selection of SFTI-WCTF was driven by specific aromatic interactions and stabilized by an enhanced internal hydrogen-bonding network. This approach provides a robust and rapid route to inhibitor selection and design.
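The identification step, matching an observed molecular mass against the computed masses of library variants, can be sketched as follows. The residue masses are standard monoisotopic values, but the sequences and tolerance are illustrative (the actual SFTI variant sequences are not given in the abstract):

```python
# Monoisotopic residue masses (Da) for a few amino acids.
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
           "T": 101.04768, "C": 103.00919, "F": 147.06841, "W": 186.07931}

WATER = 18.01056

def peptide_mass(seq, cyclic=False):
    """A linear peptide weighs the residue sum plus one water; a
    backbone-cyclised peptide (like SFTI-1) loses that water on ring
    closure."""
    m = sum(RESIDUE[a] for a in seq)
    return m if cyclic else m + WATER

def match_library(observed_da, library, tol=0.5):
    """Return library sequences whose cyclic mass matches the observed
    MS mass within tolerance (sequences here are illustrative)."""
    return [s for s in library
            if abs(peptide_mass(s, cyclic=True) - observed_da) <= tol]
```

In the study this matching singled out one dominant species (1562.4 Da) from the mixture, after which that variant was synthesized and assayed individually.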

Relevance:

40.00%

Publisher:

Abstract:

Laskowski inhibitors regulate serine proteases by an intriguing mode of action that involves deceiving the protease into synthesizing a peptide bond. Studies exploring naturally occurring Laskowski inhibitors have uncovered several structural features that convey the inhibitor's resistance to hydrolysis and exceptional binding affinity. However, in the context of Laskowski inhibitor engineering, the way that various modifications intended to fine-tune an inhibitor's potency and selectivity impact on its association and dissociation rates remains unclear. This information is important, as Laskowski inhibitors are increasingly being used as design templates to develop new protease inhibitors for pharmaceutical applications. In this study, we used the cyclic peptide sunflower trypsin inhibitor-1 (SFTI-1) as a model system to explore how the inhibitor's sequence and structure relate to its binding kinetics and function. Using enzyme assays, MD simulations and NMR spectroscopy to study SFTI variants with diverse sequence and backbone modifications, we show that the geometry of the binding loop mainly influences the inhibitor's potency by modulating the association rate, such that variants lacking a favourable conformation show dramatic losses in activity. Additionally, we show that the inhibitor's sequence (including both the binding loop and its scaffolding) influences its potency and selectivity by modulating both the association and the dissociation rates. These findings provide new insights into protease inhibitor function and design, which we apply by engineering novel inhibitors for the classical serine proteases trypsin and chymotrypsin, and for two kallikrein-related peptidases (KLK5 and KLK14) that are implicated in various cancers and skin diseases.
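At equilibrium the kinetic picture above reduces to the inhibition constant K_i = k_off / k_on: slower association (lower k_on) or faster dissociation (higher k_off) both weaken inhibition. A trivial sketch with illustrative, not measured, rate constants:

```python
def ki(k_on, k_off):
    """Equilibrium inhibition constant K_i = k_off / k_on (molar).
    Fast-on/slow-off kinetics (large k_on, small k_off) give a small
    K_i, i.e. a potent inhibitor."""
    return k_off / k_on

# Illustrative (not measured) rates: k_on in 1/(M*s), k_off in 1/s.
tight = ki(k_on=1e6, k_off=1e-3)   # ~1e-9 M: a nanomolar inhibitor
loose = ki(k_on=1e4, k_off=1e-1)   # ~1e-5 M: far weaker
```

This is why the study's finding matters for design: a modification that distorts the binding-loop geometry mainly lowers k_on, and even a large gain elsewhere cannot recover the lost potency.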

Relevance:

40.00%

Publisher:

Abstract:

Mobile applications are being deployed on a massive scale in various mobile sensor grid database systems. Given the limited resources of mobile devices, processing the huge number of queries from mobile users against distributed sensor grid databases becomes a critical problem for such mobile systems. While the fundamental semantic cache technique has been investigated for query optimization in sensor grid database systems, the problem remains difficult because more realistic multi-dimensional constraints have not been considered in existing methods. To solve this problem, a new semantic cache scheme is presented in this paper for location-dependent data queries in distributed sensor grid database systems. It considers multi-dimensional constraints or factors in a unified cost model architecture, determines the parameters of the cost model using the concept of Nash equilibrium from game theory, and makes semantic cache decisions from the established cost model. The scenarios of three factors (semantics, time and location) are investigated as special cases, which improve on existing methods. Experiments are conducted to demonstrate the effectiveness of the proposed semantic cache scheme for distributed sensor grid database systems.
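A toy version of such a unified cost model might score each cached query result by a weighted combination of semantic, temporal and spatial factors and evict the weakest entry. The weights and factor forms below are illustrative placeholders; the paper instead derives the cost-model parameters via a Nash equilibrium rather than fixing them by hand:

```python
def cache_score(semantic_sim, freshness, dist_km, w=(0.5, 0.3, 0.2)):
    """Toy unified cost model: weighted mix of semantic similarity,
    temporal freshness and spatial proximity (all illustrative)."""
    ws, wt, wd = w
    return ws * semantic_sim + wt * freshness + wd / (1.0 + dist_km)

def evict(entries):
    """Evict the cached entry with the lowest combined score.
    `entries` is a list of (key, (semantic, freshness, distance))."""
    return min(entries, key=lambda e: cache_score(*e[1]))[0]

# Entry 'q2' is semantically stale, old and far away, so it goes first.
victim = evict([("q1", (0.9, 0.8, 0.2)), ("q2", (0.2, 0.1, 50.0))])
```

Treating the weight selection as a game between the competing factors is what lets the scheme adapt the trade-off instead of committing to one fixed priority.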

Relevance:

30.00%

Publisher:

Abstract:

Banana bunchy top is regarded as the most important viral disease of banana, causing significant yield losses worldwide. The disease is caused by Banana bunchy top virus (BBTV), which is a circular ssDNA virus belonging to the genus Babuvirus in the family Nanoviridae. There are currently few effective control strategies for this and other ssDNA viruses. “In Plant Activation” (InPAct) is a novel technology being developed at QUT for ssDNA virus-activated suicide gene expression. The technology exploits the rolling circle replication mechanism of ssDNA viruses and is based on a unique “split” gene design such that suicide gene expression is only activated in the presence of the viral Rep. This PhD project aimed to develop a BBTV-based InPAct system as a suicide gene strategy to control BBTV. The BBTV-based InPAct vector design requires a BBTV intergenic region (IR) to be embedded within an intron in the gene expression cassette. To ensure that the BBTV IR would not interfere with intron splicing, a TEST vector was initially generated that contained the entire BBTV IR embedded within an intron in a β-glucuronidase (GUS) expression vector. Transient GUS assays in banana embryogenic cell suspensions indicated that cryptic intron splice sites were present within the IR. Transcript analysis revealed two cryptic intron splice sites in the Domain III sequence of the CR-M within the IR. Removal of the CR-M from the TEST vector resulted in an enhancement of GUS expression suggesting that the cryptic intron splice sites had been removed. An InPAct GUS vector was subsequently generated that contained the modified BBTV IR, with the CR-M (minus Domain III) repositioned within the InPAct cassette. Using transient histochemical and fluorometric GUS assays in banana embryogenic cells, the InPAct GUS vector was shown to be activated in the presence of the BBTV Rep. 
However, the presence of both BBTV Rep and Clink was shown to have a deleterious effect on GUS expression, suggesting that these proteins were cytotoxic at the levels expressed. Analysis of replication of the InPAct vectors by Southern hybridisation revealed low levels of InPAct cassette-based episomal DNA released from the vector through the nicking/ligation activity of BBTV Rep. However, Rep-mediated episomal replicons, indicative of rolling circle replication of the released circularised cassettes, were not observed. The inability of the InPAct cassette to be replicated was further investigated. To examine whether the absence of Domain III of the CR-M was responsible, a suite of modified BBTV-based InPAct GUS vectors was constructed that contained the CR-M with the inclusion of Domain III, the CR-M with the inclusion of Domain III and additional upstream IR sequence, or no CR-M. Analysis of replication by Southern hybridisation revealed that neither the presence of Domain III, nor the entire CR-M, had an effect on replication levels. Since the InPAct cassette was significantly larger than the native BBTV genomic components (approximately 1 kb), the effect of InPAct cassette size on replication was also investigated. A suite of size-variant BBTV-based vectors was constructed that increased the size of a replication-competent cassette from 1.1 kbp through to 2.1 kbp. Analysis of replication by Southern hybridisation revealed that an increase in vector size above approximately 1.5-1.7 kbp resulted in a decrease in replication. Following the demonstration of Rep-mediated release, circularisation and expression from the InPAct GUS vector, an InPAct vector was generated in which the uidA reporter gene was replaced with the ribonuclease-encoding suicide gene, barnase. Initially, a TEST vector was generated to assess the cytotoxicity of Barnase on banana cells.
Although transient assays revealed a Barnase-induced cytotoxic effect in banana cells, the expression levels were sub-optimal. An InPAct BARNASE vector was generated and tested for BBTV Rep-activated Barnase expression using transient assays in banana embryogenic cells. High levels of background expression from the InPAct BARNASE vector made it difficult to accurately assess Rep-activated Barnase expression. Analysis of replication by Southern hybridisation again revealed low levels of InPAct cassette-based episomal DNA released from the vector, but no Rep-mediated episomal replicons indicative of rolling circle replication of the released circularised cassettes were observed. Despite the inability of the InPAct vectors to replicate to enable high-level gene expression, the InPAct BARNASE vector was assessed in planta for BBTV Rep-mediated activation of Barnase expression. Eleven lines of transgenic InPAct BARNASE banana plants were generated by Agrobacterium-mediated transformation and were challenged with viruliferous Pentalonia nigronervosa. At least one clonal plant in each line developed bunchy top symptoms, and infection was confirmed by PCR. No localised lesions were observed on any plants, nor was there any localised GUS expression in the one InPAct GUS line challenged with viruliferous aphids. The results presented in this thesis constitute the first study towards the development of a BBTV-based InPAct system as a Rep-activatable suicide gene expression system to control BBTV. Although further optimisation of the vectors is necessary, the preliminary results suggest that this approach has the potential to be an effective control strategy for BBTV. The use of iterons within the InPAct vectors that are recognised by Reps from different ssDNA plant viruses may provide a broad-spectrum resistance strategy against multiple ssDNA plant viruses.
Further, this technology holds great promise as a platform technology for the molecular farming of high-value proteins in vitro or in vivo through expression of the ssDNA virus Rep protein.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The aim of this paper is to investigate the ways of best managing city-regions’ valuable tangible and intangible assets while pursuing a knowledge-based urban development that is sustainable and competitive.

Design/methodology/approach – The paper provides a theoretical framework to conceptualise a new strategic planning mechanism, knowledge-based strategic planning, which has emerged as a planning mechanism for the knowledge-based urban development of post-industrial city-regions.

Originality/value – The paper develops a planning framework entitled 6K1C for knowledge-based strategic planning, to be used in the analysis of city-regions’ tangible and intangible assets.

Practical implications – The paper discusses the importance of asset mapping for city-regions, and explores the ways of successfully managing city-regions’ tangible/intangible assets to achieve an urban development that is sustainable and knowledge-based.

Keywords – Knowledge-based urban development, Knowledge-based strategic planning, Tangible assets, Intangible assets, City-regions.

Paper type – Academic Research Paper

Relevance:

30.00%

Publisher:

Abstract:

The effective management of bridge stock involves making decisions as to when to repair, remedy or do nothing, taking into account the financial and service life implications. Such decisions require a reliable diagnosis of the cause of distress and an understanding of the likely future degradation. Such diagnoses are based on a combination of visual inspections, laboratory tests on samples and expert opinions. In addition, the choice of appropriate laboratory tests requires an understanding of the degradation mechanisms involved. Under these circumstances, the use of expert systems or evaluation tools developed from “real-time” case studies provides a promising solution in the absence of expert knowledge. This paper addresses the issues in bridge infrastructure management in Queensland, Australia. Bridges affected by alkali-silica reaction and chloride-induced corrosion have been investigated and the results presented using a mind-mapping tool. The analysis highlights that several levels of rules are required to assess the mechanism causing distress. The systematic development of a rule-based approach is presented. An application to a case study bridge demonstrates that the preliminary results are satisfactory.
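The several levels of rules mentioned above can be pictured as a small forward-chaining rule base that maps observed symptoms to a candidate distress mechanism, with deeper levels then selecting confirmatory tests. The symptoms and rules below are illustrative placeholders, not the paper's actual rule set:

```python
# Illustrative first-level rule base (not the paper's actual rules):
# each rule maps a set of required symptoms to a distress mechanism.
RULES = [
    ({"map_cracking", "gel_exudate"}, "alkali-silica reaction"),
    ({"rust_staining", "marine_exposure"}, "chloride-induced corrosion"),
]

def diagnose(symptoms):
    """First-level diagnosis; deeper rule levels would then choose the
    laboratory tests needed to confirm the suspected mechanism."""
    for required, mechanism in RULES:
        if required <= symptoms:          # all required symptoms observed
            return mechanism
    return "inconclusive: further testing needed"

print(diagnose({"map_cracking", "gel_exudate", "spalling"}))
```

Encoding expert diagnoses this way is what lets non-expert inspectors reach a defensible preliminary classification before committing to expensive laboratory tests.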

Relevance:

30.00%

Publisher:

Abstract:

In the global knowledge economy, knowledge-intensive industries and knowledge workers are widely seen as the primary factors in improving the welfare and competitiveness of cities. To attract and retain such industries and workers, cities produce knowledge-based urban development strategies, and such strategising is also an important development mechanism for cities and their economies. This paper investigates Brisbane’s knowledge-based urban development strategies that support the generation, attraction and retention of investment and talent. The paper provides a clear understanding of the policy frameworks and relevant applications of Brisbane’s knowledge-based urban development experience in becoming a prosperous knowledge city.

Relevance:

30.00%

Publisher:

Abstract:

Information Retrieval is an important albeit imperfect component of information technologies. The problem of insufficient diversity of retrieved documents is one of the primary issues studied in this research. This study shows that the problem leads to a decrease in precision and recall, the traditional measures of information retrieval effectiveness. This thesis presents an adaptive IR system based on the theory of adaptive dual control. The aim of the approach is to optimize retrieval precision after all feedback has been issued, which is done by increasing the diversity of retrieved documents; this study shows that the value of recall reflects this diversity. The Probability Ranking Principle is viewed in the literature as the “bedrock” of current probabilistic Information Retrieval theory. Neither the proposed approach nor other methods from the literature for diversifying retrieved documents conform to this principle. This study shows by counterexample that the Probability Ranking Principle does not in general lead to optimal precision in a search session with feedback (for which it may not have been designed, but for which it is actively used). To accomplish the aim, the retrieval precision of the search session should be optimized with a multistage stochastic programming model. However, such models are computationally intractable. Therefore, approximate linear multistage stochastic programming models are derived in this study, where the multistage improvement of the probability distribution is modelled using the proposed feedback correctness method. The proposed optimization models are based on several assumptions, starting with the assumption that Information Retrieval is conducted in units of topics, which are represented as clusters of documents. This use of clusters is the primary reason why a new method of probability estimation is proposed. The adaptive dual control of the topic-based IR system (ADTIR) was evaluated in a series of experiments conducted on the Reuters, Wikipedia and TREC collections of documents. The Wikipedia experiment revealed that the dual control feedback mechanism improves precision and S-recall when all the underlying assumptions are satisfied. In the TREC experiment, this feedback mechanism was compared to a state-of-the-art adaptive IR system based on BM-25 term weighting and the Rocchio relevance feedback algorithm. The baseline system exhibited better effectiveness than the cluster-based optimization model of ADTIR; the main reason for this was the insufficient quality of the clusters generated from the TREC collection, which violated the underlying assumption.
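S-recall, the diversity measure referred to above, rewards result lists that together cover many distinct subtopics of the query topic rather than repeating one subtopic. A minimal sketch:

```python
def s_recall(retrieved, topic_subtopics):
    """Subtopic recall: the fraction of a topic's subtopics covered by
    the retrieved documents, rewarding diversity over redundancy.
    `retrieved` is a list of per-document subtopic sets."""
    covered = set()
    for doc_subtopics in retrieved:
        covered |= doc_subtopics
    return len(covered & topic_subtopics) / len(topic_subtopics)

# Three documents that together cover 3 of the topic's 4 subtopics.
val = s_recall([{"a"}, {"a", "b"}, {"c"}], {"a", "b", "c", "d"})  # 0.75
```

Under this measure, a list of near-duplicate top documents scores no better than its single best document, which is exactly the behaviour the Probability Ranking Principle fails to penalise.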

Relevance:

30.00%

Publisher:

Abstract:

TCP is the dominant protocol for reliable communication over the Internet. It provides flow, congestion and error control mechanisms designed for wired, reliable networks. Its congestion control mechanism is not suitable for wireless links, where data corruption and loss rates are higher. The physical link is transparent to TCP, which attributes packet losses to congestion alone and initiates congestion handling mechanisms by reducing the transmission speed. This wastes the already limited bandwidth available on wireless links; there is little point in researching ways to increase the bandwidth of wireless links while the available bandwidth is not optimally utilized. This paper proposes a hybrid scheme called TCP Detection and Recovery (TCP-DR) to distinguish congestion-, corruption- and mobility-related losses and then instruct the sending host to take the appropriate action. Link utilization therefore remains optimal when losses are due to either a high bit error rate or mobility.
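The classification step of such a scheme can be sketched as a simple decision over loss indicators. The three indicator signals below are assumptions for illustration and do not reproduce TCP-DR's actual detection mechanism:

```python
def classify_loss(queue_backlog_high, bit_errors_seen, handoff_in_progress):
    """Toy loss classifier in the spirit of TCP-DR: pick a recovery
    action by loss cause instead of always assuming congestion.
    The indicator signals are illustrative assumptions."""
    if handoff_in_progress:
        return "mobility"      # pause transmission, keep cwnd intact
    if bit_errors_seen:
        return "corruption"    # retransmit without slowing down
    if queue_backlog_high:
        return "congestion"    # standard TCP congestion response
    return "congestion"        # conservative default

print(classify_loss(False, True, False))  # corruption
```

The point of the split is the action taken afterwards: only the congestion branch shrinks the congestion window, so corruption and handoff losses no longer throttle the sender.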

Relevance:

30.00%

Publisher:

Abstract:

We introduce the concept of attribute-based authenticated key exchange (AB-AKE) within the framework of ciphertext policy attribute-based systems. A notion of AKE-security for AB-AKE is presented based on the security models for group key exchange protocols, while also taking into account the security requirements generally considered in the ciphertext policy attribute-based setting. We also extend the paradigm of hybrid encryption to ciphertext policy attribute-based encryption schemes. A new primitive called encapsulation policy attribute-based key encapsulation mechanism (EP-AB-KEM) is introduced and a notion of chosen ciphertext security is defined for EP-AB-KEMs. We propose an EP-AB-KEM from an existing attribute-based encryption scheme and show that it achieves chosen ciphertext security in the generic group and random oracle models. We present a generic one-round AB-AKE protocol that satisfies our AKE-security notion. The protocol is generically constructed from any EP-AB-KEM that satisfies chosen ciphertext security. Instantiating the generic AB-AKE protocol with our EP-AB-KEM results in a concrete one-round AB-AKE protocol that is also secure in the generic group and random oracle models.
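The KEM abstraction underlying the hybrid construction can be illustrated with a toy symmetric stand-in: encapsulation outputs a ciphertext-like value plus a fresh session key, and decapsulation recovers the same key. A real EP-AB-KEM uses public-key operations and only decapsulates when the holder's attributes satisfy the ciphertext policy; everything below is illustrative:

```python
import hashlib, secrets

def kem_encapsulate(shared_secret):
    """Toy KEM stand-in (NOT an EP-AB-KEM): derive a session key from
    the secret and fresh randomness; return (encapsulation, key)."""
    encap = secrets.token_bytes(16)
    key = hashlib.sha256(shared_secret + encap).digest()
    return encap, key

def kem_decapsulate(shared_secret, encap):
    """Recompute the same session key from the encapsulation."""
    return hashlib.sha256(shared_secret + encap).digest()

sk = b"illustrative-secret"
encap, k1 = kem_encapsulate(sk)
k2 = kem_decapsulate(sk, encap)   # both sides derive the same session key
```

In the one-round protocol each party sends one such encapsulation; combining the resulting keys yields the session key, and the attribute policy on decapsulation is what turns plain AKE into AB-AKE.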

Relevance:

30.00%

Publisher:

Abstract:

This study, to elucidate the role of des(1-3)IGF-I in the maturation of IGF-I, used two strategies. The first was to detect the presence of enzymes in tissues which would act on IGF-I to produce des(1-3)IGF-I, and the second was to detect the potential products of such enzymic activity, namely Gly-Pro-Glu (GPE), Gly-Pro (GP) and des(1-3)IGF-I. No neutral tripeptidyl peptidase (TPP II), which would release the tripeptide GPE from IGF-I, was detected in brain, urine, or in red or white blood cells. The TPP-like activity which was detected was attributed to the combined action of a dipeptidyl peptidase (DPP IV) and an aminopeptidase (APA). A true TPP II was, however, detected in platelets. Two purified TPP II enzymes were investigated, but they did not release GPE from IGF-I under a variety of conditions. Consequently, TPP II seemed unlikely to participate in the formation of des(1-3)IGF-I. In contrast, an acidic tripeptidyl peptidase activity (TPP I) was detected in brain and colostrum, the former with a pH optimum of 4.5 and the latter 3.8. It seems likely that such an enzyme would participate in the formation of des(1-3)IGF-I in these tissues in vitro, i.e. that des(1-3)IGF-I may have been produced as an artifact during the isolation of IGF-I from brain and colostrum under acidic conditions. This contrasts with suggestions of an in vivo role for des(1-3)IGF-I, as reported by others. The activity of a dipeptidyl peptidase IV (DPP IV) from urine, which should release the dipeptide GP from IGF-I, was assessed under a variety of conditions and with a variety of additives and potential enzyme stimulants, but there was no release of GP. The DPP IV also exhibited a transferase activity with synthetic substrates in the presence of dipeptides, at lower concentrations than previously reported for other acceptors or other proteolytic enzymes. In addition, a low concentration of a product, possibly the tetrapeptide Gly-Pro-Gly-Leu, was detected with the action of the enzyme on IGF-I in the presence of the dipeptide Gly-Leu. As part of attempts to detect tissue production of des(1-3)IGF-I, a monoclonal antibody (MAb) directed towards the GPE- end of IGF-I was produced by immunisation with a 10-mer covalently attached to a carrier protein. By the use of indirect ELISA and inhibitor studies, the MAb was shown to selectively recognise peptides with an N-terminal GPE- sequence, and was applied to the indirect detection of des(1-3)IGF-I. The concentration of GPE in brain, measured by mass spectrometry (MS), was low, and the concentration of total IGF-I (measured by ELISA with a commercial polyclonal antibody [PAb]) was 40 times higher, at 50 nmol/kg. This, also, was not consistent with the action of a tripeptidyl peptidase in brain that converted all IGF-I to des(1-3)IGF-I plus GPE. Contrasting ELISA results, using the MAb prepared in this study, suggest an even higher concentration of intact IGF-I of 150 nmol/kg. This would argue against the presence of any des(1-3)IGF-I in brain, but in turn indicates either the presence of other substances containing a GPE amino-terminus or another cross-reacting epitope. Although the results of the specificity studies reported in Chapter 5 would make this latter possibility seem unlikely, it cannot be completely excluded. No GP was detected in brain by MS. No GPE was detected in colostrum by capillary electrophoresis (CE), but interference from extraneous substances reduced the detectability of GPE by CE, and this approach would require further prior purification and concentration steps. A molecule with a migration time equal to that of the peptide GP was detected in colostrum by CE, but its concentration (~10 µmol/L) was much higher than the IGF-I concentration measured by radio-immunoassay using a PAb (80 nmol/L) or using a MAb (300-400 nmol/L). A DPP IV enzyme was detected in colostrum, and this could account for the GP, derived from substrates other than IGF-I. Based on the differential results of the two antibody assays, there was no indication of the presence of des(1-3)IGF-I in brain or colostrum. In the absence of any enzyme activity directed towards the amino terminus of IGF-I, and the absence of any potential products, IGF-I therefore does not appear to "mature" via des(1-3)IGF-I in the brain, nor in the neutral colostrum. In spite of these results, which indicate the absence of an enzymic attack on IGF-I and the absence of the expected products in tissues, the possibility that the conversion of IGF-I may occur under neutral conditions in limited amounts cannot be ruled out. It remains possible that in the extracellular environment of the membrane, a complex interaction of IGF-I, binding protein, aminopeptidase(s) and receptor produces des(1-3)IGF-I as a transient product which is bound to the receptor and internalised.

Relevance:

30.00%

Publisher:

Abstract:

In the global knowledge economy, knowledge-intensive industries and knowledge workers are widely seen as the primary factors in improving the welfare and competitiveness of cities. To attract and retain such industries and workers, cities produce knowledge-based urban development strategies, and such strategising has therefore become an important development mechanism for cities and their economies. The paper discusses the critical connections between knowledge city foundations and integrated knowledge-based urban development mechanisms at both the local and regional levels. In particular, the paper investigates Brisbane’s knowledge-based urban development strategies that support the generation, attraction and retention of investment and talent. Furthermore, the paper develops a knowledge-based urban development assessment framework to provide a clearer understanding of the local and regional policy frameworks and relevant applications of Brisbane’s knowledge-based urban development experience in becoming a prosperous knowledge city. With this assessment framework, the paper scrutinises Brisbane’s four development domains in detail: economy, society, institutions, and the built and natural environments. As part of the discussion of the case study findings, the paper describes the global orientation of Brisbane within the frame of regional- and local-level knowledge-based urban development strategies that are performing well. Although several good practices from Brisbane have already been internationally acknowledged, the research reveals that Brisbane is still in the early stages of its knowledge-based urban development implementation. Consequently, the development of a monitoring system for knowledge-based urban development at all levels is crucial for accurately measuring the success and failure of specific knowledge-based urban development policies, and Brisbane’s progress towards a knowledge city transformation.