13 results for Brackins, Phil
in University of Queensland eSpace - Australia
Abstract:
Flexible transport services (FTS) have been of increasing interest in developed countries as a bridge between the use of personal car travel and fixed route transit services. This paper reports on findings from a recent study in Queensland, Australia, which identified lessons from an international review and implications for Australia. Potential strategic directions, including a vision, mission, key result areas, strategies, and identified means of measuring performance, are described. Evaluation criteria for assessing flexible transport proposals were developed, and approaches to identifying and assessing needs and demands are outlined. The use of emerging technologies is also a key element of successful flexible transport services.
Abstract:
A major challenge in successfully implementing transit-oriented development (TOD) is having a robust process that ensures effective appraisal, initiation and delivery of multi-stakeholder TOD projects. A step-by-step project development process can assist in the methodical design, evaluation, and initiation of TOD projects. Successful TOD requires attention to transit, mixed-use development and public space. Brisbane, Australia provides a case study where recent planning policies and infrastructure documents have laid a foundation for TOD, but where barriers lie in precinct-level planning and project implementation. In this context and perhaps in others, the research effort needs to shift toward identification of appropriate project processes and strategies. This paper presents the outcomes of research conducted to date. Drawing on the mainstream approach to project development and financial evaluation for property projects, key steps for potential use in successful delivery of TOD projects have been identified, including: establish the framework; location selection; precinct context review; preliminary precinct design; the initial financial viability study; the decision stage; establishment of project structure; land acquisition; development application; and project delivery. The appropriateness of this mainstream development and appraisal process will be tested through stakeholder research, and the proposed process will then be refined for adoption in TOD projects. It is suggested that the criteria for successful TOD should be broadened beyond financial concerns in order to deliver public sector support for project initiation.
Abstract:
Every day trillions of dollars circulate the globe in a digital data space and new forms of property and ownership emerge. Massive corporate entities with a global reach are formed and disappear with breathtaking speed, making and breaking personal fortunes the size of which defy imagination. Fictitious commodities abound. The genomes of entire nations have become corporately owned. Relationships have become the overt basis of economic wealth and political power. Hypercapitalism explores the problems of understanding this emergent form of global political economic organization by focusing on the internal relations between language, new media networks, and social perceptions of value. Taking an historical approach informed by Marx, Phil Graham draws upon writings in political economy, media studies, sociolinguistics, anthropology, and critical social science to understand the development, roots, and trajectory of the global system in which every possible aspect of human existence, including imagined futures, has become a commodity form.
Abstract:
According to Hugh Mellor in Real Time II (1998, Ch. 12), assuming the logical independence of causal facts and the 'law of large numbers', causal loops are impossible because if they were possible they would produce inconsistent sets of frequencies. I clarify the argument, and argue that it would be preferable to abandon the relevant independence assumption in the case of causal loops.
Abstract:
In this paper I consider two objections raised by Nick Smith (1997) to an argument against the probability of time travel given by Paul Horwich (1995, 1987). Horwich argues that time travel leads to inexplicable and improbable coincidences. I argue that one of Smith's objections fails, but that another is correct. I also consider an instructive way to defend Horwich's argument against the second of Smith's objections, but show that it too fails. I conclude that unless there is something faulty in the conception of explanation implicit in Horwich's argument, time travel presents us with nothing that is inexplicable.
Abstract:
Incremental parsing has long been recognized as a technique of great utility in the construction of language-based editors, and correspondingly, the area currently enjoys a mature theory. Unfortunately, many practical considerations have been largely overlooked in previously published algorithms. Many user requirements for an editing system necessarily impact on the design of its incremental parser, but most approaches focus only on one: response time. This paper details an incremental parser based on LR parsing techniques and designed for use in a modeless syntax recognition editor. The nature of this editor places significant demands on the structure and quality of the document representation it uses, and hence, on the parser. The strategy presented here is novel in that both the parser and the representation it constructs are tolerant of the inevitable and frequent syntax errors that arise during editing. This is achieved by a method that differs from conventional error repair techniques, and that is more appropriate for use in an interactive context. Furthermore, the parser aims to minimize disturbance to this representation, not only to ensure other system components can operate incrementally, but also to avoid unfortunate consequences for certain user-oriented services. The algorithm is augmented with a limited form of predictive tree-building, and a technique is presented for the determination of valid symbols for menu-based insertion. Copyright (C) 2001 John Wiley & Sons, Ltd.
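The core idea of the abstract above, reusing prior parse results and tolerating syntax errors so an editor stays responsive, can be illustrated with a deliberately simplified sketch. The paper's algorithm works at LR-parse-table granularity and reuses subtrees; the example below is far coarser, caching parse results per line, and the `IncrementalLineParser` class is a hypothetical illustration, not the published algorithm:

```python
# Line-granularity incremental parsing sketch. Two properties of the
# abstract are illustrated: (1) only edited material is reparsed, and
# (2) syntax errors are tolerated rather than aborting the parse.
import ast

class IncrementalLineParser:
    """Caches per-line parse trees keyed by line text; after an edit,
    only lines whose text changed trigger an actual parse call."""

    def __init__(self):
        self.cache = {}    # line text -> ('ok', tree) or ('error', None)
        self.reparsed = 0  # number of real parse calls, for demonstration

    def parse_line(self, line):
        if line not in self.cache:
            self.reparsed += 1
            try:
                self.cache[line] = ('ok', ast.parse(line, mode='eval'))
            except SyntaxError:
                # Error tolerance: record the error and keep going.
                self.cache[line] = ('error', None)
        return self.cache[line]

    def parse_document(self, text):
        return [self.parse_line(line)[0] for line in text.splitlines()]

p = IncrementalLineParser()
print(p.parse_document("1 + 2\n3 *\nx ** 2"))    # ['ok', 'error', 'ok']
print(p.parse_document("1 + 2\n3 * 4\nx ** 2"))  # user fixed line 2
print(p.reparsed)                                # 4 parse calls, not 6
```

A real language-based editor needs much finer granularity than whole lines, which is exactly why the paper works at the level of the LR parse and its tree representation.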
Abstract:
High molecular weight mucins represent a unique challenge as tumor markers by virtue of their complex array of epitopes. The list is dominated by the high molecular weight mucins MUC1, CEA and CA125. While the currently accepted role for these tumor markers is in the prediction and detection of relapse, it is possible that their sensitivity and specificity can be improved. Although immunoassays detecting the tumor marker MUC1 are both sensitive and specific for predicting relapse in breast cancer, so far they are not in widespread use in the follow-up of this disease. Are there new combinations of conventional reagents that could improve assay sensitivity, or should we be looking for more radical changes in assay design incorporating combinatorial technology? Copyright (C) 2001 S. Karger AG, Basel.
Abstract:
Using examples from contemporary policy and business discourses, and exemplary historical texts dealing with the notion of value, I put forward an argument as to why a critical scholarship that draws on media history, language analysis, philosophy and political economy is necessary to understand the dynamics of what is being called 'the global knowledge economy'. I argue that the social changes associated with new modes of value determination are closely associated with new media forms.
Abstract:
Sensitivity of output of a linear operator to its input can be quantified in various ways. In Control Theory, the input is usually interpreted as disturbance and the output is to be minimized in some sense. In stochastic worst-case design settings, the disturbance is considered random with imprecisely known probability distribution. The prior set of probability measures can be chosen so as to quantify how far the disturbance deviates from the white-noise hypothesis of Linear Quadratic Gaussian control. Such deviation can be measured by the minimal Kullback-Leibler informational divergence from the Gaussian distributions with zero mean and scalar covariance matrices. The resulting anisotropy functional is defined for finite power random vectors. Originally, anisotropy was introduced for directionally generic random vectors as the relative entropy of the normalized vector with respect to the uniform distribution on the unit sphere. The associated a-anisotropic norm of a matrix is then its maximum root mean square or average energy gain with respect to finite power or directionally generic inputs whose anisotropy is bounded above by a≥0. We give a systematic comparison of the anisotropy functionals and the associated norms. These are considered for unboundedly growing fragments of homogeneous Gaussian random fields on multidimensional integer lattice to yield mean anisotropy. Correspondingly, the anisotropic norms of finite matrices are extended to bounded linear translation invariant operators over such fields.
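The anisotropy functional described above has a convenient special case that can be computed directly. For a zero-mean Gaussian vector in R^m with covariance S, the minimal Kullback-Leibler divergence from the family N(0, lambda*I) is known in the anisotropy-based control literature to admit the closed form A = -(1/2) ln det(m*S / tr(S)); the sketch below assumes that formula, and the covariance matrices used are arbitrary illustrations, not taken from the abstract:

```python
# Hedged sketch: anisotropy of a zero-mean Gaussian vector under the
# assumed closed form A = -0.5 * ln det(m * S / tr(S)). This is zero
# exactly when S is a scalar multiple of the identity, i.e. the
# white-noise hypothesis of LQG control, and positive otherwise
# (by the AM-GM inequality applied to the eigenvalues of S).
import numpy as np

def gaussian_anisotropy(S):
    """Anisotropy of w ~ N(0, S): minimal KL divergence from the
    Gaussian distributions with zero mean and scalar covariance."""
    S = np.asarray(S, dtype=float)
    m = S.shape[0]
    sign, logdet = np.linalg.slogdet(m * S / np.trace(S))
    assert sign > 0, "covariance must be positive definite"
    return -0.5 * logdet

print(gaussian_anisotropy(np.eye(3)))              # ~0: isotropic noise
print(gaussian_anisotropy(np.diag([1., 1., 4.])))  # positive: anisotropic
```

The a-anisotropic norm of a matrix then arises as a constrained maximization of the energy gain over inputs whose anisotropy, computed as above, does not exceed a given bound a >= 0.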