905 results for Incidental capture
Abstract:
As computer applications become more available—both technically and economically—construction project managers are increasingly able to access advanced computer tools capable of transforming the role that project managers have typically performed. Competence in using these tools requires a dual commitment to training—from the individual and the firm. Improving the computer skills of project managers can provide construction firms with a competitive advantage, differentiating them from others in an increasingly competitive international market. Yet few published studies have quantified the existing level of competence of construction project managers. Identifying project managers’ existing computer application skills is a necessary first step to developing more directed training that better captures the benefits of computer applications. This paper discusses the yet-to-be-released results of a series of surveys undertaken in Malaysia, Singapore, Indonesia, Australia and the United States through QUT’s School of Construction Management and Property and the M.E. Rinker, Sr. School of Building Construction at the University of Florida. This international survey reviews the use of, and reported competence in using, a series of commercially available computer applications by construction project managers. The five different country locations of the survey allow cross-national comparisons to be made between project managers undertaking continuing professional development programs. The results highlight a shortfall in the ability of construction project managers to capture the potential benefits provided by advanced computer applications and provide directions for targeted industry training programs. This international survey also provides a unique insight into the cross-national usage of advanced computer applications and forms an important step in this ongoing joint review of technology and the construction project manager.
Abstract:
This paper presents a comprehensive review of scientific and grey literature on gross pollutant traps (GPTs). GPTs are designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. Their application involves professional societies, research organisations, local city councils, government agencies and the stormwater industry—often in partnership. In view of this, the 113 references include unpublished manuscripts from these bodies along with scientific peer-reviewed conference papers and journal articles. The literature reviewed was organised into a matrix of six main devices and nine research areas (testing methodologies), which include: design appraisal study, field monitoring/testing, experimental flow fields, gross pollutant capture/retention characteristics, residence time calculations, hydraulic head loss, screen blockages, flow visualisations and computational fluid dynamics (CFD). When the fifty-four-item matrix was analysed, twenty-eight research gaps were found in the tabulated literature. It was also found that the number of research gaps increased if only the scientific literature was considered. It is hoped that, in addition to informing the research community at QUT, this literature review will also be of use to other researchers in this field.
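As a purely illustrative companion to the tabulation described above, the short Python sketch below represents the six-device by nine-research-area matrix as a boolean coverage table and counts the unfilled cells as research gaps; which cells are actually covered is placeholder data, not the review's findings.

```python
import numpy as np

# Research areas named in the abstract; the six device labels and the coverage
# pattern are placeholders, since the review's tabulated results are not given here.
research_areas = [
    "design appraisal study", "field monitoring/testing", "experimental flow fields",
    "capture/retention characteristics", "residence time calculations",
    "hydraulic head loss", "screen blockages", "flow visualisations", "CFD",
]
n_devices = 6

rng = np.random.default_rng(0)
covered = rng.random((n_devices, len(research_areas))) > 0.5  # True = literature exists

gaps = (~covered).sum()
print(f"{covered.size} matrix items, {gaps} research gaps in this placeholder table")
```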
Abstract:
This article examines the problem of patent ambush in standard setting, where patent owners are sometimes able to capture industry standards in order to secure monopoly power and windfall profits. Because standardisation generally introduces high switching costs, patent ambush can impose significant costs on downstream manufacturers and consumers and drastically reduce the efficiency gains of standardisation. This article considers how Australian competition law is likely to apply to patent ambush both in the development of a standard (through misrepresenting the existence of an essential patent) and after a standard is implemented (through refusing to license an essential patented technology either at all or on reasonable and non-discriminatory (RAND) terms). This article suggests that non-disclosure of patent interests is unlikely to be restrained by Part IV of the Trade Practices Act (TPA), and that refusals to license are only likely to be restrained if the refusal involves leveraging or exclusive dealing. By contrast, Standard Setting Organisations (SSOs) which seek to limit this behaviour through private ordering may face considerable scrutiny under the new cartel provisions of the TPA. This article concludes that SSOs may be best advised to implement administrative measures to prevent patent hold-up, such as reviewing which patents are essential for the implementation of a standard, asking patent holders to make their licence conditions public to promote transparency, and establishing forums where patent licensees can complain about licence terms that they consider to be unreasonable or discriminatory. Additionally, the ACCC may play a role in authorising SSO policies that could otherwise breach the new cartel provisions but which have the practical effect of promoting competition in the standard setting environment.
Abstract:
Genomic and proteomic analyses have attracted a great deal of interest in biological research in recent years. Many methods have been applied to discover useful information contained in the enormous databases of genomic sequences and amino acid sequences. The results of these investigations inspire further research in biological fields in return. These biological sequences, which may be considered as multiscale sequences, have some specific features which require more refined methods to characterise. This project aims to study some of these biological challenges with multiscale analysis methods and a stochastic modelling approach. The first part of the thesis aims to cluster some unknown proteins, and classify their families as well as their structural classes. A development in proteomic analysis is concerned with the determination of protein functions. The first step in this development is to classify proteins and predict their families. This motivates us to study some unknown proteins from specific families, and to cluster them into families and structural classes. We select a large number of proteins from the same families or superfamilies, and link them to simulate some unknown large proteins from these families. We use multifractal analysis and the wavelet method to capture the characteristics of these linked proteins. The simulation results show that the method is valid for the classification of large proteins. The second part of the thesis aims to explore the relationships of proteins based on a layered comparison with their components. Many methods are based on the homology of proteins because resemblance at the protein sequence level normally indicates similarity of functions and structures. However, some proteins may have similar functions despite low sequence identity. We consider protein sequences at a detailed level to investigate the problem of comparing proteins. The comparison is based on the empirical mode decomposition (EMD), and protein sequences are represented by their intrinsic mode functions. A measure of similarity is introduced with a new cross-correlation formula. The similarity results show that the EMD is useful for detecting functional relationships of proteins. The third part of the thesis aims to investigate the transcriptional regulatory network of the yeast cell cycle via stochastic differential equations. As the investigation of genome-wide gene expression has become a focus in genomic analysis, researchers have tried to understand the mechanisms of the yeast genome for many years. How cells control gene expression still needs further investigation. We use a stochastic differential equation to model the expression profile of a target gene, and modify the model with a Gaussian membership function. For each target gene, a transcriptional rate is obtained, and the estimated transcriptional rate is also calculated using information from five possible transcriptional regulators. Some regulators of these target genes are verified against related references. With these results, we construct a transcriptional regulatory network for genes from the yeast Saccharomyces cerevisiae. The construction of the transcriptional regulatory network is useful for uncovering further mechanisms of the yeast cell cycle.
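The abstract does not give the exact form of the stochastic differential equation, so the sketch below is only a hedged illustration of the kind of model described in the third part: an Euler-Maruyama simulation of a target gene whose transcriptional drive depends on five regulator profiles through a Gaussian membership function. The function names, parameter values and regulator data are assumptions made for illustration, not the thesis's actual model or estimates.

```python
import numpy as np

def gaussian_membership(r, centre, width):
    """Degree (0-1) to which a regulator level r is taken to activate transcription."""
    return np.exp(-((r - centre) ** 2) / (2 * width ** 2))

def simulate_target(regulators, k=1.0, d=0.3, sigma=0.05, dt=0.1, x0=0.0,
                    centre=1.0, width=0.5, rng=None):
    """Euler-Maruyama simulation of dx = (k * mean membership - d * x) dt + sigma dW.
    This concrete form is an assumption; only its ingredients (an SDE plus a
    Gaussian membership function of regulator levels) come from the abstract."""
    if rng is None:
        rng = np.random.default_rng(0)
    n_steps = regulators.shape[0]
    x = np.empty(n_steps)
    x[0] = x0
    for t in range(1, n_steps):
        activation = gaussian_membership(regulators[t], centre, width).mean()
        drift = k * activation - d * x[t - 1]
        x[t] = x[t - 1] + drift * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

# Five hypothetical regulator expression profiles over 100 time points.
rng = np.random.default_rng(1)
regulator_profiles = rng.normal(1.0, 0.4, size=(100, 5))
print(simulate_target(regulator_profiles, rng=rng)[:5])
```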
Abstract:
This paper considers the implications of journalism research being located within the Field of Research associated with the creative arts and writing in the recent Excellence in Research for Australia (ERA) evaluations. While noting that this classification does capture a significant trajectory in Australian journalism research, it also points to some anomalous implications of understanding journalism as an arts discipline, given its historical co-location in universities with communications disciplines, and the mutually reinforcing relationships between the two fields.
Abstract:
Explanations of the role of analogies in learning science at a cognitive level are made in terms of creating bridges between new information and students’ prior knowledge. In this empirical study of learning with analogies in an 11th grade chemistry class, we explore an alternative explanation at the "social" level where analogy shapes classroom discourse. Students in the study developed analogies within small groups and with their teacher. These classroom interactions were monitored to identify changes in discourse that took place through these activities. Beginning from socio-cultural perspectives and hybridity, we investigated classroom discourse during analogical activities. From our analyses, we theorized a merged discourse that explains how the analog discourse becomes intertwined with the target discourse generating a transitional state where meanings, signs, symbols, and practices are in flux. Three categories were developed that capture how students intertwined the analog and target discourses—merged words, merged utterances/sentences, and merged practices.
Abstract:
The practices of marketeers in the Queensland property market have been the subject of intense media interest and have caused widespread consumer concern. In response to these concerns the Queensland government has amended the Property Agents and Motor Dealers Act 2000 (Qld) (“the Act”). Significant changes to the Act were introduced by the Property Agents and Motor Dealers Amendment Act 2001 (Qld) (“the amending Act”). Implicit in the introduction of the amending Act was recognition that marketeers had altered their operating tactics to avoid the requirements of the Act. The amendments enhance regulation and are intended to capture the conduct of all persons involved in unconscionable practices that have led to dysfunction in certain sectors of the Queensland property market. The amending Act is focussed on a broad regulatory response rather than further regulation of specific occupations in the property sale process, as it was recognised that the approach of industry regulation had proven inadequate to curtail marketeering practices and to protect the interests of consumers. As well as providing for increased disclosure obligations on real estate agents, property developers and lawyers, together with an extension of the 5 business day cooling-off period to all contracts (other than auction contracts) for the sale of residential property in Queensland, the amending Act, in an endeavour to further protect consumer interests, provides for increased jurisdiction and powers for the Property Agents and Motor Dealers Tribunal (“the Tribunal”), enabling the Tribunal to deal with claims against marketeers. These provisions commenced on the date of assent (21 September 2001). The aim of this article is to examine the circumstances in which marketeers will contravene the legislation and the ramifications of doing so.
Abstract:
Background In order to provide insights into the complex biochemical processes inside a cell, modelling approaches must find a balance between achieving an adequate representation of the physical phenomena and keeping the associated computational cost within reasonable limits. This issue is particularly pronounced when spatial inhomogeneities have a significant effect on the system's behaviour. In such cases, a spatially-resolved stochastic method can better portray the biological reality, but the corresponding computer simulations can in turn be prohibitively expensive. Results We present a method that incorporates spatial information by means of tailored, probability-distributed time-delays. These distributions can be directly obtained from a single in silico experiment or a suitable set of in vitro experiments and are subsequently fed into a delay stochastic simulation algorithm (DSSA), achieving a good compromise between computational costs and a much more accurate representation of spatial processes such as molecular diffusion and translocation between cell compartments. Additionally, we present a novel alternative approach based on delay differential equations (DDE) that can be used in scenarios of high molecular concentrations and low noise propagation. Conclusions Our proposed methodologies accurately capture and incorporate certain spatial processes into temporal stochastic and deterministic simulations, increasing their accuracy at low computational costs. This is of particular importance given that the time spans of cellular processes are generally larger (possibly by several orders of magnitude) than those achievable by current spatially-resolved stochastic simulators. Hence, our methodology allows users to explore cellular scenarios under the effects of diffusion and stochasticity over time spans that were, until now, simply unfeasible. Our methodologies are supported by theoretical considerations on the different modelling regimes, i.e. spatial vs. delay-temporal, as indicated by the corresponding Master Equations and presented elsewhere.
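As a rough illustration of how probability-distributed delays can be fed into a delay stochastic simulation algorithm, the sketch below implements a toy delay SSA in Python: production of a species A, delayed conversion of A to B with the delay drawn from a gamma distribution (standing in for a diffusion or translocation time), and degradation of B. The reactions, rate constants and delay distribution are illustrative assumptions rather than the paper's models, and the convention of consuming A at initiation and producing B at completion is one of several possible delay-handling choices.

```python
import heapq
import numpy as np

def delay_ssa(t_end=50.0, k_prod=1.0, k_conv=0.2, k_deg=0.1,
              delay_shape=4.0, delay_scale=0.5, seed=0):
    """Toy delay SSA: A is produced, converted to B after a gamma-distributed
    delay (A consumed at initiation, B appears at completion), and B degrades."""
    rng = np.random.default_rng(seed)
    t, A, B = 0.0, 0, 0
    pending = []                 # min-heap of completion times for delayed B arrivals
    history = []
    while t < t_end:
        propensities = np.array([k_prod, k_conv * A, k_deg * B])
        total = propensities.sum()
        t_next = t + rng.exponential(1.0 / total) if total > 0 else np.inf
        if pending and pending[0] <= t_next:
            t = heapq.heappop(pending)          # a delayed completion fires first
            B += 1
        else:
            if t_next == np.inf:
                break
            t = t_next
            r = rng.random() * total
            if r < propensities[0]:
                A += 1                          # production of A
            elif r < propensities[0] + propensities[1]:
                A -= 1                          # delayed conversion initiates
                heapq.heappush(pending, t + rng.gamma(delay_shape, delay_scale))
            else:
                B -= 1                          # degradation of B
        history.append((t, A, B))
    return history

print(delay_ssa()[-1])
```

Redrawing the exponential waiting time after each delayed completion is valid because the inter-reaction times are memoryless.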
Abstract:
Aggressive driving is increasingly a concern for drivers in highly motorised countries. However, the role of driver intent in this behaviour is problematic and there is little research on driver cognitions in relation to aggressive driving incidents. In addition, while drivers who admit to behaving aggressively on the road also frequently report being recipients of similar behaviours, little is known about the relationship between perpetration and victimisation, or about how road incidents escalate into the more serious events that capture media attention. The current study used qualitative interviews to explore driver cognitions and underlying motivations for aggressive behaviours on the road. A total of 30 drivers aged 18-49 years were interviewed about their experiences with aggressive driving. A key theme identified in responses was driver aggression as an attempt to manage or modify the behaviour of other road users. Two subthemes were identified and appeared related to separate motivations for aggressive responses: ‘teaching them a lesson’ referred to situations where respondents intended to convey criticism or disapproval, usually of unintended behaviours by the other driver, and thus encourage self-correction; and ‘justified retaliation’ referred to situations where respondents perceived deliberate intent on the part of the other driver and responded aggressively in return. Mildly aggressive driver behaviour appears to be common. Moreover, such behaviour has a sufficiently negative impact on other drivers that it may be worth addressing because of its potential for triggering retaliation in kind or escalation of aggression, thus compromising safety.
Abstract:
To date, consumer behaviour research is still over-focused on the functional rather than the dysfunctional. Both empirical and anecdotal evidence suggest that service organisations are burdened with the concept of consumer sovereignty, while consumers freely flout the ‘rules’ of social exchange and behave in deviant and dysfunctional ways. Further, the current scope of consumer misbehaviour research suggests that the phenomenon has principally been studied in the context of economically-focused exchange. This limits our current understanding of consumer misbehaviour to service encounters that are more transactional than relational in nature. Consequently, this thesis takes a Social Exchange approach to consumer misbehaviour and reports a three-stage multi-method study that examined the nature and antecedents of consumer misbehaviour in professional services. It addresses the following broad research question: What is the nature of consumer misbehaviour during professional service encounters? Study One initially explored the nature of consumer misbehaviour in professional service encounters using critical incident technique (CIT) within 38 semi-structured in-depth interviews. The study was designed to develop a better understanding of what constitutes consumer misbehaviour from a service provider’s perspective. Once the nature of consumer misbehaviour had been qualified, Study Two focused on developing and refining calibrated items that formed Guttman-like scales for two consumer misbehaviour constructs: one for the most theoretically-central type of consumer misbehaviour identified in Study One (i.e. refusal to participate) and one for the most well-theorised and salient type of consumer misbehaviour (i.e. verbal abuse) identified in Study One to afford a comparison. This study used Rasch modelling to investigate whether it was possible to calibrate the escalating severity of a series of decontextualised behavioural descriptors in a valid and reliable manner. Creating scales of calibrated items that capture the variation in severity of different types of consumer misbehaviour identified in Study One allowed for a more valid and reliable investigation of the antecedents of such behaviour. Lastly, Study Three utilised an experimental design to investigate three key antecedents of consumer misbehaviour: (1) the perceived quality of the service encounter [drawn from Fullerton and Punj’s (1993) model of aberrant consumer behaviour], (2) the violation of consumers’ perceptions of justice and equity [drawn from Rousseau’s (1989) Psychological Contract Theory], and (3) consumers’ affective responses to exchange [drawn from Weiss and Cropanzano’s (1996) Affective Events Theory]. Investigating three key antecedents of consumer misbehaviour confirmed the newly-developed understanding of the nature of consumer misbehaviour during professional service encounters. Combined, the results of the three studies suggest that consumer misbehaviour is characteristically different within professional services. The most salient and theoretically-central behaviours can be measured using increasingly severe decontextualised behavioural descriptors. Further, increasingly severe forms of consumer misbehaviour are likely to occur as a response to consumer anger at low levels of interpersonal service quality. These findings have a range of key implications for both marketing theory and practice.
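To make the idea of calibrating escalating severities concrete, the sketch below fits a basic Rasch (one-parameter logistic) model to a simulated binary response matrix by joint maximum likelihood in Python. The descriptors, response data and estimation method are hypothetical illustrations and are not the procedure or software used in the thesis.

```python
import numpy as np
from scipy.optimize import minimize

# Simulated data: 200 respondents indicate whether each of four increasingly
# severe (hypothetical) misbehaviour descriptors applies to an encounter.
rng = np.random.default_rng(0)
true_severity = np.array([-1.5, -0.5, 0.5, 1.5])
true_location = rng.normal(0, 1, size=200)
p_true = 1 / (1 + np.exp(-(true_location[:, None] - true_severity[None, :])))
responses = (rng.random(p_true.shape) < p_true).astype(int)

def neg_log_likelihood(params, data):
    """Rasch (1PL) model: P(endorse) = logistic(theta_person - b_item)."""
    n_items = data.shape[1]
    b, theta = params[:n_items], params[n_items:]
    p = 1 / (1 + np.exp(-(theta[:, None] - b[None, :])))
    eps = 1e-9
    return -np.sum(data * np.log(p + eps) + (1 - data) * np.log(1 - p + eps))

n_persons, n_items = responses.shape
fit = minimize(neg_log_likelihood, np.zeros(n_items + n_persons),
               args=(responses,), method="L-BFGS-B")
severity = fit.x[:n_items] - fit.x[:n_items].mean()   # anchor the scale origin
print("Calibrated item severities:", np.round(severity, 2))
```

Joint maximum likelihood is used here only for brevity; dedicated Rasch software typically relies on conditional or marginal estimation, which has better statistical properties.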
Abstract:
Awareness of the power of the mass media to communicate images of protest to global audiences and, in so doing, to capture space in global media discourses is a central feature of the transnational protest movement. A number of protest movements have formed around opposition to concepts and practices that operate beyond national borders, such as neoliberal globalization or threats to the environment. However, transnational protests also involve more geographically discrete issues such as claims to national independence or greater religious or political freedom by groups within specific national contexts. Appealing to the international community for support is a familiar strategy for communities who feel that they are being discriminated against or ignored by a national government.
Abstract:
Aim. This paper elucidates the nature of metaphor and the conditions necessary to its use as an analytic device in qualitative research, and describes how the use of metaphor assisted in the analytic processes of a grounded theory study of nephrology nursing expertise. Background. The use of metaphor is pervasive in everyday thought, language and action. It is an important means for the comprehension and management of everyday life, and makes challenging or problematic concepts easier to explain. Metaphors are also pervasive in quantitative and qualitative research for the same reason. In both everyday life and in research, their use may be implicit or explicit. Methods. The study, using grounded theory methodology, took place in one renal unit in New South Wales, Australia between 1999 and 2000 and included six non-expert and 11 expert nurses. It involved simultaneous data collection and analysis using participant observation, semi-structured interviews and review of nursing documentation. Findings. A three-stage skills-acquisitive process was identified in which an orchestral metaphor was used to explain the relationships between stages and to satisfactorily capture the data coded within each stage. Conclusion. Metaphors create images, clarify and add depth to meanings and, if used appropriately and explicitly in qualitative research, can capture data at highly conceptual levels. Metaphors also assist in explaining the relationship between findings in a clear and coherent manner. © 2005 Blackwell Publishing Ltd.
Abstract:
The number of software vendors offering ‘Software-as-a-Service’ has been increasing in recent years. In the Software-as-a-Service model, software is operated by the software vendor and delivered to the customer as a service. Existing business models and industry structures are challenged by the changes to the deployment and pricing model compared to traditional software. However, the full implications for the way companies create, deliver and capture value have not yet been sufficiently analyzed. Current research is scattered across specific aspects; only a few studies provide a more holistic view of the impact from a business model perspective. For vendors it is, however, crucial to be aware of the potentially far-reaching consequences of Software-as-a-Service. Therefore, a literature review and three exploratory case studies of leading software vendors are used to evaluate possible implications of Software-as-a-Service on business models. The results show an impact on all business model building blocks, highlight in particular the often less articulated impact on key activities, customer relationships and key partnerships for leading software vendors, and reveal related challenges, for example with regard to the integration of development and operations processes. The observed implications demonstrate the disruptive character of the concept and identify future research requirements.
Abstract:
This paper presents a series of ongoing experiments to facilitate serendipity in the design studio through a diversity of delivery modes. These experiments are conducted in a second year architectural design studio, and include physical, dramatic and musical performance. The act of designing is always exploratory, always seeking an unknown resolution, and the ability to see and capture the value in the unexpected is a critical aspect of such creative design practice. Engaging with the unexpected is however a difficult ability to develop in students. Just how can a student be schooled in such abilities when the challenge and the context are unforeseeable? How can students be offered meaningful feedback about an issue that cannot be predicted, when feedback comes in the form of extrinsic assessment from a tutor? This project establishes a number of student activities that seek to provide intrinsic feedback from the activity itself. Further to this, the project seeks to heighten student engagement with the project through physical expression and performance: utilising more of the students’ senses than just vision and hearing. Diana Laurillard’s theories of conversational frameworks (2002) are used to interrogate the act of dramatic performance as an act of learning, with particular reference to the serendipitous activities of design. Such interrogation highlights the feedback mechanisms that facilitate intrinsic feedback and fast, if not instantaneous, cycles of learning. The physical act of performance itself provides a learning experience that is not replicable in other modes of delivery. Student feedback data and independent assessment of project outcomes are used to assess the success of this studio model.
Abstract:
Most web service discovery systems use keyword-based search algorithms and, although partially successful, sometimes fail to satisfy some users’ information needs. This has given rise to several semantics-based approaches that look to go beyond simple attribute matching and try to capture the semantics of services. However, the results reported in the literature vary and in many cases are worse than the results obtained by keyword-based systems. We believe the accuracy of the mechanisms used to extract tokens from the non-natural language sections of WSDL files directly affects the performance of these techniques, because some of them can be more sensitive to noise. In this paper three existing tokenization algorithms are evaluated and a new algorithm that outperforms all the algorithms found in the literature is introduced.
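For readers unfamiliar with the tokenization step, a baseline identifier tokenizer for the non-natural-language sections of a WSDL file might split camelCase runs, underscores and digit boundaries as in the Python sketch below. This is a generic baseline written for illustration, not the new algorithm introduced in the paper nor any of the three algorithms it evaluates.

```python
import re

def tokenize_identifier(name):
    """Split a WSDL-style identifier (operation, message or element name)
    into lower-cased word tokens, e.g. 'getCustomerAddressInfo'."""
    # First split on underscores, hyphens, whitespace and letter/digit boundaries.
    parts = re.split(r"[_\-\s]+|(?<=[a-zA-Z])(?=\d)|(?<=\d)(?=[a-zA-Z])", name)
    tokens = []
    for part in parts:
        # Then split camelCase runs, keeping acronyms such as 'HTTP' together.
        tokens.extend(re.findall(r"[A-Z]+(?=[A-Z][a-z])|[A-Z]?[a-z]+|[A-Z]+|\d+", part))
    return [t.lower() for t in tokens if t]

print(tokenize_identifier("getCustomerAddressInfo"))  # ['get', 'customer', 'address', 'info']
print(tokenize_identifier("HTTPRequest_handler2"))    # ['http', 'request', 'handler', '2']
```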