617 results for Performance Effectiveness
Abstract:
The high level of scholarly writing required for a doctoral thesis is a challenge for many research students. However, formal academic writing training is not a core component of many doctoral programs. Informal writing groups for doctoral students may be one method of contributing to the improvement of scholarly writing. In this paper, we report on a writing group that was initiated by an experienced writer and higher degree research supervisor to support and improve her doctoral students’ writing capabilities. Over time, this group developed a workable model to suit their varying needs and circumstances. The model comprised group sessions, an email group, and individual writing. Here, we use a narrative approach to explore the effectiveness and value of our research writing group model in improving scholarly writing. The data consisted of doctoral students’ reflections to stimulus questions about their writing progress and experiences. The stimulus questions sought to probe individual concerns about their own writing, what they had learned in the research writing group, the benefits of the group, and the disadvantages and challenges to participation. These reflections were analysed using thematic analysis. Following this analysis, the supervisor provided her perspective on the key themes that emerged. Results revealed that, through the writing group, members learned technical elements (e.g., paragraph structure), non-technical elements (e.g., working within limited timeframes), conceptual elements (e.g., constructing a cohesive argument), collaborative writing processes, and how to edit and respond to feedback. In addition to improved writing quality, other benefits were opportunities for shared writing experiences, peer support, and increased confidence and motivation. The writing group provides a unique social learning environment with opportunities for professional dialogue about writing, peer learning and review, and developing a supportive peer network.
Thus, our research writing group has proved an effective avenue for building doctoral students’ capability in scholarly writing. The proposed model for a research writing group could be applicable to any context, regardless of the type and location of the university, university faculty, doctoral program structure, or number of postgraduate students. It could also be used within a group of students with diverse research abilities, needs, topics and methodologies. However, it requires a group facilitator with sufficient expertise in scholarly writing and experience in doctoral supervision who can both engage the group in planned writing activities and capitalise on fruitful lines of discussion related to students’ concerns as they arise. The research writing group is not intended to replace traditional supervision processes or existing training. However, it has clear benefits for improving scholarly writing in doctoral research programs, particularly in an era of rapidly increasing student load.
Abstract:
PPPs are held to be a powerful way of mobilising private finance and resources to deliver public infrastructure. Theoretically, research into procurement has begun to acknowledge difficulties with the classification and assessment of different types of procurement, particularly those which do not sufficiently acknowledge variety within specific types of procurement methods. This paper advances a theoretical framework based on an evolutionary economic conceptualisation of a routine, which can accommodate the variety evident in procurement projects, in particular PPPs. The paper tests how the various elements of a PPP, as advanced in the theoretical framework, affect performance across 10 case studies. It concludes that a limited number of elements of a PPP affect performance, and provides strong evidence for the theoretical model advanced in this paper.
Abstract:
When complex projects go wrong they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics to provide early warning of potential difficulties. The assessment of success of complex projects can be made by a range of stakeholders over different time scales, against different levels of project results: the project’s outputs at the end of the project; the project’s outcomes in the months following project completion; and the project’s impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells to show if a project is deviating from plan, so that early corrective action can be taken. It may be that different combinations of the leading performance indicators will be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology is described to evaluate the early parts of this model. Its implications and limitations are described. This paper describes work in progress.
Abstract:
This review explores the question of whether chemometrics methods enhance the performance of electroanalytical methods. Electroanalysis has long benefited from well-established techniques such as potentiometric titrations, polarography and voltammetry, and from more novel ones such as electronic tongues and noses, which have enlarged the scope of applications. The electroanalytical methods have been improved with the application of chemometrics for simultaneous quantitative prediction of analytes or qualitative resolution of complex overlapping responses. Typical methods include partial least squares (PLS), artificial neural networks (ANNs), and multiple curve resolution methods (MCR-ALS, N-PLS and PARAFAC). This review aims to provide the practising analyst with a broad guide to electroanalytical applications supported by chemometrics. In this context, after a general consideration of the use of a number of electroanalytical techniques with the aid of chemometrics methods, several overviews follow, each focusing on an important field of application such as food, pharmaceuticals, pesticides and the environment. The growth of chemometrics in conjunction with electronic tongue and nose sensors is highlighted, and this is followed by an overview of the use of chemometrics for the resolution of complicated profiles for qualitative identification of analytes, especially with the use of the MCR-ALS methodology. Finally, the performance of electroanalytical methods is compared with that of some spectrophotometric procedures on the basis of figures of merit. This showed that electroanalytical methods can perform as well as the spectrophotometric ones. PLS-1 appears to be the method of practical choice if a relative prediction error of approximately ±10% is acceptable.
Abstract:
The molecular and metal profile fingerprints were obtained from a complex substance, Atractylis chinensis DC—a traditional Chinese medicine (TCM)—with the use of high-performance liquid chromatography (HPLC) and inductively coupled plasma atomic emission spectroscopy (ICP-AES). This substance was used in this work as an example of a complex biological material which has found application as a TCM. Such TCM samples are traditionally processed by the Bran, Cut, Fried and Swill methods, and were collected from five provinces in China. The data matrices obtained from the two types of analysis produced two principal component biplots, which showed that the HPLC fingerprint data were discriminated on the basis of the methods for processing the raw TCM, while the metal analysis grouped according to geographical origin. When the two data matrices were combined into a single two-way matrix, the resulting biplot showed a clear separation on the basis of the HPLC fingerprints. Importantly, within each different grouping the objects separated according to their geographical origin, and they ranked approximately in the same order in each group. This result suggested that by using such an approach, it is possible to derive improved characterisation of the complex TCM materials on the basis of the two kinds of analytical data. In addition, two supervised pattern recognition methods, the K-nearest neighbors (KNN) method and linear discriminant analysis (LDA), were successfully applied to the individual data matrices, thus supporting the PCA approach.
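The chemometric workflow described above (unsupervised PCA for biplot coordinates, then KNN and LDA as supervised checks) can be sketched as follows. The data here are synthetic stand-ins for the HPLC/ICP-AES matrices, with two well-separated processing-method groups invented for illustration.

```python
# PCA for unsupervised grouping, then KNN and LDA as supervised checks.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# 40 samples x 20 variables; two processing-method groups offset from each other
labels = np.repeat([0, 1], 20)
X = rng.standard_normal((40, 20)) + labels[:, None] * 3.0

scores = PCA(n_components=2).fit_transform(X)   # coordinates for a biplot
knn = KNeighborsClassifier(n_neighbors=3).fit(X, labels)
lda = LinearDiscriminantAnalysis().fit(X, labels)
print(knn.score(X, labels), lda.score(X, labels))
```

With groups this cleanly separated, both classifiers recover the grouping that the PCA scores display; real fingerprint data would of course need cross-validation rather than training-set accuracy.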
Abstract:
Intelligent software agents are promising in improving the effectiveness of e-marketplaces for e-commerce. Although a large amount of research has been conducted to develop negotiation protocols and mechanisms for e-marketplaces, existing negotiation mechanisms are weak in dealing with complex and dynamic negotiation spaces often found in e-commerce. This paper illustrates a novel knowledge discovery method and a probabilistic negotiation decision making mechanism to improve the performance of negotiation agents. Our preliminary experiments show that the probabilistic negotiation agents empowered by knowledge discovery mechanisms are more effective and efficient than the Pareto optimal negotiation agents in simulated e-marketplaces.
Abstract:
Purpose – The purpose of this paper is to examine the use of bid information, including both price and non-price factors, in predicting the bidder’s performance.
Design/methodology/approach – The practice of the industry was first reviewed. Data on bid evaluation and performance records of the successful bids were then obtained from the Hong Kong Housing Department, the largest housing provider in Hong Kong. This was followed by the development of a radial basis function (RBF) neural network based performance prediction model.
Findings – It is found that public clients are more conscientious and include non-price factors in their bid evaluation equations. The input variables used contain information available at the time of the bid, and the output variable is the project performance score achieved by the successful bidder, recorded during work in progress. Past project performance score was found to be the most sensitive input variable in predicting future performance.
Research limitations/implications – The paper shows the inadequacy of using price alone as the bid award criterion. The need for systematic performance evaluation is also highlighted, as this information is highly instrumental for subsequent bid evaluations. The caveat for this study is that the prediction model was developed from data obtained from a single source.
Originality/value – The value of the paper is in the use of an RBF neural network as the prediction tool, because it can model non-linear functions. This capability avoids tedious “trial and error” in deciding the number of hidden layers to be used in the network model.
Keywords: Hong Kong, Construction industry, Neural nets, Modelling, Bid offer spreads
Paper type: Research paper
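A bare-bones version of the kind of RBF network regressor the paper describes might look like the sketch below: k-means selects the hidden-unit centres, Gaussian activations form the single hidden layer, and a linear readout is fitted on top. The data, unit count, and kernel width are illustrative assumptions, not the paper’s actual model or dataset.

```python
# Minimal RBF network: k-means centres + Gaussian hidden layer + linear readout.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

def rbf_features(X, centres, width):
    """Gaussian activations of each sample against each hidden-unit centre."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, (200, 3))          # stand-ins for price/non-price bid factors
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2    # a nonlinear "performance score"

centres = KMeans(n_clusters=30, n_init=10, random_state=0).fit(X).cluster_centers_
H = rbf_features(X, centres, width=0.6)
readout = Ridge(alpha=1e-3).fit(H, y)
print(f"training R^2 = {readout.score(H, y):.3f}")
```

Note how the architecture sidesteps the hidden-layer-count decision the abstract mentions: an RBF network has exactly one hidden layer, so only the number of centres and the width need tuning.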
Abstract:
As part of a Doctor of Business Administration degree programme jointly run by Curtin University, Perth, Australia and Lingnan University, Hong Kong, a research thesis relating organizational effectiveness to the organizational culture of Hong Kong construction firms involved in public housing is being undertaken. Organizational effectiveness is measured by the Housing Department (HD) Performance Assessment Scoring System (PASS) and organizational culture traits and strengths have been measured by using the Denison Organizational Culture Survey (OCS), developed by Daniel Denison and William S. Neale and based on 16 years of research involving over 1,000 organizations. The PASS scores of building contractors are compared with the OCS scores to determine if there is any significant correlation between highly effective companies and particular organizational strengths and traits. Profiles are then drawn using the Denison Model and can be compared against ‘norms’ for the industry sector on which the survey has been carried out. The next stage of the work is to present the results of the survey to individual companies, conduct focus group interviews to test the results, discover more detail on that company’s culture and discuss possible actions based on the results. It is in this latter stage that certain value management techniques may well prove very useful.
Abstract:
Successful project delivery of construction projects depends on many factors. With regard to the construction of a facility, selecting a competent contractor for the job is paramount. As such, various approaches have been advanced to facilitate tender award decisions. Essentially, this type of decision involves the prediction of a bidder’s performance based on information available at the tender stage. A neural network based prediction model was developed and presented in this paper. Project data for the study were obtained from the Hong Kong Housing Department. Information from the tender reports was used as input variables and performance records of the successful bidder during construction were used as output variables. It was found that the networks for the prediction of performance scores for Works gave the highest hit rate. In addition, the two most sensitive input variables for such prediction are “Difference between Estimate” and “Difference between the next closest bid”. Both input variables are price related, thus suggesting the importance of tender sufficiency for the assurance of quality production.
Abstract:
Aims: To assess the effectiveness of current treatment approaches to assist benzodiazepine discontinuation. Methods: A systematic review of approaches to benzodiazepine discontinuation in general practice and out-patient settings was undertaken. Routine care was compared with three treatment approaches: brief interventions, gradual dose reduction (GDR) and psychological interventions. GDR was compared with GDR plus psychological interventions or substitutive pharmacotherapies. Results: Inclusion criteria were met by 24 studies, and a further eight were identified by subsequent searching. GDR [odds ratio (OR) = 5.96, confidence interval (CI) = 2.08–17.11] and brief interventions (OR = 4.37, CI = 2.28–8.40) provided superior cessation rates at post-treatment to routine care. Psychological treatment plus GDR was superior to both routine care (OR = 3.38, CI = 1.86–6.12) and GDR alone (OR = 1.82, CI = 1.25–2.67). However, substitutive pharmacotherapies did not add to the impact of GDR (OR = 1.30, CI = 0.97–1.73), and abrupt substitution of benzodiazepines by other pharmacotherapy was less effective than GDR alone (OR = 0.30, CI = 0.14–0.64). Few studies on any technique had significantly greater benzodiazepine discontinuation than controls at follow-up. Conclusions: Providing an intervention is more effective than routine care. Psychological interventions may improve discontinuation above GDR alone. While some substitutive pharmacotherapies may have promise, current evidence is insufficient to support their use.
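For readers unfamiliar with the effect measure reported above, the snippet below computes an odds ratio with a 95% Wald confidence interval from a 2×2 cessation table. The counts are hypothetical and do not correspond to any included study.

```python
# Odds ratio with a 95% Wald CI from a 2x2 table (hypothetical counts).
import math

# rows: intervention / control; cols: quit / did not quit
a, b = 40, 20   # intervention: quit, did not quit
c, d = 15, 45   # control: quit, did not quit

or_ = (a * d) / (b * c)                         # cross-product odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)    # SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f}, 95% CI = {lo:.2f}-{hi:.2f}")
```

An interval that excludes 1 (as here) indicates a statistically significant effect, which is how the GDR and brief-intervention results above should be read, while the substitutive-pharmacotherapy interval (0.97–1.73) crosses 1.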
Abstract:
It is a significant challenge to clearly identify the boundary between positive and negative streams. Several attempts have used negative feedback to address this challenge; however, there are two issues in using negative relevance feedback to improve the effectiveness of information filtering. The first is how to select constructive negative samples in order to reduce the space of negative documents. The second is how to decide which noisy extracted features should be updated based on the selected negative samples. This paper proposes a pattern mining based approach to select some offenders from the negative documents, where an offender can be used to reduce the side effects of noisy features. It also classifies extracted features (i.e., terms) into three categories: positive specific terms, general terms, and negative specific terms. In this way, multiple revising strategies can be used to update the extracted features. An iterative learning algorithm is also proposed to implement this approach on RCV1, and substantial experiments show that the proposed approach achieves encouraging performance.
Abstract:
This paper investigates self-Googling through the monitoring of search engine activities of users and adds to the few quantitative studies on this topic already in existence. We explore this phenomenon by answering the following questions: To what extent is self-Googling visible in the usage of search engines? Is any significant difference measurable between queries related to self-Googling and generic search queries? To what extent do self-Googling search requests match the selected personalised Web pages? To address these questions we explore the theory of narcissism in order to help define self-Googling and present the results from a 14-month online experiment using Google search engine usage data.
Abstract:
An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined based on the users’ needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness, and they are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient, and these approaches have to deal with low-frequency pattern issues. The measures used by data mining techniques (for example, “support” and “confidence”) to learn the profile have turned out to be unsuitable for filtering, as they can lead to a mismatch problem. This thesis uses rough set-based reasoning (term-based) and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: topic filtering and pattern mining. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of threshold setting have been developed by using rough set decision theory.
The second stage (pattern mining) aims at solving the problem of information mismatch. This stage is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy. The most likely relevant documents are assigned higher scores by the ranking function. Because relatively few documents remain after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system was improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely, the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both the term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model and the state-of-the-art term-based models, including BM25, Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
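One of the term-based baselines named above, BM25, can be sketched compactly. The toy corpus and the parameter defaults (k1 = 1.2, b = 0.75) are illustrative, not the thesis’s experimental setup.

```python
# Compact BM25 scorer over a toy corpus.
import math
from collections import Counter

docs = [
    "pattern mining for information filtering".split(),
    "term based user profile learning".split(),
    "filtering large document streams".split(),
]
N = len(docs)
avgdl = sum(len(d) for d in docs) / N
df = Counter(t for d in docs for t in set(d))   # document frequency per term

def bm25(query, doc, k1=1.2, b=0.75):
    tf = Counter(doc)
    score = 0.0
    for term in query:
        if term not in tf:
            continue
        idf = math.log(1 + (N - df[term] + 0.5) / (df[term] + 0.5))
        norm = tf[term] * (k1 + 1) / (tf[term] + k1 * (1 - b + b * len(doc) / avgdl))
        score += idf * norm
    return score

scores = [bm25("information filtering".split(), d) for d in docs]
```

Here the first document matches both query terms, the third matches one, and the second matches none, so the scores rank them in that order; term-based scoring of this kind is exactly what the pattern-based second stage is designed to improve upon.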
Abstract:
Censorship and Performance, edited by Tom Sellar, examines the politics of censorship, and continuing contests over the ‘right’ to claim theatrical and cultural stages for controversial forms of social and self representation, at the start of the twenty-first century. In bringing this collection together, Sellar has taken a broad-based approach to the concept of censorship in theatrical performance—and, indeed, to the concept of theatrical performance itself. Sellar and his contributors clearly accept that the surveillance, suppression and restriction of specific forms of representation constitute a complex, culturally specific phenomenon. In this sense, Censorship and Performance addresses direct political control over content, as well as thornier arguments about media controversy, moral panic, and the politics of self-censorship amongst artists and arts organisations.
Abstract:
This study investigated the effects of visual status, driver age and the presence of secondary distracter tasks on driving performance. Twenty young (M = 26.8 years) and nineteen older (M = 70.2 years) participants drove around a closed-road circuit under three visual conditions (normal, simulated cataracts, blur) and three distracter conditions (none, visual, auditory). Simulated visual impairment, increased driver age and the presence of a distracter task detrimentally affected all measures of driving performance except gap judgments and lane keeping. Significant interaction effects were evident between visual status, age and distracters; simulated cataracts had the most negative impact on performance in the presence of visual distracters, and a more negative impact for older drivers. The implications of these findings for driving behaviour and acquisition of driving-related information for people with common visual impairments are discussed.