420 results for Optimal fusion performance
Abstract:
PPPs are held to be a powerful way of mobilising private finance and resources to deliver public infrastructure. Theoretically, research into procurement has begun to acknowledge difficulties with the classification and assessment of different types of procurement, particularly approaches which do not sufficiently acknowledge variety within specific types of procurement methods. This paper advances a theoretical framework, based on an evolutionary economic conceptualisation of a routine, which can accommodate the variety evident in procurement projects, in particular PPPs. The paper tests how the various elements of a PPP, as advanced in the theoretical framework, affect performance across 10 case studies. It concludes that a limited number of elements of a PPP affect its performance, and provides strong evidence for the theoretical model advanced in this paper.
Abstract:
When complex projects go wrong they can go horribly wrong, with severe financial consequences. We are undertaking research to develop leading performance indicators for complex projects: metrics that provide early warning of potential difficulties. The assessment of success of complex projects can be made by a range of stakeholders over different time scales, against different levels of project results: the project's outputs at the end of the project; the project's outcomes in the months following project completion; and the project's impact in the years following completion. We aim to identify leading performance indicators, which may include both success criteria and success factors, and which can be measured by the project team during project delivery to forecast success as assessed by key stakeholders in the days, months and years following the project. The hope is that the leading performance indicators will act as alarm bells, showing when a project is deviating from plan so that early corrective action can be taken. It may be that different combinations of the leading performance indicators will be appropriate depending on the nature of project complexity. In this paper we develop a new model of project success, whereby success is assessed by different stakeholders over different time frames against different levels of project results. We then relate this to measurements that can be taken during project delivery. A methodology is described to evaluate the early parts of this model, and its implications and limitations are discussed. This paper describes work in progress.
Abstract:
This review explores the question of whether chemometrics methods enhance the performance of electroanalytical methods. Electroanalysis has long benefited from well-established techniques such as potentiometric titrations, polarography and voltammetry, and from more novel ones such as electronic tongues and noses, which have enlarged the scope of applications. Electroanalytical methods have been improved by the application of chemometrics for the simultaneous quantitative prediction of analytes or the qualitative resolution of complex overlapping responses. Typical methods include partial least squares (PLS), artificial neural networks (ANNs), and multivariate curve resolution methods (MCR-ALS, N-PLS and PARAFAC). This review aims to provide the practising analyst with a broad guide to electroanalytical applications supported by chemometrics. In this context, after a general consideration of the use of a number of electroanalytical techniques with the aid of chemometrics methods, several overviews follow, each focusing on an important field of application such as food, pharmaceuticals, pesticides and the environment. The growth of chemometrics in conjunction with electronic tongue and nose sensors is highlighted, and this is followed by an overview of the use of chemometrics for the resolution of complicated profiles for the qualitative identification of analytes, especially with the use of the MCR-ALS methodology. Finally, the performance of electroanalytical methods is compared with that of some spectrophotometric procedures on the basis of figures of merit. This comparison showed that electroanalytical methods can perform as well as spectrophotometric ones. PLS-1 appears to be the method of practical choice if a %relative prediction error of approximately ±10% is acceptable.
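As an illustration of the PLS-1 calibration and the %relative prediction error figure of merit mentioned above, the following sketch fits a PLS model to simulated, overlapping voltammograms; the data, peak positions and error threshold are placeholders, not taken from the review.

```python
# Minimal sketch (simulated data, not from the review): PLS-1 calibration of a
# single analyte from overlapping voltammetric responses, reporting the
# %relative prediction error (RPE) figure of merit.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
potentials = np.linspace(-0.2, 0.8, 120)              # hypothetical potential axis (V)
conc = rng.uniform(1.0, 10.0, size=80)                # analyte concentration (a.u.)
peak = np.exp(-((potentials - 0.30) / 0.05) ** 2)     # analyte peak shape
interf = np.exp(-((potentials - 0.38) / 0.07) ** 2)   # overlapping interferent
X = (conc[:, None] * peak
     + rng.uniform(2.0, 6.0, size=(80, 1)) * interf
     + rng.normal(0, 0.02, size=(80, potentials.size)))  # simulated voltammograms
y = conc

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
y_hat = pls.predict(X_te).ravel()

rpe = 100 * np.sqrt(np.sum((y_te - y_hat) ** 2) / np.sum(y_te ** 2))
print(f"%RPE = {rpe:.1f}")   # the review cites ~±10% as a practical threshold
```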
Abstract:
The molecular and metal profile fingerprints were obtained from a complex substance, Atractylis chinensis DC, a traditional Chinese medicine (TCM), with the use of high performance liquid chromatography (HPLC) and inductively coupled plasma atomic emission spectroscopy (ICP-AES). This substance was used in this work as an example of a complex biological material which has found application as a TCM. Such TCM samples are traditionally processed by the Bran, Cut, Fried and Swill methods, and were collected from five provinces in China. The data matrices obtained from the two types of analysis produced two principal component biplots, which showed that the HPLC fingerprint data were discriminated on the basis of the methods used to process the raw TCM, while the metal analysis data grouped according to geographical origin. When the two data matrices were combined into one two-way matrix, the resulting biplot showed a clear separation on the basis of the HPLC fingerprints. Importantly, within each grouping the objects separated according to their geographical origin, and they ranked in approximately the same order in each group. This result suggested that such an approach makes it possible to derive an improved characterisation of complex TCM materials on the basis of the two kinds of analytical data. In addition, two supervised pattern recognition methods, the K-nearest neighbors (KNN) method and linear discriminant analysis (LDA), were successfully applied to the individual data matrices, thus supporting the PCA approach.
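The data-fusion step described above, combining two analytical data matrices into one two-way matrix before principal component analysis, can be sketched as follows; the matrices here are random stand-ins for the HPLC and ICP-AES data, and the block-wise autoscaling is an assumption rather than the paper's exact pre-treatment.

```python
# Minimal sketch (placeholder data, not the paper's code): column-wise fusion
# of an HPLC fingerprint matrix and an ICP-AES metal matrix into one two-way
# matrix, followed by PCA for biplot-style inspection of sample groupings.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_samples = 25                       # hypothetical TCM samples
hplc = rng.random((n_samples, 40))   # chromatographic peak areas (stand-in)
icp = rng.random((n_samples, 12))    # metal concentrations (stand-in)

# autoscale each block so neither dominates, then concatenate column-wise
fused = np.hstack([StandardScaler().fit_transform(hplc),
                   StandardScaler().fit_transform(icp)])

scores = PCA(n_components=2).fit_transform(fused)
print(scores[:3])   # PC1/PC2 scores used to look for processing/origin groupings
```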
Abstract:
Cognitive-energetical theories of information processing were used to generate predictions regarding the relationship between workload and fatigue within and across consecutive days of work. Repeated measures were taken on board a naval vessel during a non-routine and a routine patrol. Data were analyzed using growth curve modeling. Fatigue demonstrated a non-monotonic pattern within days in both patrols: fatigue was high at midnight, decreased until noon and then increased again. Fatigue increased across days towards the end of the non-routine patrol, but remained stable across days in the routine patrol. The relationship between workload and fatigue changed over consecutive days in the non-routine patrol: at the beginning of the patrol, low workload was associated with fatigue, whereas at the end of the patrol, high workload was associated with fatigue. This relationship could not be tested in the routine patrol; however, the routine patrol demonstrated a non-monotonic relationship between workload and fatigue, with both low and high workloads associated with the highest fatigue. These results suggest that the optimal level of workload can change over time, and thus have implications for the management of fatigue.
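Growth curve modeling of this kind is typically implemented as a mixed-effects model with polynomial time terms; the sketch below is a generic illustration with hypothetical variable names and simulated data, not the study's actual analysis.

```python
# Minimal sketch (assumed model form, simulated data): a growth-curve style
# mixed model for fatigue with a quadratic time-of-day term (allowing the
# non-monotonic within-day pattern) and a workload-by-day interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_crew, n_obs = 20, 30
df = pd.DataFrame({
    "crew": np.repeat(np.arange(n_crew), n_obs),
    "hour": np.tile(np.linspace(0, 24, n_obs), n_crew),   # time of day
    "day": np.tile(np.linspace(0, 14, n_obs), n_crew),    # day of patrol
    "workload": rng.random(n_crew * n_obs),
})
df["fatigue"] = (0.02 * (df.hour - 12) ** 2 + 0.05 * df.day
                 + 0.3 * df.workload * df.day / 14
                 + rng.normal(0, 0.5, len(df)))

# random intercept per crew member; fixed quadratic time and workload x day terms
model = smf.mixedlm("fatigue ~ hour + I(hour**2) + workload * day",
                    data=df, groups=df["crew"]).fit()
print(model.summary())
```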
Abstract:
This study explores whether the relation between internal audit quality and firm performance is associated with firm characteristics of information asymmetry and uncertainty (growth opportunities) and certain governance controls (audit committee effectiveness). The results from this preliminary study of 60 Malaysian companies show that the association between internal audit quality and firm performance is stronger for firms with high growth opportunities, and that this positive association is weakened by increasing audit committee independence. These findings demonstrate the internal auditor's conflicting roles and call into question the governance recommendations that require all members of the audit committee to be non-executive directors.
Abstract:
Intelligent software agents show promise for improving the effectiveness of e-marketplaces for e-commerce. Although a large amount of research has been conducted to develop negotiation protocols and mechanisms for e-marketplaces, existing negotiation mechanisms are weak in dealing with the complex and dynamic negotiation spaces often found in e-commerce. This paper presents a novel knowledge discovery method and a probabilistic negotiation decision making mechanism to improve the performance of negotiation agents. Our preliminary experiments show that the probabilistic negotiation agents empowered by knowledge discovery mechanisms are more effective and efficient than Pareto optimal negotiation agents in simulated e-marketplaces.
Abstract:
To allocate and size capacitors in a distribution system, an optimization algorithm called Discrete Particle Swarm Optimization (DPSO) is employed in this paper. The objective is to minimize the cost of transmission line losses plus the cost of the capacitors. During the optimization procedure, the bus voltages, the feeder currents and the reactive power flowing back to the source side must be maintained within standard levels. To validate the proposed method, the semi-urban distribution system connected to bus 2 of the Roy Billinton Test System (RBTS) is used. This 37-bus distribution system has 22 loads located on the secondary side of a distribution substation (33/11 kV). By reducing the transmission line loss in a standard system in which line losses account for only about 6.6 percent of total power, the capabilities of the proposed technique are validated.
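A discrete particle swarm optimizer of the kind described can be sketched as follows; the loss evaluation is a toy stand-in for a full load-flow calculation, the cost coefficients and constraint penalty are hypothetical, and the particle update follows the common inertia-plus-cognitive/social form rather than the paper's exact variant.

```python
# Minimal sketch (toy objective, not the paper's implementation): a discrete PSO
# that picks one capacitor size per candidate bus to minimise loss cost plus
# capacitor cost, with constraints reduced to a simple penalty term.
import numpy as np

rng = np.random.default_rng(3)
SIZES_KVAR = np.array([0, 150, 300, 450, 600])   # discrete capacitor sizes
N_BUS, N_PART, N_ITER = 8, 30, 100
KW_COST, KVAR_COST = 168.0, 5.0                  # hypothetical $/kW and $/kvar

def loss_cost(choice):
    """Placeholder for a load-flow based loss evaluation plus penalties."""
    kvar = SIZES_KVAR[choice]
    losses = 100.0 - 0.04 * kvar.sum() + 2e-5 * (kvar.sum() ** 2)   # toy loss curve (kW)
    penalty = 1e4 * max(0.0, kvar.sum() - 2000)                     # e.g. reverse-flow limit
    return KW_COST * losses + KVAR_COST * kvar.sum() + penalty

pos = rng.integers(0, len(SIZES_KVAR), size=(N_PART, N_BUS))   # size index per bus
vel = np.zeros((N_PART, N_BUS))
pbest, pbest_f = pos.copy(), np.array([loss_cost(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(N_ITER):
    r1, r2 = rng.random((2, N_PART, N_BUS))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(np.rint(pos + vel), 0, len(SIZES_KVAR) - 1).astype(int)
    fit = np.array([loss_cost(p) for p in pos])
    better = fit < pbest_f
    pbest[better], pbest_f[better] = pos[better], fit[better]
    gbest = pbest[pbest_f.argmin()].copy()

print("best sizes (kvar):", SIZES_KVAR[gbest], "cost:", round(pbest_f.min(), 1))
```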
Abstract:
Purpose – The purpose of this paper is to examine the use of bid information, including both price and non-price factors, in predicting the bidder's performance. Design/methodology/approach – The practice of the industry was first reviewed. Data on bid evaluation and performance records of the successful bids were then obtained from the Hong Kong Housing Department, the largest housing provider in Hong Kong. This was followed by the development of a radial basis function (RBF) neural network based performance prediction model. Findings – It is found that public clients are more conscientious and include non-price factors in their bid evaluation equations. The input variables used are items of information available at the time of the bid, and the output variable is the project performance score achieved by the successful bidder, recorded while work is in progress. It was found that the past project performance score is the most sensitive input variable in predicting future performance. Research limitations/implications – The paper shows the inadequacy of using price alone as the bid award criterion. The need for systematic performance evaluation is also highlighted, as this information is highly instrumental for subsequent bid evaluations. The caveat for this study is that the prediction model was developed based on data obtained from a single source. Originality/value – The value of the paper is in the use of an RBF neural network as the prediction tool because it can model non-linear functions. This capability avoids tedious "trial and error" in deciding the number of hidden layers to be used in the network model. Keywords: Hong Kong, Construction industry, Neural nets, Modelling, Bid offer spreads. Paper type: Research paper
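The RBF network used as the prediction tool can be illustrated with a minimal sketch: radial basis activations centred on k-means prototypes followed by a linear readout. The feature set, data and hyperparameters below are hypothetical placeholders, not the Housing Department model.

```python
# Minimal sketch (hypothetical features and data): a radial basis function
# network built from k-means centres, Gaussian activations and a linear
# readout, mapping bid-stage variables to a performance score.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(4)
# columns: past performance score, % difference from estimate, % diff from next bid
X = rng.random((120, 3))
y = 60 + 25 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 2, 120)   # synthetic score

centres = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
width = np.median(np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2))

def rbf_features(X):
    """Gaussian hidden-layer activations with respect to the k-means centres."""
    d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    return np.exp(-(d ** 2) / (2 * width ** 2))

readout = Ridge(alpha=1e-3).fit(rbf_features(X), y)
print("predicted score:", readout.predict(rbf_features(X[:1]))[0])
```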
Abstract:
The paper charts the history and development of the Hong Kong Housing Department (HKHD) Performance Assessment Scoring System (PASS) from 1990 to the present day and examines its effect on facilitating change to the quality of construction work of building contractors engaged in the production of public sector housing projects in Hong Kong. The paper builds partly on empirical research carried out by the author as part of a doctoral thesis from 2000 to 2005, partly on experiential knowledge, and also on some relevant case studies. The outcomes from this earlier research, and the validation of PASS based on results derived from the system since the research was originally undertaken, are of benefit to practitioners and academics working and studying in the areas of performance assessment and organisational culture and change. The conclusions presented in the paper further underpin the connection established in previous research between strong organisational culture and project and corporate success. Organisational culture was measured using a survey instrument originally developed by Denison and Neale (1994), adapted for the environment of the study, and corporate success was measured by the PASS system mentioned above. The major results of the original study indicate that there is a significant linkage between strong organisational cultures and business success, and the detailed findings were that: (1) strong organisational culture was positively associated with a high level of company effectiveness; (2) a high level of company effectiveness was positively associated with the cultural traits of 'consistency', 'adaptability' and 'mission'; and (3) a high level of company effectiveness was positively associated with the combined cultural traits represented by the dimensions of 'external focus' and 'stable culture'. Several opportunities to take this research forward have been identified, including extending the study to other countries and also longitudinally re-evaluating some of the original case studies to ascertain how organisational cultures have changed or further developed in relation to the changing construction climate in Hong Kong.
Abstract:
Successful delivery of construction projects depends on many factors. With regard to the construction of a facility, selecting a competent contractor for the job is paramount. As such, various approaches have been advanced to facilitate tender award decisions. Essentially, this type of decision involves the prediction of a bidder's performance based on information available at the tender stage. A neural network based prediction model was developed and is presented in this paper. Project data for the study were obtained from the Hong Kong Housing Department. Information from the tender reports was used as input variables, and performance records of the successful bidder during construction were used as output variables. It was found that the networks for the prediction of performance scores for Works gave the highest hit rate. In addition, the two most sensitive input variables for such prediction are "Difference between Estimate" and "Difference between the next closest bid". Both input variables are price related, thus suggesting the importance of tender sufficiency for the assurance of quality production.
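The sensitivity ranking of input variables reported above can be illustrated with a simple one-at-a-time perturbation test on a trained model; the model, feature names and data below are hypothetical, and the procedure is a generic stand-in rather than the paper's actual sensitivity analysis.

```python
# Minimal sketch (hypothetical model and data): rank how sensitive a trained
# prediction model is to each tender-stage input by perturbing one input at a
# time and measuring the change in the predicted performance score.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
names = ["diff_from_estimate", "diff_from_next_bid", "past_score", "tender_price"]
X = rng.random((200, len(names)))
y = 50 + 30 * X[:, 0] + 20 * X[:, 1] + 5 * X[:, 2] + rng.normal(0, 1, 200)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                     random_state=0).fit(X, y)

base = model.predict(X)
for j, name in enumerate(names):
    X_pert = X.copy()
    X_pert[:, j] += 0.1 * X[:, j].std()           # small push on one input
    delta = np.abs(model.predict(X_pert) - base).mean()
    print(f"{name:>20s}: mean |change in prediction| = {delta:.3f}")
```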
Theoretical and numerical investigation of plasmon nanofocusing in metallic tapered rods and grooves
Abstract:
Effective focusing of electromagnetic (EM) energy to nanoscale regions is one of the major challenges in nano-photonics and plasmonics. The strong localization of the optical energy into regions much smaller than allowed by the diffraction limit, also called nanofocusing, offers promising applications in nano-sensor technology, nanofabrication, near-field optics or spectroscopy. One of the most promising solutions to the problem of efficient nanofocusing is related to surface plasmon propagation in metallic structures. Metallic tapered rods, commonly used as probes in near field microscopy and spectroscopy, are of particular interest. They can provide very strong EM field enhancement at the tip due to surface plasmons (SPs) propagating towards the tip of the tapered metal rod. A large number of studies have been devoted to the manufacturing process of tapered rods or tapered fibers coated by a metal film. On the other hand, structures such as metallic V-grooves or metal wedges can also provide strong electric field enhancements, but manufacturing of these structures is still a challenge. It has been shown, however, that the attainable electric field enhancement at the apex in the V-groove is higher than at the tip of a metal tapered rod when the dissipation level in the metal is strong. Metallic V-grooves also have very promising characteristics as plasmonic waveguides. This thesis will present a thorough theoretical and numerical investigation of nanofocusing during plasmon propagation along a metal tapered rod and into a metallic V-groove. Optimal structural parameters, including optimal taper angle, taper length and shape of the taper, are determined in order to achieve maximum field enhancement factors at the tip of the nanofocusing structure. An analytical investigation of plasmon nanofocusing by metal tapered rods is carried out by means of the geometric optics approximation (GOA), which is also called adiabatic nanofocusing. However, GOA is applicable only for analysing tapered structures with small taper angles and without a terminating tip structure, so that reflections can be neglected. Rigorous numerical methods are employed for analysing non-adiabatic nanofocusing by tapered rods and V-grooves with larger taper angles and rounded tips. These structures cannot be studied by analytical methods due to the presence of reflected waves from the taper section, the tip and also from (artificial) computational boundaries. A new method is introduced to combine the advantages of GOA and rigorous numerical methods in order to reduce significantly the use of computational resources and yet achieve accurate results for the analysis of large tapered structures, within reasonable calculation time. Detailed comparison between GOA and rigorous numerical methods will be carried out in order to find the critical taper angle of the tapered structures at which GOA is still applicable. It will be demonstrated that optimal taper angles, at which maximum field enhancements occur, coincide with the critical angles at which GOA is still applicable. It will be shown that the applicability of GOA can be substantially expanded to include structures which could previously be analysed by numerical methods only. The influence of the rounded tip, the taper angle and dissipation on the plasmon field distribution along the tapered rod and near the tip will be analysed analytically and numerically in detail.
It will be demonstrated that electric field enhancement factors of up to ~2500 within nanoscale regions are predicted. These are sufficient, for instance, to detect single molecules using surface enhanced Raman spectroscopy (SERS) with the tip of a tapered rod, an approach also known as tip enhanced Raman spectroscopy or TERS. The results obtained in this project will be important for applications for which strong local field enhancement factors are crucial to the performance of devices such as near field microscopes and spectrometers. The optimal design of nanofocusing structures, at which the delivery of electromagnetic energy to the nanometer region is most efficient, will lead to new applications in near field sensors, near field measuring technology, or the generation of nanometer sized energy sources. This includes: applications in tip enhanced Raman spectroscopy (TERS); manipulation of nanoparticles and molecules; efficient coupling of optical energy into and out of plasmonic circuits; second harmonic generation in non-linear optics; or delivery of energy to quantum dots, for instance, for quantum computations.
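For context, the applicability of the geometric optics (adiabatic) approximation mentioned above is conventionally expressed through an adiabatic parameter; the criterion below follows standard treatments of plasmon nanofocusing and is not quoted from the thesis.

```latex
% Adiabaticity (GOA/WKB) criterion: the local plasmon wavenumber k(z) must
% vary slowly along the taper axis z for adiabatic nanofocusing to hold.
\[
  \delta(z) \,=\, \left|\frac{\mathrm{d}}{\mathrm{d}z}\,\frac{1}{\operatorname{Re}\,k(z)}\right| \,\ll\, 1
\]
% Larger taper angles violate this condition, which is why rigorous numerical
% methods are needed for the non-adiabatic regime discussed in the abstract.
```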
Abstract:
Surveillance networks are typically monitored by a few people, viewing several monitors displaying the camera feeds. It is then very difficult for a human operator to effectively detect events as they happen. Recently, computer vision research has begun to address ways to automatically process some of this data, to assist human operators. Object tracking, event recognition, crowd analysis and human identification at a distance are being pursued as a means to aid human operators and improve the security of areas such as transport hubs. The task of object tracking is key to the effective use of more advanced technologies. To recognize an event, people and objects must be tracked. Tracking also enhances the performance of tasks such as crowd analysis or human identification. Before an object can be tracked, it must be detected. Motion segmentation techniques, widely employed in tracking systems, produce a binary image in which objects can be located. However, these techniques are prone to errors caused by shadows and lighting changes. Detection routines often fail, either due to erroneous motion caused by noise and lighting effects, or due to the detection routines being unable to split occluded regions into their component objects. Particle filters can be used as a self-contained tracking system, and make it unnecessary for the task of detection to be carried out separately, except for an initial (often manual) detection to initialise the filter. Particle filters use one or more extracted features to evaluate the likelihood of an object existing at a given point in each frame. Such systems, however, do not easily allow for multiple objects to be tracked robustly, and do not explicitly maintain the identity of tracked objects. This dissertation investigates improvements to the performance of object tracking algorithms through improved motion segmentation and the use of a particle filter. A novel hybrid motion segmentation / optical flow algorithm, capable of simultaneously extracting multiple layers of foreground and optical flow in surveillance video frames, is proposed. The algorithm is shown to perform well in the presence of adverse lighting conditions, and the optical flow is capable of extracting a moving object. The proposed algorithm is integrated within a tracking system and evaluated using the ETISEO (Evaluation du Traitement et de l'Interpretation de Sequences vidEO, evaluation for video understanding) database, and significant improvement in detection and tracking performance is demonstrated when compared to a baseline system. A Scalable Condensation Filter (SCF), a particle filter designed to work within an existing tracking system, is also developed. The creation and deletion of modes and the maintenance of identity are handled by the underlying tracking system, and the tracking system is able to benefit from the improved performance a particle filter provides in uncertain conditions arising from occlusion and noise. The system is evaluated using the ETISEO database. The dissertation then investigates fusion schemes for multi-spectral tracking systems. Four fusion schemes for combining a thermal and a visual colour modality are evaluated using the OTCBVS (Object Tracking and Classification in and Beyond the Visible Spectrum) database. It is shown that a middle fusion scheme yields the best results and demonstrates a significant improvement in performance when compared to a system using either modality individually.
Findings from the thesis contribute to improving the performance of semi-automated video processing, and therefore to improving security in areas under surveillance.
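A particle filter with feature-level ("middle") fusion of thermal and colour information can be sketched as below; the motion model, patch descriptors and frames are hypothetical placeholders rather than the Scalable Condensation Filter itself.

```python
# Minimal sketch (placeholder features and frames, not the SCF): one particle
# filter step where each particle's likelihood is computed from fused colour
# and thermal patch descriptors ("middle" / feature-level fusion).
import numpy as np

rng = np.random.default_rng(6)
N = 200                                                     # number of particles
particles = rng.normal([120.0, 80.0], 5.0, size=(N, 2))     # (x, y) hypotheses
weights = np.full(N, 1.0 / N)

def fused_features(frame_colour, frame_thermal, xy):
    """Concatenate simple colour and thermal patch statistics at a location."""
    x = int(xy[0]) % frame_colour.shape[1]
    y = int(xy[1]) % frame_colour.shape[0]
    patch_c = frame_colour[max(0, y - 4):y + 4, max(0, x - 4):x + 4]
    patch_t = frame_thermal[max(0, y - 4):y + 4, max(0, x - 4):x + 4]
    return np.array([patch_c.mean(), patch_c.std(), patch_t.mean(), patch_t.std()])

def step(frame_colour, frame_thermal, target_descriptor):
    """Predict, weight on the fused features, estimate, then resample."""
    global particles, weights
    particles += rng.normal(0, 2.0, particles.shape)        # random-walk motion model
    feats = np.array([fused_features(frame_colour, frame_thermal, p)
                      for p in particles])
    dist = np.linalg.norm(feats - target_descriptor, axis=1)
    weights = np.exp(-0.5 * (dist / 0.2) ** 2) + 1e-12      # Gaussian likelihood
    weights /= weights.sum()
    estimate = weights @ particles                          # weighted mean state
    idx = rng.choice(N, size=N, p=weights)                  # resample particles
    particles = particles[idx]
    return estimate

# usage with synthetic single-channel stand-ins for colour and thermal frames
colour, thermal = rng.random((240, 320)), rng.random((240, 320))
target = fused_features(colour, thermal, (120, 80))
print("estimate:", step(colour, thermal, target))
```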
Abstract:
This paper investigates self-Googling through the monitoring of users' search engine activities and adds to the few quantitative studies on this topic already in existence. We explore this phenomenon by answering the following questions: To what extent is self-Googling visible in the usage of search engines? Is any significant difference measurable between queries related to self-Googling and generic search queries? To what extent do self-Googling search requests match the selected personalised Web pages? To address these questions we explore the theory of narcissism in order to help define self-Googling, and present the results from a 14-month online experiment using Google search engine usage data.
Abstract:
Censorship and Performance, edited by Tom Sellar, examines the politics of censorship, and continuing contests over the 'right' to claim theatrical and cultural stages for controversial forms of social and self representation, at the start of the twenty-first century. In bringing this collection together, Sellar has taken a broad-based approach to the concept of censorship in theatrical performance, and, indeed, to the concept of theatrical performance itself. Sellar and his contributors clearly accept that the surveillance, suppression and restriction of specific forms of representation are complex, culturally specific phenomena. In this sense, Censorship and Performance addresses direct political control over content, as well as thornier arguments about media controversy, moral panic, and the politics of self-censorship amongst artists and arts organisations.