991 results for optimize


Relevance:

10.00%

Publisher:

Abstract:

In the tender process, contractors often rely on subcontract and supply enquiries to calculate their bid prices. However, this integral part of the bidding process is not empirically articulated in the literature. Over 30 publications on contractors' tendering processes that discuss enquiries were reviewed and found to be based mainly on experiential knowledge rather than systematic evidence. The empirical research here helps to describe the process of enquiries precisely, to improve it in practice, and to provide some basis to support it in theory. Using a live participant observation case study approach, the whole tender process was shadowed in the offices of two of the top 20 UK civil engineering construction firms. This helped to investigate 15 research questions on how contractors enquire and obtain prices from subcontractors and suppliers. Forty-three subcontract enquiries and 18 supply enquiries were made across two different projects with an average value of 7m. An average of 15 subcontract packages and seven supply packages was involved; thus, two or three subcontractors or suppliers were invited to bid in each package. All enquiries were formulated by the estimator, with occasional involvement of three other personnel. Most subcontract prices were received in an average of 14 working days, and supply prices in five. The findings show 10 main activities involved in processing enquiries and their durations, as well as wasteful practices associated with enquiries. Contractors should limit their enquiry invitations to a maximum of three per package and optimize the waiting time for quotations in order to improve cost efficiency.

Relevance:

10.00%

Publisher:

Abstract:

Constructing a building is a long process which can take several years. Most building services products are installed while a building is constructed, but they are not operated until the building is commissioned. The warranty term for the building services systems may cover the time from their installation to the end of the warranty period. Prior to the commissioning of the building, the building services systems are protected by warranty although they are not operated. The burn-in time for such systems is important when warranty cost is analyzed. In this paper, warranty cost models for products with burn-in periods are presented. Two burn-in policies are developed to optimize the total mean warranty cost. A special case on the relationship between the failure rates of the product in the dormant state and in the operating state is presented.

Relevance:

10.00%

Publisher:

Abstract:

Many kernel classifier construction algorithms adopt classification accuracy as the performance metric in model evaluation. Moreover, equal weighting is often applied to each data sample in parameter estimation. These modeling practices often become problematic if the data sets are imbalanced. We present a kernel classifier construction algorithm using orthogonal forward selection (OFS) in order to optimize the model generalization for imbalanced two-class data sets. This kernel classifier identification algorithm is based on a new regularized orthogonal weighted least squares (ROWLS) estimator and the model selection criterion of maximal leave-one-out area under the receiver operating characteristic (ROC) curve (LOO-AUC). It is shown that, owing to the orthogonalization procedure, the LOO-AUC can be calculated via an analytic formula based on the new regularized orthogonal weighted least squares parameter estimator, without actually splitting the estimation data set. The proposed algorithm can achieve minimal computational expense via a set of forward recursive updating formulae in searching for model terms with maximal incremental LOO-AUC value. Numerical examples are used to demonstrate the efficacy of the algorithm.
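The core ideas of the abstract above, AUC as the selection criterion and greedy forward selection of kernel terms, can be illustrated with a minimal sketch. This is not the authors' ROWLS/LOO-AUC algorithm: for clarity it uses plain regularized least squares and training-set AUC instead of the analytic leave-one-out formula, and all names (`auc`, `forward_select`) are illustrative.

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank (Mann-Whitney) statistic."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    diff = pos[:, None] - neg[None, :]
    # fraction of (positive, negative) pairs ranked correctly; ties count half
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / (len(pos) * len(neg))

def forward_select(K, y, n_terms, lam=1e-3):
    """Greedy forward selection of kernel columns maximizing AUC.
    K: (n, m) candidate kernel matrix; y: labels in {0, 1}."""
    chosen = []
    for _ in range(n_terms):
        best, best_auc = None, -1.0
        for j in range(K.shape[1]):
            if j in chosen:
                continue
            cols = K[:, chosen + [j]]
            # regularized least-squares fit of the candidate model
            w = np.linalg.solve(cols.T @ cols + lam * np.eye(cols.shape[1]),
                                cols.T @ y)
            a = auc(cols @ w, y)
            if a > best_auc:
                best, best_auc = j, a
        chosen.append(best)
    return chosen
```

Because AUC depends only on the ranking of scores, it is insensitive to class imbalance in a way that raw accuracy is not, which is the motivation the abstract gives for replacing accuracy as the selection criterion.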

Relevance:

10.00%

Publisher:

Abstract:

We propose a simple and computationally efficient construction algorithm for two-class linear-in-the-parameters classifiers. In order to optimize model generalization, an orthogonal forward selection (OFS) procedure is used to minimize the leave-one-out (LOO) misclassification rate directly. An analytic formula and a set of forward recursive updating formulae for the LOO misclassification rate are developed and applied in the proposed algorithm. Numerical examples are used to demonstrate that the proposed algorithm is an excellent alternative approach for constructing sparse two-class classifiers in terms of performance and computational efficiency.
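The abstract's key point, that the LOO error of a linear-in-the-parameters model can be obtained analytically without refitting, can be sketched with the well-known hat-matrix identity for least squares, e_i^loo = e_i / (1 - H_ii). This is a simplified stand-in for the paper's recursive formulae, not the authors' exact algorithm, and the function name is illustrative.

```python
import numpy as np

def loo_misclassification(X, y, lam=1e-6):
    """Exact leave-one-out error of a regularized least-squares classifier,
    computed without refitting via the hat-matrix identity
    e_i^loo = e_i / (1 - H_ii).  Labels y are in {-1, +1}."""
    # hat matrix H = X (X'X + lam I)^{-1} X'
    H = X @ np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T)
    resid = y - H @ y                # training residuals
    h = np.diag(H)                   # leverages
    loo_resid = resid / (1.0 - h)    # LOO residuals, no refitting needed
    loo_pred = y - loo_resid         # LOO predictions f_{-i}(x_i)
    return np.mean(np.sign(loo_pred) != y)
```

A single fit thus yields all n leave-one-out predictions, which is what makes direct LOO minimization computationally feasible inside a forward-selection loop.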

Relevance:

10.00%

Publisher:

Abstract:

Measuring pollinator performance has become increasingly important with emerging needs for risk assessment in conservation and sustainable agriculture that require multi-year and multi-site comparisons across studies. However, comparing pollinator performance across studies is difficult because of the diversity of concepts and disparate methods in use. Our review of the literature shows many unresolved ambiguities. Two different assessment concepts predominate: the first estimates stigmatic pollen deposition and the underlying pollinator behaviour parameters, while the second estimates the pollinator’s contribution to plant reproductive success, for example in terms of seed set. Both concepts include a number of parameters combined in diverse ways and named under a diversity of synonyms and homonyms. However, these concepts are overlapping because pollen deposition success is the most frequently used proxy for assessing the pollinator’s contribution to plant reproductive success. We analyse the diverse concepts and methods in the context of a new proposed conceptual framework with a modular approach based on pollen deposition, visit frequency, and contribution to seed set relative to the plant’s maximum female reproductive potential. A system of equations is proposed to optimize the balance between idealised theoretical concepts and practical operational methods. Our framework permits comparisons over a range of floral phenotypes, and spatial and temporal scales, because scaling up is based on the same fundamental unit of analysis, the single visit.

Relevance:

10.00%

Publisher:

Abstract:

A generalized or tunable-kernel model is proposed for probability density function estimation based on an orthogonal forward regression procedure. Each stage of the density estimation process determines a tunable kernel, namely, its center vector and diagonal covariance matrix, by minimizing a leave-one-out test criterion. The kernel mixing weights of the constructed sparse density estimate are finally updated using the multiplicative nonnegative quadratic programming algorithm to enforce the nonnegativity and unit-sum constraints, and this weight-updating process additionally has the desired ability to further reduce the model size. The proposed tunable-kernel model has advantages, in terms of model generalization capability and model sparsity, over the standard fixed-kernel model that restricts kernel centers to the training data points and employs a single common kernel variance for every kernel. On the other hand, it does not optimize all the model parameters together and thus avoids the problems of high-dimensional ill-conditioned nonlinear optimization associated with the conventional finite mixture model. Several examples are included to demonstrate the ability of the proposed tunable-kernel model to construct very compact and accurate density estimates.
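The weight-updating step mentioned above solves a quadratic program over the probability simplex. A minimal sketch of an MNQP-style multiplicative update is given below; it assumes the problem has been reduced to min_w ½wᵀBw − bᵀw with elementwise nonnegative B and b, uses a Lagrange multiplier for the unit-sum constraint, and is not a reproduction of the paper's exact algorithm.

```python
import numpy as np

def mnqp_weights(B, b, n_iter=200, eps=1e-12):
    """Multiplicative update for  min_w 0.5 w'Bw - b'w  s.t. w >= 0, sum(w) = 1.
    Sketch of an MNQP-style re-weighting of kernel mixture components;
    B and b are assumed elementwise nonnegative."""
    m = len(b)
    w = np.full(m, 1.0 / m)          # start from uniform mixing weights
    for _ in range(n_iter):
        c = B @ w + eps              # denominator terms (eps avoids 0/0)
        # Lagrange multiplier chosen so the updated weights sum to one
        h = (1.0 - np.sum(w * b / c)) / np.sum(w / c)
        w = np.maximum(w * (b + h) / c, 0.0)
        w /= w.sum()
    return w
```

The multiplicative form keeps weights nonnegative automatically, and components whose weights are driven to zero drop out of the mixture, which is the model-size-reducing behaviour the abstract describes.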

Relevance:

10.00%

Publisher:

Abstract:

In financial decision-making, a number of mathematical models have been developed for financial management in construction. However, optimizing both qualitative and quantitative factors, and the semi-structured nature of construction finance optimization problems, are key challenges in solving construction finance decisions. The selection of funding schemes is formulated in a modified construction loan acquisition model and solved with an adaptive genetic algorithm (AGA). The basic objectives of the model are to optimize the loan and to minimize the interest payments across all projects. Multiple projects being undertaken by a medium-size construction firm in Hong Kong were used as a real case study to demonstrate the application of the model to borrowing decision problems. A compromise monthly borrowing schedule was finally achieved. The results indicate that the Small and Medium Enterprise (SME) Loan Guarantee Scheme (SGS) was first identified as the source of external financing. Selection of sources of funding can then be made to avoid the possibility of financial problems in the firm by classifying qualitative factors into external, interactive and internal types and by taking additional qualitative factors, including sovereignty, credit ability and networking, into consideration. Thus a more accurate, objective and reliable borrowing decision can be provided for the decision-maker to analyse the financial options.

Relevance:

10.00%

Publisher:

Abstract:

There have been various techniques published for optimizing the net present value of tenders by use of discounted cash flow theory and linear programming. These approaches to tendering appear to have been largely ignored by the industry. This paper utilises six case studies of tendering practice in order to establish the reasons for this apparent disregard. Tendering is demonstrated to be a market-orientated function with many subjective judgements being made regarding a firm's environment. Detailed consideration of 'internal' factors such as cash flow is therefore judged to be unjustified. Systems theory is then drawn upon and applied to the separate processes of estimating and tendering. Estimating is seen as taking place in a relatively sheltered environment and as such operates as a relatively closed system. Tendering, however, takes place in a changing and dynamic environment and as such must operate as a relatively open system. The use of sophisticated methods to optimize the value of tenders is then identified as being dependent upon the assumption of rationality, which is justified in the case of a relatively closed system (i.e. estimating), but not for a relatively open system (i.e. tendering).

Relevance:

10.00%

Publisher:

Abstract:

This chapter considers Multiband Orthogonal Frequency Division Multiplexing (MB-OFDM) modulation and demodulation with the intention of optimizing Ultra-Wideband (UWB) system performance. OFDM is a type of multicarrier modulation and is the most important determinant of MB-OFDM system performance. It can also be implemented as a low-cost digital signal component, efficiently using the Fast Fourier Transform (FFT) algorithm to realize multicarrier orthogonality. Within the MB-OFDM approach, OFDM modulation is employed in each 528 MHz wide band to transmit the data across the different bands, while also using the frequency-hopping technique across different bands. Each parallel bit stream can be mapped onto one of the OFDM subcarriers. Quadrature Phase Shift Keying (QPSK) and Dual Carrier Modulation (DCM) are currently used as the modulation schemes for MB-OFDM in the ECMA-368 defined UWB radio platform. A dual QPSK soft-demapper is suitable for ECMA-368 that exploits the inherent Time-Domain Spreading (TDS) and guard symbol subcarrier diversity to improve the receiver performance, yet merges decoding operations together to minimize hardware and power requirements. There are several methods to demap the DCM: soft bit demapping, Maximum Likelihood (ML) soft bit demapping, and Log Likelihood Ratio (LLR) demapping. The Channel State Information (CSI) aided scheme, coupled with the band-hopping information, is used as a further technique to improve DCM demapping performance. ECMA-368 offers up to 480 Mb/s instantaneous bit rate to the Medium Access Control (MAC) layer, but, depending on radio channel conditions, dropped packets unfortunately result in a lower throughput. An alternative high data rate modulation scheme, termed Dual Circular 32-QAM, fits within the configuration of the current standard and increases system throughput, maintaining a high throughput even with a moderate level of dropped packets.
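The QPSK-onto-subcarriers mapping and FFT-based multicarrier step described above can be sketched in a few lines. This is a generic OFDM round trip for illustration only: it assumes an ideal channel, uses a 128-point FFT (the ECMA-368 size), and omits the standard's zero-padded suffix, spreading, and coding; all function names are illustrative.

```python
import numpy as np

def qpsk_map(bits):
    """Map bit pairs (b0, b1) to unit-energy QPSK symbols."""
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

def ofdm_modulate(symbols, n_fft=128):
    """Place one symbol per subcarrier; IFFT gives the time-domain signal."""
    assert len(symbols) == n_fft
    return np.fft.ifft(symbols) * np.sqrt(n_fft)

def ofdm_demodulate(samples, n_fft=128):
    """FFT recovers the per-subcarrier symbols (ideal channel assumed)."""
    return np.fft.fft(samples) / np.sqrt(n_fft)
```

Because the FFT implements the subcarrier orthogonality, each symbol is recovered independently at the receiver, which is what makes per-subcarrier soft demapping (QPSK or DCM) possible.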

Relevance:

10.00%

Publisher:

Abstract:

The widely-adopted protocol for the cryopreservation of winter buds of fruit trees, such as Malus and Pyrus, was developed in a region with a continental climate, which provides relatively hard winters with a consequent effect on adaptive plant hardiness. In this study the protocol was evaluated in a typical maritime climate (eastern Denmark) where milder winters can be expected. Survival over two winters was evaluated, looking at variation between seasons and cultivars together with the progressive reduction in survival due to individual steps in the protocol. The study confirms that under such conditions significant variation in survival can be expected and that an extended period of imposed dehydration at -4 °C is critical for bud survival. The occurrence of freezing events during this treatment suggests that cryodehydration may be involved, as well as evaporative water loss. To optimize the protocol for maritime environments, further investigation into the water status of the explants during cryopreservation is proposed. Keywords: Malus x domestica, cryopreservation, dormant bud, survival, grafting

Relevance:

10.00%

Publisher:

Abstract:

The distribution of nutrients and assimilates in different organs and tissues is in a constant state of flux throughout the growth and development of a plant. At key stages during the life cycle profound changes occur, and perhaps one of the most critical of these is during seed filling. By restricting the competition for reserves in Arabidopsis plants, we explored the ability to manipulate seed size, seed weight, and seed content. Removal of secondary inflorescences and lateral branches resulted in a stimulation of elongation of the primary inflorescence and an increase in the distance between siliques. The pruning treatment also led to the development of longer and larger siliques that contained fewer, bigger seeds. This seems to be a consequence of a reduction in the number of ovules that develop and an increase in the fatty acid content of the seeds that mature. The data show that shoot architecture can have a substantial impact on the partitioning of reserves between vegetative and reproductive tissues and could be an important trait for selection in rapid phenotyping screens to optimize crop performance.

Relevance:

10.00%

Publisher:

Abstract:

In basic network transactions, a datagram travelling from source to destination is routed through numerous routers and paths depending on which paths are free and uncongested; the resulting transmission route can be long, incurring greater delay, jitter and congestion and reducing throughput. One of the major problems of packet-switched networks is cell delay variation, or jitter, which arises from queuing delay under the applied loading conditions. The accumulation of delay and jitter with the number of nodes along a transmission route, together with dropped packets, adds further complexity for multimedia traffic, because there is no guarantee that each traffic stream will be delivered within its own jitter constraints; the effects of jitter therefore need to be analysed. IP routing uses a single path for the transmission of all packets. Multi-Protocol Label Switching (MPLS), on the other hand, separates packet forwarding from routing, enabling packets to use appropriate routes and allowing the behaviour of transmission paths to be optimized and controlled, thus correcting some of the shortfalls associated with IP routing; MPLS has therefore been utilized in the analysis of effective transmission through the various networks. This paper analyses the effects of delay, congestion, interference, jitter and packet loss in the transmission of signals from source to destination. The impact of link failures and repair paths in the various physical topologies, namely bus, star, mesh and hybrid, is analysed under standard network conditions.
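The jitter quantity discussed above has a standard operational definition: the RFC 3550 (RTP) interarrival jitter is a smoothed estimate of the variation in packet transit times. A minimal sketch, assuming synchronized send/receive timestamps in the same units:

```python
def interarrival_jitter(send_times, recv_times):
    """RFC 3550-style running jitter estimate: an exponentially smoothed
    mean absolute difference of successive packet transit times."""
    j = 0.0
    prev_transit = recv_times[0] - send_times[0]
    for s, r in zip(send_times[1:], recv_times[1:]):
        transit = r - s
        d = transit - prev_transit     # change in transit time
        prev_transit = transit
        j += (abs(d) - j) / 16.0       # gain of 1/16 per the RFC
    return j
```

A constant transit time yields zero jitter, while queuing-induced variation in delay drives the estimate up, which is why jitter is the usual metric for the queuing effects the abstract analyses.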

Relevance:

10.00%

Publisher:

Abstract:

The technique of constructing a transformation, or regrading, of a discrete data set such that the histogram of the transformed data matches a given reference histogram is commonly known as histogram modification. The technique is widely used for image enhancement and normalization. A method which has been previously derived for producing such a regrading is shown to be “best” in the sense that it minimizes the error between the cumulative histogram of the transformed data and that of the given reference function, over all single-valued, monotone, discrete transformations of the data. Techniques for smoothed regrading, which provide a means of balancing the error in matching a given reference histogram against the information lost with respect to a linear transformation are also examined. The smoothed regradings are shown to optimize certain cost functionals. Numerical algorithms for generating the smoothed regradings, which are simple and efficient to implement, are described, and practical applications to the processing of LANDSAT image data are discussed.
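The regrading technique described above can be sketched directly: build the cumulative histograms of the data and of the reference, and map each input grey level to the reference level whose cumulative value is nearest, which yields a single-valued monotone discrete transformation. This is an illustrative implementation of histogram matching in general, not the paper's specific smoothed-regrading algorithms, and the function name is hypothetical.

```python
import numpy as np

def regrade(data, ref_hist):
    """Monotone regrading: map each input grey level to the reference level
    whose cumulative histogram best matches the input's cumulative histogram."""
    levels = len(ref_hist)
    hist = np.bincount(data.ravel(), minlength=levels)
    src_cdf = np.cumsum(hist) / hist.sum()
    ref_cdf = np.cumsum(ref_hist) / np.sum(ref_hist)
    # for each source level, the reference level minimizing the CDF error
    mapping = np.argmin(np.abs(src_cdf[:, None] - ref_cdf[None, :]), axis=1)
    return mapping[data]
```

Minimizing the cumulative-histogram error level by level is what the paper shows to be "best" over all single-valued, monotone, discrete transformations of the data.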

Relevance:

10.00%

Publisher:

Abstract:

The layer-by-layer deposition of polymers onto surfaces allows the fabrication of multilayered materials for a wide range of applications, from drug delivery to biosensors. This work describes the analysis of complex formation between poly(acrylic acid) and methylcellulose in aqueous solutions using Biacore, a surface plasmon resonance analytical technique traditionally used to examine biological interactions. This technique characterized the layer-by-layer deposition of these polymers on the surface of a Biacore sensor chip. The results were subsequently used to optimize the experimental conditions for sequential layer deposition on glass slides. The roles of solution pH and poly(acrylic acid) molecular weight in the formation of interpolymer multilayered coatings were investigated; the optimal deposition of the polymer complexes was achieved at pH ≤ 2.5 with a poly(acrylic acid) molecular weight of 450 kDa.

Relevance:

10.00%

Publisher:

Abstract:

Platinum is one of the most common coatings used to optimize mirror reflectivity in soft X-ray beamlines. Normal operation results in optics contamination by carbon-based molecules present in the residual vacuum of the beamlines. The reflectivity reduction induced by a carbon layer at the mirror surface is a major problem in synchrotron radiation sources. A time-dependent photoelectron spectroscopy study of the chemical reactions which take place at the Pt(111) surface under operating conditions is presented. It is shown that the growth of the carbon contamination layer can be stopped and reversed by low partial pressures of oxygen for optics operated in intense photon beams at liquid-nitrogen temperature. For mirrors operated at room temperature, the carbon contamination observed at equivalent partial pressures of CO is reduced, and the effects of oxygen are observed on a longer time scale.