965 results for Process analysis


Relevance: 30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to show the extent to which clients amend standard form contracts in practice, the locus of the amendments, and how contractors respond to the amendments when putting together a bid. Design/methodology/approach – Four live observational case studies were carried out in two of the top 20 UK construction firms. The whole process used to review the proposed terms and conditions of the contract was shadowed using participant observation, interviews and documentary analysis. Findings – All four cases showed strong evidence of amendments relating mostly to payment and contractual aspects: 83 amendments in Case Study 1 (CS1), 80 in CS2, 15 in CS3 and 29 in CS4. These comprised clauses that were modified (37 per cent), substituted (23 per cent) or deleted (7 per cent), and new additions (33 per cent). Risks inherent in the amendments were mostly addressed through contractual rather than price mechanisms, to reflect commercial imperatives. “Qualifications” and “clarifications” were included in the tender submissions for post-tender negotiations. Thus, the amendments did not necessarily influence price. There was no evidence of a “standard form contract” being used as such, although clients may draw on published standard forms to derive the forms of contract actually used in practice. Practical implications – Contractors should pay attention to clauses relating to contractual and financial aspects when reviewing tender documents. Clients should draft equitable payment and contractual terms and conditions to reduce the risk of disputes. Indeed, it is prudent for clients not to pass on inestimable risks. Originality/value – A better understanding of the extent and locus of amendments in standard form contracts, and how contractors respond, is provided.

Uncertainty plays a major part in the accuracy of a decision-making process, and its inconsistency is difficult to resolve with existing decision-making tools. Entropy has proved useful for evaluating the inconsistency of uncertainty among different respondents. This study demonstrates an entropy-based financial decision support system called e-FDSS. This integrated system provides decision support for evaluating the attributes (funding options and multiple risks) present in projects. Fuzzy logic theory is included in the system to deal with the qualitative aspect of these options and risks. An adaptive genetic algorithm (AGA) is also employed to solve the decision algorithm in the system in order to assign optimal and consistent rates to these attributes. Seven simplified and parallel projects from a Hong Kong construction small and medium enterprise (SME) were assessed to evaluate the system. The results show that the system calculates risk-adjusted discount rates (RADR) for projects in an objective way, and these rates discount project cash flows impartially. Inconsistency of uncertainty is also successfully evaluated by the entropy method. Finally, the system identifies favourable funding options managed under the SME Loan Guarantee Scheme (SGS). Based on these results, resource allocation can be optimized and the best time to start a new project identified across the overall project life cycle.
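
The abstract does not spell out its entropy formulation; as a rough illustration of the underlying idea, Shannon entropy can score how inconsistent a panel of respondents is about a single risk rating. This is a sketch only: the function name and the 1-to-5 rating scale are illustrative, not taken from e-FDSS.

```python
import math
from collections import Counter

def rating_entropy(ratings):
    """Shannon entropy (bits) of a set of respondents' ratings.

    Higher entropy means respondents disagree more (greater inconsistency);
    zero entropy means perfect consensus.
    """
    counts = Counter(ratings)
    n = len(ratings)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Five respondents rate the same project risk on a 1-5 scale.
consensus = rating_entropy([4, 4, 4, 4, 4])      # 0.0 bits: full agreement
disagreement = rating_entropy([1, 2, 3, 4, 5])   # log2(5) bits: maximal spread
```

A risk criterion whose ratings carry high entropy would then be flagged as one where the respondents' judgments are too inconsistent to take at face value.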

A combined mathematical model for predicting heat penetration and microbial inactivation in a solid body heated by conduction was tested experimentally by inoculating agar cylinders with Salmonella typhimurium or Enterococcus faecium and heating them in a water bath. Regions of growth where bacteria had survived after heating were measured by image analysis and compared with model predictions. Visualisation of the regions of growth was improved by incorporating chromogenic metabolic indicators into the agar. Preliminary tests established that the model performed satisfactorily with both test organisms and with cylinders of different diameters. The model was then used in simulation studies in which the parameters D, z, inoculum size, cylinder diameter and heating temperature were systematically varied. These simulations showed that the biological variables D, z and inoculum size had a relatively small effect on the time needed to eliminate bacteria at the cylinder axis, whereas the physical variables heating temperature and cylinder diameter had a much greater relative effect. (c) 2005 Elsevier B.V. All rights reserved.
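
The combined conduction-heating model itself is not reproduced in the abstract, but the D and z parameters it varies come from standard first-order (log-linear) thermal inactivation kinetics, which can be sketched as follows (function names and the example numbers are ours, not the paper's):

```python
def d_value(temp_c, d_ref, t_ref_c, z):
    """D-value (decimal reduction time, min) at temperature temp_c,
    given the D-value d_ref at a reference temperature t_ref_c.
    z is the temperature rise (deg C) that cuts D tenfold."""
    return d_ref * 10 ** ((t_ref_c - temp_c) / z)

def survivors(n0, heating_time, temp_c, d_ref, t_ref_c, z):
    """Log-linear survival: each D-value of heating at a constant
    temperature reduces the population tenfold."""
    return n0 * 10 ** (-heating_time / d_value(temp_c, d_ref, t_ref_c, z))

# Illustrative numbers: if D = 1 min at 60 deg C and z = 5 deg C,
# then at 65 deg C the D-value falls to 0.1 min, so 0.6 min of heating
# gives six log reductions.
remaining = survivors(1e6, 0.6, 65, d_ref=1.0, t_ref_c=60, z=5)
```

In the paper's simulations the temperature at the cylinder axis is itself a function of time (conduction heating), so the log reductions would be integrated over the temperature history rather than evaluated at a single constant temperature as above.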

Background Epidemiological studies suggest that soy consumption contributes to the prevention of coronary heart disease. The proposed anti-atherogenic effects of soy appear to be carried by the soy isoflavones, with genistein as the most abundant compound. Aim of the study To identify proteins or pathways by which genistein might exert its protective activities on atherosclerosis, we analyzed the proteomic response of primary human umbilical vein endothelial cells (HUVEC) that were exposed to the pro-atherosclerotic stressors homocysteine or oxidized low-density lipoprotein (ox-LDL). Methods HUVEC were incubated with physiological concentrations of homocysteine or ox-LDL in the absence and presence of genistein at concentrations that can be reached in human plasma by a diet rich in soy products (2.5 µM) or by pharmacological intervention (25 µM). Proteins from HUVEC were separated by two-dimensional polyacrylamide gel electrophoresis, and those that showed altered expression levels upon genistein treatment were identified by peptide mass fingerprints derived from tryptic digests of the protein spots. Results Several proteins were found to be differentially affected by genistein. The most interesting proteins that were potently decreased by homocysteine treatment were annexin V and lamin A. Annexin V is an antithrombotic molecule, and mutations in nuclear lamin A have been found to result in perturbations of plasma lipids associated with hypertension. Genistein at low and high concentrations reversed the stressor-induced decrease of these anti-atherogenic proteins. Ox-LDL treatment of HUVEC resulted in an increase in ubiquitin conjugating enzyme 12, a protein involved in foam cell formation. Treatment with genistein at both doses reversed this effect.
Conclusions Proteome analysis allows the identification of potential interactions of dietary components in the molecular process of atherosclerosis and consequently provides a powerful tool to define biomarkers of response.

Accumulation of advanced glycation end-products (AGEs) on proteins is associated with the development of diabetic complications. Although the overall extent of modification of protein by AGEs is limited, localization of these modifications at a few critical sites might have a significant effect on protein structure and function. In the present study, we describe the sites of modification of RNase by glyoxal under physiological conditions. Arg39 and Arg85, which are closest to the active site of the enzyme, were identified as the primary sites of formation of the glyoxal-derived dihydroxyimidazolidine and hydroimidazolone adducts. Lower amounts of modification were detected at Arg10, while Arg33 appeared to be unmodified. We conclude that dihydroxyimidazolidine adducts are the primary products of modification of protein by glyoxal, that Arg39 and Arg85 are the primary sites of modification of RNase by glyoxal, and that modification of arginine residues during Maillard reactions of proteins is a highly selective process.

We present a conceptual architecture for a Group Support System (GSS) to facilitate Multi-Organisational Collaborative Groups (MOCGs) initiated by local government and including external organisations of various types. MOCGs consist of individuals from several organisations that have agreed to work together to solve a problem, in the expectation that more can be achieved working in harmony than separately; work is done interdependently rather than independently in diverse directions. Local government, faced with solving complex social problems, deploys MOCGs to enable solutions across organisational, functional, professional and juridical boundaries by involving statutory, voluntary, community, not-for-profit and private organisations. This is not a silver bullet, however, as it introduces new pressures. Each member organisation has its own goals, operating context and particular approaches, which can be expressed as its norms and business processes. Organisations working together must find ways of eliminating differences, or of mitigating their impact, in order to reduce the risks of collaborative inertia and conflict. A GSS is an electronic collaboration system that facilitates group working and can offer assistance to MOCGs. However, many existing GSSs have been developed primarily for single-organisation collaborative groups, and while some issues are common, MOCGs face some difficulties that are peculiar to them, and others that they experience to a greater extent: a diversity of primary organisational goals among members; different funding models and other pressures; more significant differences in other information systems, both technologically and in their use, than within single organisations; and greater variation in acceptable approaches to solving problems.
In this paper, we analyse the requirements of MOCGs led by local government agencies, leading to a conceptual architecture for an e-government GSS that captures the relationships between 'goal', 'context', 'norm', and 'business process'. Our models capture the dynamics of the circumstances surrounding each individual representing an organisation in a MOCG along with the dynamics of the MOCG itself as a separate community.
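
As a rough sketch of the kind of relationships the conceptual architecture captures ('goal', 'context', 'norm', 'business process'), one might model member organisations and a MOCG along the following lines. This is illustrative only; the class and field names are our assumptions, not the paper's architecture.

```python
from dataclasses import dataclass, field

@dataclass
class Organisation:
    """A member organisation with its own goals, norms and processes."""
    name: str
    goals: list
    norms: list
    business_processes: list

@dataclass
class MOCG:
    """A multi-organisational collaborative group, modelled as a
    community distinct from its member organisations."""
    problem: str
    members: list = field(default_factory=list)

    def conflicting_norms(self):
        """Norms held by some but not all members: candidate sources
        of collaborative inertia or conflict."""
        if not self.members:
            return set()
        norm_sets = [set(m.norms) for m in self.members]
        return set.union(*norm_sets) - set.intersection(*norm_sets)
```

A GSS built on such a model could surface the non-shared norms to the group early, before they harden into conflict.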

The overall operation and internal complexity of a particular piece of production machinery can be depicted in terms of clusters of multidimensional points that describe the process states, the value in each point dimension representing a measured variable from the machinery. The paper describes a new cluster analysis technique for use with manufacturing processes, to illustrate how machine behaviour can be categorised and how regions of good and poor machine behaviour can be identified. The cluster algorithm presented is the novel mean-tracking algorithm, capable of locating N-dimensional clusters in a large data space in which a considerable amount of noise is present. Implementation of the algorithm on a real-world high-speed machinery application is described, with clusters being formed from machinery data to indicate machinery error regions and error-free regions. This analysis is seen to provide a promising step forward in the field of multivariable control of manufacturing systems.
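
The mean-tracking algorithm itself is not given in the abstract; the following is a minimal mean-shift-style sketch of the general idea of tracking a cluster mean through noisy N-dimensional data. The function name, radius and stopping rule are our assumptions, not the published algorithm.

```python
import numpy as np

def track_mean(points, start, radius=1.0, max_iter=100, tol=1e-6):
    """Iteratively move a candidate centre to the mean of the points
    within `radius` of it, until it stops moving (a mean-shift-style
    update; the paper's mean-tracking algorithm is related but not
    necessarily identical)."""
    centre = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        near = points[np.linalg.norm(points - centre, axis=1) < radius]
        if len(near) == 0:
            break
        new_centre = near.mean(axis=0)
        if np.linalg.norm(new_centre - centre) < tol:
            break
        centre = new_centre
    return centre

# Two noisy process-state clusters in a 2-D measurement space.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal([0, 0], 0.1, (50, 2)),
                  rng.normal([5, 5], 0.1, (50, 2))])
c = track_mean(data, start=[4.5, 4.5])   # converges near (5, 5)
```

Because the update only averages points inside the local window, distant noise and the other cluster do not pull the centre away, which is the property that makes this family of methods robust in noisy data spaces.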

Background: There is general agreement across all interested parties that a process of working together is the best way to determine which school or educational setting is right for an individual child with autism spectrum disorder. In the UK, families and local authorities both desire a constructive working relationship and see this as the best means by which to reach an agreement on where a child should be educated. However, published work (Batten and colleagues, Make schools make sense. Autism and education: the reality for families today; London: The National Autistic Society, 2006) has shown that a constructive working relationship is not always achieved. Purpose: This small-scale study aims to explore the views of both parents and local authorities, focussing on how both parties perceive and experience the process of determining educational provision for children with autism spectrum disorders (ASD) within an English context. Sample, design and method: Parental opinion was gathered through a questionnaire with closed and open responses. The questionnaire was distributed to two national charities, two local charities and 16 specialist schools, which offered it to parents of children with ASD, resulting in an opportunity sample of 738 returned surveys. The views of local authority personnel from five local authorities were gathered through semi-structured interviews. Data analyses included quantitative analysis of the closed-response questionnaire items, and theme-based qualitative analysis of the open responses and of the interviews with local authority personnel. Results: In the majority of cases, parents in the survey obtained their first-choice placement for their child. Despite this positive outcome, survey data indicated that parents found the process bureaucratic, stressful and time consuming.
Parents tended to perceive alternative placement suggestions as financially motivated rather than in the best interests of the child. Interviews with local authority personnel showed an awareness of these concerns and of the complex considerations involved in determining what is best for an individual child. Conclusions: This small-scale study highlights the need for more effective communication between parents of children with ASD and local authority personnel at all stages of the process.

The method of entropy has proved useful in evaluating inconsistency in human judgments. This paper applies an entropy-based decision support system called e-FDSS to multicriterion risk and decision analysis in projects of construction small and medium enterprises (SMEs). The system is optimized and solved by fuzzy logic, entropy, and genetic algorithms. A case study demonstrated the use of entropy in e-FDSS for analyzing multiple risk criteria in the predevelopment stage of SME projects. Survey data on the degree of impact of selected project risk criteria on different projects were input into the system in order to evaluate the preidentified project risks in an impartial environment. The results showed that, when the amount of uncertainty embedded in the evaluation process is not taken into account, all decision vectors are indeed subject to bias; the system quantifies the deviations of these decisions, providing a more objective decision and risk assessment profile to project stakeholders in order to search for and screen the most profitable projects.

We provide a unified framework for a range of linear transforms that can be used for the analysis of terahertz spectroscopic data, with particular emphasis on their application to the measurement of leaf water content. The use of linear transforms for filtering, regression, and classification is discussed. For illustration, a classification problem involving leaves at three stages of drought and a prediction problem involving simulated spectra are presented. Issues resulting from scaling the data set are discussed. Using Lagrange multipliers, we arrive at the transform that yields the maximum separation between the spectra and show that this optimal transform is equivalent to computing the Euclidean distance between the samples. The optimal linear transform is compared with the average for all the spectra as well as with the Karhunen–Loève transform to discriminate a wet leaf from a dry leaf. We show that taking several principal components into account is equivalent to defining new axes in which data are to be analyzed. The procedure shows that the coefficients of the Karhunen–Loève transform are well suited to the process of classification of spectra. This is in line with expectations, as these coefficients are built from the statistical properties of the data set analyzed.
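
As a minimal illustration of the Karhunen–Loève (principal component) coefficients discussed above: the coefficients are the projections of the mean-centred spectra onto the leading eigenvectors of the sample covariance matrix. The preprocessing and scaling actually used in the paper are not reproduced here.

```python
import numpy as np

def kl_coefficients(spectra, n_components=2):
    """Karhunen-Loeve (PCA) coefficients of a set of spectra.

    spectra: (n_samples, n_frequencies) array. Returns the projection
    of each mean-centred spectrum onto the top eigenvectors of the
    sample covariance matrix."""
    centred = spectra - spectra.mean(axis=0)
    cov = np.cov(centred, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    top = eigvecs[:, ::-1][:, :n_components]    # largest-variance axes first
    return centred @ top
```

Classification of a wet versus a dry leaf can then be done with distances in this low-dimensional coefficient space, which is the sense in which the paper finds the coefficients "well suited to the process of classification of spectra".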

Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can be either because they represent what the author believes the paper is about rather than what it actually is, or because they include keyphrases that are more classificatory than explanatory, e.g. “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases reflecting the document. This paper proposes a solution that examines the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. The primary method explores taking n-grams of the source document phrases and examining their synonyms, while the secondary method groups outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work.
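
A toy sketch of the primary method's two ingredients, n-gram candidate generation and grouping by synonyms, might look as follows. The hand-supplied synonym table here is a stand-in for the paper's synonym analysis, and the function names are illustrative.

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-token sequences of the token list."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def candidate_keyphrases(text, max_n=3, top_k=5, synonyms=None):
    """Rank n-grams by frequency, mapping each phrase through a synonym
    table first so that phrases sharing an underlying theme are counted
    together."""
    synonyms = synonyms or {}
    tokens = text.lower().split()
    counts = Counter()
    for n in range(1, max_n + 1):
        for gram in ngrams(tokens, n):
            counts[synonyms.get(gram, gram)] += 1
    return [phrase for phrase, _ in counts.most_common(top_k)]
```

The point of the synonym mapping is that surface-different phrases (e.g. "data mining" and "knowledge discovery") accumulate a single count, so the underlying theme outranks any one of its surface forms.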

In this paper we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Since the early work by Geltner (1989), many papers have been written on this topic, but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue in more detail we analyse a sample of individual property level appraisal data from the Investment Property Databank (IPD). We find that commonly used unsmoothing estimates overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns.
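
The autoregressive filter correction referred to above is commonly written, in Geltner-style unsmoothing, as r*_t = (r_t - a * r_{t-1}) / (1 - a). A minimal sketch follows; the smoothing parameter a must be estimated from the data (e.g. from the index's first-order autocorrelation), and the value used below is purely illustrative.

```python
def unsmooth(returns, alpha):
    """Reverse a first-order appraisal-smoothing (AR(1)) filter.

    Assumes appraised returns follow r_t = alpha * r_{t-1} + (1 - alpha) * u_t,
    where u_t is the underlying 'true' return. Returns the recovered u_t
    series (one element shorter than the input, since u_0 needs r_{-1})."""
    return [(returns[t] - alpha * returns[t - 1]) / (1 - alpha)
            for t in range(1, len(returns))]
```

The paper's finding can be read in these terms: an alpha estimated from index-level persistence is larger than the alpha actually operating at the individual property level, so index-level unsmoothing exaggerates the correction.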

Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors’ judgements of performance and risk. Real estate is selected, like other assets, on the basis of some criteria: commonly its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor’s objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. Whilst a decision maker may believe in the optimum exposure levels dictated by an asset allocation model, the final decision may well be influenced by factors outside the parameters of the mathematical model. This paper discusses investors’ perceptions of, and attitudes toward, real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify the “soft” parameters in decision making that influence the optimal allocation for the asset class. This “soft” information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo, and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process and investors’ perceptions, based on an historic analysis of market expectations, a comparison with historic data and an analysis of actual performance.

In this article, we investigate the commonly used autoregressive filter method of adjusting appraisal-based real estate returns to correct for the perceived biases induced in the appraisal process. Many articles have been written on appraisal smoothing but remarkably few have considered the relationship between smoothing at the individual property level and the amount of persistence in the aggregate appraisal-based index. To investigate this issue we analyze a large sample of appraisal data at the individual property level from the Investment Property Databank. We find that commonly used unsmoothing estimates at the index level overstate the extent of smoothing that takes place at the individual property level. There is also strong support for an ARFIMA representation of appraisal returns at the index level and an ARMA model at the individual property level.

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.