32 results for order-delivery process
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
There is a lack of a knowledge base relating to experiences gained and lessons learnt from previously executed National Health Service (NHS) infrastructure projects in the UK. This is in part a feature of one-off construction projects, which typify healthcare infrastructure, and in part due to the absence of a suitable method for conveying such information. The complexity of the infrastructure delivery process in the NHS makes the construction of healthcare buildings a formidable task. This is particularly the case for NHS trusts that have little or no experience of construction projects. To facilitate understanding of one of the most important aspects of the delivery process, the preparation of a capital investment proposal, the steps taken in developing the business case for an NHS healthcare facility are examined. The context for this examination is provided by the planning process of a healthcare project, studied retrospectively. The process is analysed using a social science based method called ‘building stories’, developed at the University of California-Berkeley. By applying this method, stories or narratives are constructed around the data captured in the case study. The findings indicate that the business case process may be used to justify, rather than identify, trusts’ requirements. The study is useful for UK public sector clients as well as consultants and professionals who aim to participate in the delivery of healthcare infrastructure projects in the UK.
Abstract:
Many currently available drugs show unfavourable physicochemical properties for delivery into or across the skin and temporary chemical modulation of the penetrant is one option to achieve improved delivery properties. Pro-drugs are chemical derivatives of an active drug which is covalently bonded to an inactive pro-moiety in order to overcome pharmaceutical and pharmacokinetic barriers. A pro-drug relies upon conversion within the body to release the parent active drug (and pro-moiety) to elicit its pharmacological effect. The main drawback of this approach is that the pro-moiety is essentially an unwanted ballast which, when released, can lead to adverse effects. The term ‘co-drug’ refers to two or more therapeutic compounds active against the same disease bonded via a covalent chemical linkage and it is this approach which is reviewed for the first time in the current article. For topically applied co-drugs, each moiety is liberated in situ, either chemically or enzymatically, once the stratum corneum barrier has been overcome by the co-drug. Advantages include synergistic modulation of the disease process, enhancement of drug delivery and pharmacokinetic properties and the potential to enhance stability by masking of labile functional groups. The amount of published work on co-drugs is limited but the available data suggest the co-drug concept could provide a significant therapeutic improvement in dermatological diseases. However, the applicability of the co-drug approach is subject to strict limitations pertaining mainly to the availability of compatible moieties and physicochemical properties of the overall molecule.
Abstract:
The frequency of persistent atmospheric blocking events in the 40-yr ECMWF Re-Analysis (ERA-40) is compared with the blocking frequency produced by a simple first-order Markov model designed to predict the time evolution of a blocking index [defined by the meridional contrast of potential temperature on the 2-PVU surface (1 PVU ≡ 1 × 10⁻⁶ K m² kg⁻¹ s⁻¹)]. With the observed spatial coherence built into the model, it is able to reproduce the main regions of blocking occurrence and the frequencies of sector blocking very well. This underlines the importance of the climatological background flow in determining the locations of high blocking occurrence as being the regions where the mean midlatitude meridional potential vorticity (PV) gradient is weak. However, when only persistent blocking episodes are considered, the model is unable to simulate the observed frequencies. It is proposed that this persistence beyond that given by a red noise model is due to the self-sustaining nature of the blocking phenomenon.
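The red-noise benchmark described here can be sketched as a first-order autoregressive (Markov) process. The sketch below is illustrative only: the lag-1 autocorrelation, noise amplitude, blocking threshold and five-day persistence criterion are assumed values, not those fitted to ERA-40 in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# First-order Markov (AR(1), i.e. red-noise) model of a blocking index:
#   B(t) = phi * B(t-1) + eps(t)
# phi, sigma and the threshold below are placeholder values.
phi, sigma, n_days = 0.8, 1.0, 100_000
b = np.zeros(n_days)
eps = rng.normal(0.0, sigma, n_days)
for t in range(1, n_days):
    b[t] = phi * b[t - 1] + eps[t]

# A day counts as "blocked" when the index drops below the threshold
# (weak or reversed meridional potential-temperature contrast).
blocked = b < -1.5
freq = blocked.mean()

# Run-length encoding of blocked spells; an episode is "persistent"
# if it lasts at least five consecutive days.
arr = np.r_[0, blocked.astype(int), 0]
edges = np.flatnonzero(np.diff(arr))
durations = edges[1::2] - edges[::2]
persistent = int((durations >= 5).sum())
print(f"blocked-day frequency: {freq:.3f}, persistent episodes: {persistent}")
```

A comparison in the spirit of the abstract would contrast the persistent-episode count of such a model with the observed count; the abstract's finding is that a red-noise model of this kind under-produces persistent episodes.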
Abstract:
The research record on the quantification of sediment transport processes in periglacial mountain environments in Scandinavia dates back to the 1950s. A wide range of measurements is available, especially from the Karkevagge region of northern Sweden. Within this paper satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve this research and to complement geophysical methods. The processes of interest include mass movements such as solifluction, slope wash, dirty avalanches and rock- and boulder falls. Geomorphic process units have been derived in order to allow quantification via GIS techniques at a catchment scale. Mass movement rates based on existing field measurements are employed in the budget calculation. In the Karkevagge catchment, 80% of the area can be identified either as a source area for sediments or as a zone where sediments are deposited. The overall budget for the slopes beneath the rockwalls in the Karkevagge is approximately 680 t a⁻¹, whilst about 150 t a⁻¹ are transported into the fluvial system.
Abstract:
The delineation of Geomorphic Process Units (GPUs) aims to quantify past, current and future geomorphological processes and the sediment flux associated with them. Five GPUs have been identified for the Okstindan area of northern Norway and these were derived from the combination of Landsat satellite imagery (TM and ETM+) with stereo aerial photographs (used to construct a Digital Elevation Model) and ground survey. The Okstindan study area is sub-arctic and mountainous and is dominated by glacial and periglacial processes. The GPUs exclude the glacial system (some 37% of the study area) and hence they are focussed upon periglacial and colluvial processes. The identified GPUs are: 1. solifluction and rill erosion; 2. talus creep, slope wash and rill erosion; 3. accumulation of debris by rock and boulder fall; 4. rockwalls; and 5. stable ground with dissolved transport. The GPUs have been applied to a ‘test site’ within the study area in order to illustrate their potential for mapping the spatial distribution of geomorphological processes. The test site within the study area is a catchment which is representative of the range of geomorphological processes identified.
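The budget arithmetic implied by this approach (a process rate scaled by the mapped area of each unit) can be sketched as follows. The unit names echo the GPUs listed above, but the areas and process rates are invented for illustration and are not the paper's values.

```python
# Hypothetical sediment-budget calculation per geomorphic process unit
# (GPU): budget [t/a] = mapped area [m^2] * process rate [t m^-2 a^-1].
# Areas and rates are illustrative placeholders only.
gpu_area_m2 = {
    "solifluction_rill": 1.2e6,
    "talus_creep_wash": 0.8e6,
    "rockfall_accumulation": 0.5e6,
}
rate_t_per_m2_a = {
    "solifluction_rill": 2.0e-4,
    "talus_creep_wash": 3.5e-4,
    "rockfall_accumulation": 4.0e-4,
}

budget = {u: gpu_area_m2[u] * rate_t_per_m2_a[u] for u in gpu_area_m2}
total = sum(budget.values())
for unit, t in budget.items():
    print(f"{unit:24s} {t:8.1f} t/a")
print(f"{'total':24s} {total:8.1f} t/a")
```

In a GIS workflow the areas would come from classified imagery per unit and the rates from the field measurements, rather than being hard-coded.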
Abstract:
Planning is a vital element of project management but it is still not recognized as a process variable. Its objective should be to outperform the initially defined processes, and foresee and overcome possible undesirable events. Detailed task-level master planning is unrealistic since one cannot accurately predict all the requirements and obstacles before work has even started. The process planning methodology (PPM) has thus been developed in order to overcome common problems of the overwhelming project complexity. The essential elements of the PPM are the process planning group (PPG), including a control team that dynamically links the production/site and management, and the planning algorithm embodied within two continuous-improvement loops. The methodology was tested on a factory project in Slovenia and in four successive projects of a similar nature. In addition to a number of improvement ideas and enhanced communication, the applied PPM resulted in 32% higher total productivity, 6% total savings and created a synergistic project environment.
Abstract:
In the tender process, contractors often rely on subcontract and supply enquiries to calculate their bid prices. However, this integral part of the bidding process is not empirically articulated in the literature. Over 30 published materials on the tendering process of contractors that discuss enquiries were reviewed and found to be based mainly on experiential knowledge rather than systematic evidence. The empirical research here helps to describe the process of enquiries precisely, improve it in practice, and have some basis to support it in theory. Using a live participant observation case study approach, the whole tender process was shadowed in the offices of two of the top 20 UK civil engineering construction firms. This helped to investigate 15 research questions on how contractors enquire and obtain prices from subcontractors and suppliers. Forty-three subcontract enquiries and 18 supply enquiries were made across two different projects with an average value of 7m. An average of 15 subcontract packages and seven supply packages was involved. Thus, two or three subcontractors or suppliers were invited to bid in each package. All enquiries were formulated by the estimator, with occasional involvement of three other personnel. Most subcontract prices were received in an average of 14 working days; supply prices took five days. The findings show 10 main activities involved in processing enquiries and their durations, as well as wasteful practices associated with enquiries. Contractors should limit their enquiry invitations to a maximum of three per package, and optimize the waiting time for quotations in order to improve cost efficiency.
Abstract:
Managing a construction project supply chain effectively and efficiently is extremely difficult due to the involvement of numerous sectors supported by ineffective communication systems. An efficient construction supply chain system ensures the delivery of materials and other services to the construction site while minimising costs and rewarding all sectors based on the value added to the supply chain. The advancement of information, communication and wireless technologies is driving construction companies to deploy supply chain management strategies to seek better outputs. As part of the emerging wireless technologies, context-aware computing capability represents the next generation of ICT for construction services. Conceptually, context-awareness could be integrated with Web Services in order to ensure the delivery of pertinent information to the construction site and enhance construction supply chain collaboration. An initial study has indicated that this integrated system has the potential to serve and improve construction services delivery through access to context-specific data, information and services on an as-needed basis.
Abstract:
Elevated levels of low-density-lipoprotein cholesterol (LDL-C) in the plasma are a well-established risk factor for the development of coronary heart disease. Plasma LDL-C levels are in part determined by the rate at which LDL particles are removed from the bloodstream by hepatic uptake. The uptake of LDL by mammalian liver cells occurs mainly via receptor-mediated endocytosis, a process which entails the binding of these particles to specific receptors in specialised areas of the cell surface, the subsequent internalization of the receptor-lipoprotein complex, and ultimately the degradation and release of the ingested lipoproteins' constituent parts. We formulate a mathematical model to study the binding and internalization (endocytosis) of LDL and VLDL particles by hepatocytes in culture. The system of ordinary differential equations, which includes a cholesterol-dependent pit production term representing feedback regulation of surface receptors in response to intracellular cholesterol levels, is analysed using numerical simulations and steady-state analysis. Our numerical results show good agreement with in vitro experimental data describing LDL uptake by cultured hepatocytes following delivery of a single bolus of lipoprotein. Our model is adapted in order to reflect the in vivo situation, in which lipoproteins are continuously delivered to the hepatocyte. In this case, our model suggests that the competition between the LDL and VLDL particles for binding to the pits on the cell surface affects the intracellular cholesterol concentration. In particular, we predict that when there is continuous delivery of low levels of lipoproteins to the cell surface, more VLDL than LDL occupies the pit, since VLDL are better competitors for receptor binding. 
VLDL have a cholesterol content comparable to LDL particles; however, due to the larger size of VLDL, one pit-bound VLDL particle blocks binding of several LDLs, and there is a resultant drop in the intracellular cholesterol level. When there is continuous delivery of lipoprotein at high levels to the hepatocytes, VLDL particles still out-compete LDL particles for receptor binding, and consequently more VLDL than LDL particles occupy the pit. Although the maximum intracellular cholesterol level is similar for high and low levels of lipoprotein delivery, the maximum is reached more rapidly when the lipoprotein delivery rates are high. The implications of these results for the design of in vitro experiments are discussed.
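The competition mechanism described above can be sketched as a minimal pair of mass-action ODEs for LDL and VLDL binding to a shared pit capacity. This is not the paper's full model (there is no cholesterol-dependent pit production or internal cholesterol pool here), and every rate constant is an illustrative assumption; the sketch only shows how a better-competing, larger VLDL particle comes to dominate pit occupancy.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not fitted values):
kL, kV = 1.0, 3.0   # binding rate constants; VLDL is the better competitor
kint = 0.2          # internalisation rate of bound particles
s = 4.0             # pit capacity blocked per VLDL relative to one LDL
R0 = 1.0            # total pit capacity on the cell surface

def rhs(t, y, L_ext, V_ext):
    bL, bV = y                        # bound LDL, bound VLDL
    free = max(R0 - bL - s * bV, 0.0) # unoccupied pit capacity
    dbL = kL * L_ext * free - kint * bL
    dbV = kV * V_ext * free - kint * bV
    return [dbL, dbV]

# Continuous, equal external delivery of both lipoproteins.
sol = solve_ivp(rhs, (0.0, 50.0), [0.0, 0.0], args=(0.5, 0.5))
bL, bV = sol.y[:, -1]
print(f"near steady state: bound LDL = {bL:.3f}, bound VLDL = {bV:.3f}")
```

Even with equal delivery rates, the faster-binding VLDL occupies more of the pit, and each bound VLDL additionally blocks s units of capacity that LDL could otherwise have used.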
Abstract:
Olive fruits of three of the most important Spanish and Italian cultivars, 'Picual', 'Hojiblanca' and 'Frantoio', were harvested at bi-weekly intervals during three crop seasons to study their development and ripening process. Fresh and dry weights and ripening index were determined for fruit, while dry matter, oil and moisture contents were determined in both fruit and pulp (flesh). Fruit growth rate and oil accumulation were calculated. Each olive cultivar showed a different ripening pattern, 'Hojiblanca' being the last to mature. Fruit weight increased, although its growth rate declined from the middle of November. Dry matter and moisture contents decreased during ripening in pulp and fruit, with 'Hojiblanca' showing the highest values for both. Oil content, when expressed on a fresh weight basis, increased in all cultivars, although in the last period it showed variations due to climatic conditions. During ripening, oil content on a dry weight basis increased in fruit, but oil biosynthesis in the flesh ceased from November. Olive fruits presented lower oil and higher dry matter contents in the year of lowest rainfall. Therefore fruit harvesting should be carried out from the middle of November in order to obtain the highest oil yield and avoid natural fruit drop. (C) 2004 Society of Chemical Industry.
Abstract:
This paper describes a framework architecture for the automated re-purposing and efficient delivery of multimedia content stored in CMSs. It deploys specifically designed templates as well as adaptation rules based on a hierarchy of profiles to accommodate user, device and network requirements invoked as constraints in the adaptation process. The user profile provides information in accordance with the opt-in principle, while the device and network profiles provide operational constraints such as resolution and bandwidth limitations. The profile hierarchy ensures that the adaptation privileges the users' preferences. As part of the adaptation, we took into account support for users' special needs, and therefore adopted a template-based approach that could simplify the adaptation process by integrating accessibility-by-design in the template.
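The hierarchy described here, in which device and network profiles impose hard operational constraints while user preferences take precedence within them, could be sketched as follows. The field names and resolution logic are hypothetical, not the framework's actual schema.

```python
# Hedged sketch of hierarchical profile resolution: device and network
# profiles supply hard operational constraints; the user preference wins
# wherever it does not violate them. All field names are hypothetical.
def resolve_adaptation(user, device, network):
    max_width = device["max_width"]
    max_bitrate = network["bandwidth_kbps"]
    # User preference takes priority, clamped to operational limits.
    width = min(user.get("preferred_width", max_width), max_width)
    bitrate = min(user.get("preferred_bitrate_kbps", max_bitrate), max_bitrate)
    # Accessibility preferences pass through untouched (opt-in principle).
    return {"width": width,
            "bitrate_kbps": bitrate,
            "captions": user.get("captions", False)}

profile = resolve_adaptation(
    user={"preferred_width": 1920, "captions": True},
    device={"max_width": 1280},
    network={"bandwidth_kbps": 800},
)
print(profile)  # width clamped to 1280, bitrate to 800, captions on
```

The ordering matters: constraints are applied last so that no user preference can exceed what the device or network can actually handle.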
Abstract:
This paper presents the on-going research performed in order to integrate process automation and process management support in the context of media production. This has been addressed on the basis of a holistic approach to software engineering applied to media production modelling to ensure design correctness, completeness and effectiveness. The focus of the research and development has been to enhance the metadata management throughout the process in a similar fashion to that achieved in Decision Support Systems (DSS) to facilitate well-grounded business decisions. The paper sets out the aims and objectives and the methodology deployed. The paper describes the solution in some detail and sets out some preliminary conclusions and the planned future work.
Abstract:
The complexity of construction projects and the fragmentation of the construction industry undertaking those projects has effectively resulted in linear, uncoordinated and highly variable project processes in the UK construction sector. Research undertaken at the University of Salford resulted in the development of an improved project process, the Process Protocol, which considers the whole lifecycle of a construction project whilst integrating its participants under a common framework. The Process Protocol identifies the various phases of a construction project with particular emphasis on what is described in the manufacturing industry as the ‘fuzzy front end’. The participants in the process are described in terms of the activities that need to be undertaken in order to achieve a successful project and process execution. In addition, the decision-making mechanisms, from a client perspective, are illustrated and the foundations for a learning organization/industry are facilitated within a consistent Process Protocol.
Abstract:
This paper describes the novel use of cluster analysis in the field of industrial process control. The severe multivariable process problems encountered in manufacturing have often led to machine shutdowns, where the need for corrective actions arises in order to resume operation. Production faults which are caused by processes running in less efficient regions may be prevented or diagnosed using reasoning based on cluster analysis. Indeed the internal complexity of production machinery may be depicted in clusters of multidimensional data points which characterise the manufacturing process. The application of a Mean-Tracking cluster algorithm (developed in Reading) to field data acquired from high-speed machinery will be discussed. The objective of such an application is to illustrate how machine behaviour can be studied, in particular how regions of erroneous and stable running behaviour can be identified.
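The Mean-Tracking algorithm itself is not reproduced here. As a stand-in, the sketch below runs a plain k-means (Lloyd's algorithm) on synthetic two-regime process data to illustrate the idea of separating stable from erroneous running behaviour via clusters of multidimensional operating points; the data and cluster count are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-regime process data: a dense stable-operation cluster
# and a sparser drifted (faulty) regime.
stable = rng.normal([0.0, 0.0], 0.3, size=(200, 2))
faulty = rng.normal([3.0, 3.0], 0.3, size=(40, 2))
X = np.vstack([stable, faulty])

# Minimal k-means (k = 2), seeded from two well-separated data points.
centres = X[[0, -1]].copy()
for _ in range(20):
    d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    centres = np.array([X[labels == k].mean(axis=0) for k in range(2)])

# The sparser cluster flags the candidate erroneous running region.
sizes = np.bincount(labels, minlength=2)
fault_cluster = int(sizes.argmin())
print("fault-region centre:", centres[fault_cluster].round(2))
```

In a monitoring setting, new operating points falling into (or drifting toward) the fault-region cluster would trigger diagnosis before a shutdown becomes necessary.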
Abstract:
B. subtilis can, under certain media and fermentation conditions, produce surfactin, a biosurfactant belonging to the lipopeptide class. Surfactin has exceptional surfactant activity and exhibits some interesting biological characteristics, such as antibacterial activity, antitumoral activity against ascites carcinoma cells and a hypocholesterolemic activity that inhibits cAMP phosphodiesterase, as well as having anti-HIV properties. A cost-effective recovery and purification of surfactin from fermentation broth using a two-step ultrafiltration (UF) process has been developed in order to reduce the cost of surfactin production. In this study, competitive adsorption of surfactin and proteins at the air-water interface was studied using surface pressure measurements. Small volumes of bovine serum albumin (BSA) and β-casein solutions were added to the air-water interface on a Langmuir trough and allowed to stabilise before the addition of surfactin to the subphase. Contrasting interfacial behaviour of the proteins was observed, with β-casein showing faster initial adsorption than BSA. On introduction of surfactin both proteins were displaced, but a longer time was required to displace β-casein. Overall the results showed that surfactin was highly surface-active, forming a β-sheet structure at the air-water interface after reaching its critical micelle concentration (CMC), and was effective in removing both protein films, which can be explained by the orogenic mechanism. The results also showed that the two-step UF process was effective in achieving high-purity, fully functional surfactin.