928 results for critical path methods
Abstract:
Cover title.
Abstract:
Cover title.
Abstract:
Accompanied by "Supplement no.1- to DOD and NASA guide: PERT COST: output reports [by] PERT Coordinating Group." (v.) Published: [Washington, For sale by the Supt. of Docs., U.S. Govt. Print. Off.] 1963-
Abstract:
Bibliography: p. 203-206.
Abstract:
This thesis describes the procedure and results from four years' research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and location, selection of contractor, contract sum, height and site condition factors. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From these data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
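As an illustration of the Monte Carlo approach described in this abstract, the sketch below samples component costs from assumed constant, uniform, normal and beta distributions and summarizes the resulting total-cost distribution. The components, parameters and units are illustrative assumptions, not data from the thesis or the VERT program.

    # Minimal Monte Carlo sketch of a pre-tender cost model in the spirit of the
    # abstract above. Components, distributions and parameters are illustrative
    # assumptions, not data from the thesis or the VERT program.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 10_000  # number of simulation trials

    # Hypothetical services elements with assumed cost distributions (cost units arbitrary).
    def sample_totals(n):
        heating = rng.normal(loc=120.0, scale=15.0, size=n)       # normal
        electrical = rng.uniform(low=80.0, high=140.0, size=n)    # uniform
        lifts = 60.0 + 40.0 * rng.beta(a=2.0, b=5.0, size=n)      # scaled beta
        fire_protection = np.full(n, 25.0)                        # constant
        return heating + electrical + lifts + fire_protection

    totals = sample_totals(N)

    # Summary of the simulated total services cost distribution.
    p10, p50, p90 = np.percentile(totals, [10, 50, 90])
    print(f"mean total cost: {totals.mean():.1f}")
    print(f"P10 / P50 / P90: {p10:.1f} / {p50:.1f} / {p90:.1f}")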
Abstract:
Parkinson's disease (PD) is a complex, heterogeneous disorder with an urgent need for disease-modifying therapies. Progress in successful therapeutic approaches for PD will require an unprecedented level of collaboration. At a workshop hosted by Parkinson's UK and co-organized by the Critical Path Institute's (C-Path) Coalition Against Major Diseases (CAMD) Consortium, investigators from industry, academia, government and regulatory agencies agreed on the need for sharing of data to enable future success. Government agencies included EMA, FDA, NINDS/NIH and IMI (Innovative Medicines Initiative). Emerging discoveries in new biomarkers and genetic endophenotypes are contributing to our understanding of the underlying pathophysiology of PD. In parallel, there is growing recognition that early intervention will be key for successful treatments aimed at disease modification. At present, there is a lack of comprehensive understanding of disease progression and of the many factors that contribute to its heterogeneity. Novel therapeutic targets and trial designs that incorporate existing and new biomarkers to evaluate drug effects independently and in combination are required. The integration of robust clinical data sets is viewed as a powerful approach to hasten medical discovery and therapies, as is being realized across diverse disease conditions employing big data analytics for healthcare. The application of lessons learned from parallel efforts is critical to identify barriers and enable a viable path forward. A roadmap is presented for a regulatory, academic, industry and advocacy-driven integrated initiative that aims to facilitate and streamline new drug trials and registrations in Parkinson's disease.
Abstract:
A PERT-type system, a combination of the program evaluation and review technique (PERT) and the critical path method (CPM), might be used by the hospitality industry to improve planning and control of complex functions. The author discusses this management science technique and how it can assist.
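For readers unfamiliar with how PERT three-point estimates feed a CPM forward pass, the short sketch below computes expected durations and traces a critical path over a made-up event-planning network; the activities and estimates are hypothetical, not taken from the article.

    # Illustrative PERT/CPM sketch: PERT expected durations (optimistic, most likely,
    # pessimistic) feed a CPM forward pass to find the critical path.
    # The activity network below is a made-up example, not from the article.

    activities = {          # name: (optimistic, most likely, pessimistic, predecessors)
        "menu_design":  (2, 4, 6,  []),
        "purchasing":   (1, 2, 4,  ["menu_design"]),
        "staffing":     (3, 5, 9,  []),
        "room_setup":   (1, 1, 2,  ["purchasing", "staffing"]),
        "event":        (1, 1, 1,  ["room_setup"]),
    }

    # PERT expected duration: (a + 4m + b) / 6
    dur = {k: (a + 4 * m + b) / 6 for k, (a, m, b, _) in activities.items()}

    # Forward pass: earliest finish of each activity (the network is acyclic,
    # so simple memoized recursion is enough).
    earliest_finish = {}
    def ef(act):
        if act not in earliest_finish:
            preds = activities[act][3]
            earliest_finish[act] = dur[act] + max((ef(p) for p in preds), default=0.0)
        return earliest_finish[act]

    project_duration = max(ef(a) for a in activities)

    # Trace one critical path backwards: at each step pick the predecessor whose
    # earliest finish determines the activity's earliest start.
    def critical_path():
        path = [max(activities, key=ef)]
        while activities[path[-1]][3]:
            path.append(max(activities[path[-1]][3], key=ef))
        return list(reversed(path))

    print("expected project duration:", round(project_duration, 2))
    print("critical path:", " -> ".join(critical_path()))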
Abstract:
This report concerns the stabilization of three crushed limestones by an SS-1 asphalt emulsion and an asphalt cement, 120-150 penetration. Stabilization is evaluated by Marshall stability and triaxial shear tests. Test specimens were compacted by the Marshall, standard Proctor and vibratory methods. Stabilization is evaluated primarily by triaxial shear tests in which confining pressures of 0 to 80 psi were used. Data were obtained on the angle of internal friction, cohesion, volume change, pore water pressure and strain characteristics of the treated and untreated aggregates. The Mohr envelope, Bureau of Reclamation and modified stress path methods were used to determine shear strength parameters at failure. Several significant conclusions developed by the authors are as follows: (1) the values for effective angle of internal friction and effective cohesion were substantially independent of asphalt content, (2) straight-line Mohr envelopes of failure were observed for all treated stones, (3) bituminous admixtures did little to improve volume change (deformation due to load) characteristics of the three crushed limestones, (4) with respect to pore water characteristics (pore pressures and suctions due to lateral loading), bituminous treatment notably improved only the Bedford stone, and (5) at low lateral pressures bituminous treatments increased stability by limiting axial strain. This would reduce rutting of highway bases. At high lateral pressures treated stone was less stable than untreated stone.
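The straight-line Mohr envelopes mentioned above correspond to the Mohr-Coulomb relation tau = c + sigma * tan(phi); the sketch below fits that line to hypothetical failure-stress pairs to recover cohesion and friction angle. The stress values are made up for illustration, not data from the report.

    # Illustrative sketch of fitting a straight-line Mohr-Coulomb failure envelope,
    # tau = c + sigma * tan(phi), to shear/normal stress pairs at failure.
    # The stress values below are made up; they are not data from the report.
    import math
    import numpy as np

    sigma = np.array([10.0, 20.0, 40.0, 80.0])   # normal stress at failure, psi
    tau   = np.array([14.5, 21.0, 34.5, 61.0])   # shear stress at failure, psi

    # Linear least-squares fit: slope = tan(phi), intercept = cohesion c.
    slope, cohesion = np.polyfit(sigma, tau, 1)
    phi_deg = math.degrees(math.atan(slope))

    print(f"cohesion c = {cohesion:.1f} psi, friction angle phi = {phi_deg:.1f} deg")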
Abstract:
In the present study we elaborated algorithms, using concepts from percolation theory, which analyze the connectivity conditions in geological models of petroleum reservoirs. From petrophysical parameters such as permeability, porosity, transmissivity and others, which may be generated by any statistical process, it is possible to determine the portion of the model with the most connected cells, which wells are interconnected, and the critical path between injector and source wells. This allows the reservoir to be classified according to the modeled petrophysical parameters. It also makes it possible to determine the percentage of the reservoir to which each well is connected. Generally, the connected regions and the respective minima and/or maxima in the occurrence of the petrophysical parameters studied constitute a good way to characterize a reservoir volumetrically. Therefore, the algorithms make it possible to optimize the positioning of wells, offering a preview of the general connectivity conditions of the given model. The intent is not to evaluate geological models, but to show how to interpret the deposits, how their petrophysical characteristics are spatially distributed, and how the connections between the several parts of the system are resolved, showing their critical paths and backbones. The execution of these algorithms allows us to know the properties of the model's connectivity before work on reservoir flow simulation is started.
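The connectivity analysis described above can be approximated, in spirit, by a cluster-labeling pass over a thresholded permeability grid. The sketch below is a minimal illustration with a synthetic field, an assumed cutoff and assumed well positions; it is not the authors' algorithm.

    # Minimal percolation-style connectivity sketch: cells whose permeability exceeds
    # a cutoff are "open"; flood fill labels connected clusters, and two wells are
    # connected if their cells share a label. The grid, cutoff and well locations
    # are illustrative assumptions, not the paper's reservoir model.
    from collections import deque
    import numpy as np

    rng = np.random.default_rng(0)
    perm = rng.lognormal(mean=3.0, sigma=1.0, size=(30, 30))   # synthetic permeability field
    open_cell = perm > np.percentile(perm, 40)                 # keep the best 60% of cells

    labels = np.zeros(open_cell.shape, dtype=int)

    def flood_fill(start, label):
        """Label every open cell 4-connected to `start`."""
        q = deque([start])
        labels[start] = label
        while q:
            i, j = q.popleft()
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < labels.shape[0] and 0 <= nj < labels.shape[1] \
                        and open_cell[ni, nj] and labels[ni, nj] == 0:
                    labels[ni, nj] = label
                    q.append((ni, nj))

    next_label = 0
    for i in range(labels.shape[0]):
        for j in range(labels.shape[1]):
            if open_cell[i, j] and labels[i, j] == 0:
                next_label += 1
                flood_fill((i, j), next_label)

    injector, producer = (2, 2), (27, 27)                      # assumed well positions
    connected = labels[injector] != 0 and labels[injector] == labels[producer]
    cluster_fraction = (labels == labels[injector]).mean() if labels[injector] else 0.0
    print(f"wells connected: {connected}, injector cluster covers {cluster_fraction:.0%} of grid")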
Abstract:
High energy efficiency and high performance are the key requirements for Internet of Things (IoT) end-nodes. Exploiting clusters of multiple programmable processors has recently emerged as a suitable solution to address this challenge. However, one of the main bottlenecks for multi-core architectures is the instruction cache. While private caches suffer from data replication and waste area, fully shared caches lack scalability and form a bottleneck for the operating frequency. Hence, we propose a hybrid solution where a larger shared cache (L1.5) is shared by multiple cores connected through a low-latency interconnect to small private caches (L1). However, this solution is still limited by capacity misses when the L1 is small. Thus, we propose a sequential prefetch from L1 to L1.5 to improve performance with little area overhead. Moreover, to cut the critical path for better timing, we optimized the core instruction fetch stage with non-blocking transfers by adopting a 4 x 32-bit ring buffer FIFO and adding a pipeline stage for conditional branches. We present a detailed comparison of the performance and energy efficiency of different instruction cache architectures recently proposed for Parallel Ultra-Low-Power clusters. On average, when executing a set of real-life IoT applications, our two-level cache improves performance by up to 20% and loses 7% in energy efficiency with respect to the private cache. Compared to a shared cache system, it improves performance by up to 17% and keeps the same energy efficiency. Finally, an up to 20% timing (maximum frequency) improvement and software control enable the two-level instruction cache with prefetch to adapt to various battery-powered use cases, balancing high performance and energy efficiency.
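To make the fetch-path changes more concrete, the sketch below gives a behavioral (non-RTL) model of a small ring-buffer FIFO like the 4 x 32-bit buffer mentioned above, together with a toy next-line decision in the spirit of sequential prefetch. Depth, line size and policy details are assumptions for illustration only.

    # Behavioral sketch (not RTL) of a small ring-buffer FIFO like the 4 x 32-bit
    # buffer mentioned above, plus a toy next-line ("sequential") prefetch decision.
    # Depth, width and the prefetch policy are assumptions for illustration only.

    class RingFIFO:
        def __init__(self, depth=4):
            self.buf = [0] * depth
            self.depth = depth
            self.rd = self.wr = self.count = 0

        def push(self, word):
            """Write one 32-bit word; returns False when the FIFO is full."""
            if self.count == self.depth:
                return False
            self.buf[self.wr] = word & 0xFFFFFFFF
            self.wr = (self.wr + 1) % self.depth
            self.count += 1
            return True

        def pop(self):
            """Read the oldest word; returns None when the FIFO is empty."""
            if self.count == 0:
                return None
            word = self.buf[self.rd]
            self.rd = (self.rd + 1) % self.depth
            self.count -= 1
            return word

    def next_line_prefetch(miss_addr, line_bytes=16):
        """Sequential prefetch: on a miss, also request the following cache line."""
        return (miss_addr // line_bytes + 1) * line_bytes

    fifo = RingFIFO()
    for instr in (0xDEADBEEF, 0x00000013, 0x12345678):
        fifo.push(instr)
    print(hex(fifo.pop()), "prefetch line at", hex(next_line_prefetch(0x1004)))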
Abstract:
Technical evaluation of analytical data is extremely relevant, since the data can be compared with environmental quality standards and used in decision-making related to the management of the disposal of dredged sediments and the evaluation of salt and brackish water quality in accordance with CONAMA Resolution 357/05. It is, therefore, essential that the project manager discuss the environmental agency's technical requirements with the contracted laboratory, both to follow up on the analyses underway and with a view to possible re-analysis when anomalous data are identified. The main technical requirements are: (1) method quantitation limits (QLs) should fall below environmental standards; (2) analyses should be carried out in laboratories whose analytical scope is accredited by the National Institute of Metrology (INMETRO) or qualified or accepted by a licensing agency; (3) a chain of custody should be provided in order to ensure sample traceability; (4) control charts should be provided to prove method performance; (5) certified reference material analysis or, if that is not available, matrix spike analysis, should be undertaken; and (6) chromatograms should be included in the analytical report. Within this context, and with a view to helping environmental managers evaluate analytical reports, this work aims to discuss the limitations of applying US EPA SW-846 methods to marine samples and the consequences of reporting data based on method detection limits (MDL) rather than sample quantitation limits (SQL), and to present possible modifications of the principal methods applied by laboratories in order to comply with environmental quality standards.
Abstract:
This in vivo study evaluated the osteogenic potential of two proteins, recombinant human bone morphogenetic protein-2 (rhBMP-2) and a protein extracted from natural latex (Hevea brasiliensis, P-1), and compared their effects on bone defects when combined with a carrier or a collagen gelatin. Eighty-four (84) Wistar rats were divided into two groups, with and without the use of collagen gelatin, and each of these was divided into six treatment groups of seven animals each. The treatment groups were: (1) 5 µg of pure rhBMP-2; (2) 5 µg of rhBMP-2/monoolein gel; (3) pure monoolein gel; (4) 5 µg of pure P-1; (5) 5 µg of P-1/monoolein gel; (6) critical bone defect control. The animals were anesthetized and a 6 mm diameter critical bone defect was made in the left posterior region of the parietal bone. Animals were submitted to intracardiac perfusion after 4 weeks and the calvaria tissue was removed for histomorphometric analysis. In this experimental study, it was concluded that rhBMP-2 allowed greater new bone formation than P-1 protein and that this process was more effective when the bone defect was covered with collagen gelatin (P < 0.05). Anat Rec, 293:794-801, 2010.
Abstract:
Objective. The goal of this paper is to undertake a literature search collecting all dentin bond strength data obtained for six adhesives with four tests (shear, microshear, tensile and microtensile) and to critically analyze the results with respect to average bond strength, coefficient of variation, mode of failure and product ranking. Method. A PubMed search was carried out for the years between 1998 and 2009, identifying publications on bond strength measurements of resin composite to dentin using four tests: shear, tensile, microshear and microtensile. The six adhesive resins were selected to cover three-step systems (OptiBond FL, Scotchbond Multi-Purpose Plus), two-step systems (Prime & Bond NT, Single Bond, Clearfil SE Bond) and one-step systems (Adper Prompt L-Pop). Results. Pooling results from 147 references showed an ongoing high scatter in the bond strength data regardless of which adhesive and which bond test was used. Coefficients of variation remained high (20-50%) even with the microbond test. The reported modes of failure for all tests still included a high number of cohesive failures. The ranking seemed to be dependent on the test used. Significance. The scatter in dentin bond strength data remains regardless of which test is used, confirming finite element analyses that predict non-uniform stress distributions due to a number of geometry, loading, material property and specimen preparation variables. This reopens the question of whether an interfacial fracture mechanics approach to analyzing the dentin-adhesive bond would be more appropriate for obtaining better agreement among dentin bond-related papers.
Abstract:
Under the framework of constraint-based modeling, genome-scale metabolic models (GSMMs) have been used for several tasks, such as metabolic engineering and phenotype prediction. More recently, their application in health-related research has spanned drug discovery, biomarker identification and host-pathogen interactions, targeting diseases such as cancer, Alzheimer's disease, obesity or diabetes. In recent years, the development of novel genome sequencing techniques and other high-throughput methods, together with advances in bioinformatics, has allowed the reconstruction of GSMMs for human cells. Considering the diversity of cell types and tissues present in the human body, it is imperative to develop tissue-specific metabolic models. Methods to automatically generate these models, based on generic human metabolic models and a plethora of omics data, have been proposed. However, their results have not yet been adequately and critically evaluated and compared. This work presents a survey of the most important tissue- or cell-type-specific metabolic model reconstruction methods, which use literature, transcriptomics, proteomics and metabolomics data, together with a global template model. As a case study, we analyzed the consistency between several omics data sources and reconstructed distinct metabolic models of hepatocytes using different methods and data sources as inputs. The results show that omics data sources have poor overlap and, in some cases, are even contradictory. Additionally, the hepatocyte metabolic models generated are in many cases unable to perform metabolic functions known to be present in liver tissue. We conclude that reliable methods for a priori omics data integration are required to support the reconstruction of complex models of human cells.
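As a reminder of what the constraint-based framework mentioned above involves, the sketch below runs a minimal flux balance analysis on a toy three-reaction network with scipy's linear programming solver. The network, bounds and objective are illustrative and bear no relation to the genome-scale hepatocyte models discussed.

    # Minimal flux balance analysis (FBA) sketch under the constraint-based framework
    # discussed above: maximize a "biomass" flux subject to steady state S v = 0 and
    # flux bounds. The toy network below is illustrative, not a genome-scale model.
    import numpy as np
    from scipy.optimize import linprog

    # Toy network with one metabolite A and three reactions:
    #   uptake:  -> A,   biomass: A ->,   leak: A ->
    S = np.array([[1.0, -1.0, -1.0]])        # stoichiometric matrix (metabolites x reactions)
    lb = [0.0, 0.0, 0.0]                     # lower flux bounds
    ub = [10.0, 1000.0, 1000.0]              # uptake capped at 10 units
    c = [0.0, 1.0, 0.0]                      # objective: biomass flux

    # linprog minimizes, so negate the objective to maximize biomass.
    res = linprog(c=[-x for x in c], A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=list(zip(lb, ub)), method="highs")
    print("optimal biomass flux:", res.x[1])  # expected: 10.0 (all uptake goes to biomass)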