985 results for Geotechnical charts
Abstract:
A model for the joint economic design of X̄ and R control charts is developed. This model assumes that the process is subject to two assignable causes: one shifts the process mean, the other shifts the process variance. The occurrence of an assignable cause of one kind does not block the occurrence of an assignable cause of the other kind, so a second process parameter can go out of control after the first has. A numerical study of the model's cost surface has revealed that it is convex, at least in the region of interest.
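As a toy illustration of the two-cause assumption (hypothetical shift sizes and function name, not values from the paper):

```python
def process_params(t, t_mean_cause, t_var_cause,
                   mu0=0.0, sigma0=1.0, delta=1.5, gamma=2.0):
    """Current process mean and standard deviation at time t, given the
    occurrence times of the two independent assignable causes.  One
    cause shifts the mean by delta*sigma0, the other multiplies the
    standard deviation by gamma; neither blocks the other, so both
    parameters can be out of control at once."""
    mu = mu0 + delta * sigma0 if t >= t_mean_cause else mu0
    sigma = sigma0 * gamma if t >= t_var_cause else sigma0
    return mu, sigma

# Before either cause, after the mean cause only, and after both:
process_params(0.5, 1.0, 2.0)   # in control
process_params(1.5, 1.0, 2.0)   # mean shifted
process_params(2.5, 1.0, 2.0)   # mean and variance both shifted
```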
Abstract:
Recent studies have shown that the X̄ chart with variable parameters (Vp X̄ chart) detects process shifts faster than the traditional X̄ chart. This article extends these studies to processes that are monitored by both X̄ and R charts. Basically, the X̄ and R values determine whether the control should be relaxed or tightened. When both the X̄ and R values fall in the central region, the control is relaxed: one waits longer to take the next sample and/or the next sample is smaller than usual. When the X̄ or R value falls in the warning region, the control is tightened: one waits less time to take the next sample and the next sample is larger than usual. The action limits are also made variable: this paper proposes drawing the action limits (for both charts) wider than usual when the control is relaxed and narrower than usual when the control is tightened. The Vp feature improves the performance of the joint X̄ and R control charts in terms of the speed with which shifts in the process mean and/or variance are detected. © 1998 IIE.
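The relax/tighten rule described above can be sketched as follows; the design triples (sample size n, sampling interval h, action-limit width k) are illustrative values, not the article's:

```python
def next_design(xbar_in_warning, r_in_warning,
                relaxed=(2, 2.0, 3.3), tightened=(10, 0.5, 2.7)):
    """Pick the next sampling design for the joint Vp X-bar/R scheme.
    Each design is a (sample size n, sampling interval h, action-limit
    width k) triple.  Relaxed (both points in the central region):
    smaller sample, longer wait, wider action limits.  Tightened (the
    X-bar OR R point falls in the warning region): larger sample,
    shorter wait, narrower action limits."""
    return tightened if (xbar_in_warning or r_in_warning) else relaxed
```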
Abstract:
The cone penetration test (CPT), together with its recent variant (CPTU), has become the most widely used in-situ testing technique for soil profiling and geotechnical characterization. The knowledge gained over the last decades on interpretation procedures in sands and clays is certainly extensive, whilst very few contributions can be found regarding the analysis of CPT(u) data in intermediate soils. Indeed, it is widely accepted that at the standard rate of penetration (v = 20 mm/s), drained penetration occurs in sands while undrained penetration occurs in clays. However, a problem arises when the available interpretation approaches are applied to cone measurements in silts, sandy silts, and silty or clayey sands, since such intermediate geomaterials are often characterized by permeability values within the range in which partial drainage is very likely to occur. Hence, applying the available and well-established interpretation procedures, developed for ‘standard’ clays and sands, may result in invalid estimates of soil parameters. This study aims at providing a better understanding of the interpretation of CPTU data in natural sand and silt mixtures, by taking into account two main aspects: 1) investigating the effect of penetration rate on piezocone measurements, with the aim of identifying drainage conditions when cone penetration is performed at a standard rate — this part of the thesis has been carried out with reference to a specific CPTU database recently collected in a liquefaction-prone area (Emilia-Romagna Region, Italy); 2) providing a better insight into the interpretation of piezocone tests in the widely studied silty sediments of the Venetian lagoon (Italy), where research has focused on the calibration and verification of site-specific correlations, with special reference to the estimation of compressibility parameters for the assessment of long-term settlements of the Venetian coastal defences.
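One common way in the literature to identify drainage conditions during penetration is the normalized penetration rate V = v·d/c_v; the classification limits below are illustrative values from published rate studies, not necessarily the criterion adopted in this thesis:

```python
def normalized_rate(v, d, cv):
    """Normalized penetration rate V = v*d/cv (dimensionless).
    v: penetration rate (m/s), d: cone diameter (m),
    cv: coefficient of consolidation (m^2/s)."""
    return v * d / cv

def drainage_condition(V, drained_limit=0.05, undrained_limit=30.0):
    """Rough drainage classification from V.  The two limits are
    assumed values from the literature, not this study's."""
    if V < drained_limit:
        return "drained"
    if V > undrained_limit:
        return "undrained"
    return "partially drained"

# Standard 10 cm^2 cone (d = 0.0357 m) pushed at v = 0.02 m/s in a
# low-permeability soil (cv = 1e-7 m^2/s) gives a very large V,
# i.e. undrained penetration:
V = normalized_rate(0.02, 0.0357, 1e-7)
```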
Abstract:
Uncertainty in the determination of the stratigraphic profile of natural soils is one of the main problems in geotechnics, in particular for landslide characterization and modeling. This study deals with a new approach to geotechnical modeling which relies on the stochastic generation of different soil layer distributions following a Boolean logic; the method has thus been called BoSG (Boolean Stochastic Generation). In this way, it is possible to randomize the presence of a specific material interdigitated in a uniform matrix. When building a geotechnical model, it is common to discard some stratigraphic data in order to simplify the model, assuming that the results of the modeling procedure would not be significantly affected. With the proposed technique it is possible to quantify the error associated with this simplification. Moreover, it can be used to determine the zones where further investigations and surveys would be most effective for building the geotechnical model of the slope. The commercial software FLAC was used for the 2D and 3D geotechnical models. The distribution of the materials was randomized through a specifically coded MATLAB program that automatically generates text files, each representing a specific soil configuration. In addition, a routine was designed to automate the FLAC computations over the different data files in order to maximize the sample size. The methodology is applied to a simplified slope in 2D, a simplified slope in 3D, and an actual landslide, namely the Mortisa mudslide (Cortina d’Ampezzo, BL, Italy). However, it could be extended to numerous other cases, especially hydrogeological analyses and landslide stability assessments, in different geological and geomorphological contexts.
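A minimal sketch of the BoSG idea, generating one stochastic realization of a material interdigitated in a uniform matrix and serializing it to a text file body; the grid representation, lens probability, and file format are assumptions for illustration, not the paper's actual MATLAB/FLAC interface:

```python
import random

def generate_configuration(rows, cols, p_lens=0.2, seed=None):
    """One stochastic realization of a gridded soil model: each cell
    holds the uniform matrix material (0) unless a lens of the
    interdigitated material (1) is switched on there by a Boolean
    draw.  p_lens is an assumed lens probability."""
    rng = random.Random(seed)
    return [[1 if rng.random() < p_lens else 0 for _ in range(cols)]
            for _ in range(rows)]

def to_text_file_body(grid):
    """Serialize a realization to a plain-text body, one grid row per
    line, mimicking the automatically generated text files that feed
    the FLAC runs described above."""
    return "\n".join(" ".join(str(c) for c in row) for row in grid)

# Many seeded realizations would then be run through the solver to
# quantify the error introduced by stratigraphic simplification:
realization = generate_configuration(3, 4, p_lens=0.5, seed=1)
body = to_text_file_body(realization)
```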
Abstract:
This project resulted in a chart illustrating connections in social networks.
Abstract:
Bovine spongiform encephalopathy (BSE) rapid tests and routine BSE-testing laboratories are subject to strict regulations for approval. Due to the lack of BSE-positive control samples, however, full assay validation at the level of individual test runs and continuous monitoring of test performance on-site are difficult. Most rapid tests use synthetic prion protein peptides, but it is not known to what extent they reflect the assay performance on field samples, and whether they are sufficient to indicate on-site assay quality problems. To address this question, we compared the test scores of the provided kit peptide controls to those of standardized weak positive BSE tissue samples in individual test runs, as well as continuously over time by quality control charts, in two widely used BSE rapid tests. Our results reveal only a weak correlation between the weak positive tissue control and the peptide control scores. We identified kit-lot-related shifts in assay performance that were not reflected by the peptide control scores. Conversely, not all shifts indicated by the peptide control scores reflected a shift in assay performance. In conclusion, these data highlight that the use of the kit peptide controls for continuous quality control purposes may result in unjustified rejection or acceptance of test runs. However, standardized weak positive tissue controls in combination with Shewhart-CUSUM control charts appear to be reliable for continuously monitoring assay performance on-site to identify undesired deviations.
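A combined Shewhart-CUSUM monitor of control-sample scores can be sketched as follows; the reference value k, decision interval h, and Shewhart limit are textbook defaults in sigma units, not the laboratory's actual settings:

```python
def shewhart_cusum(scores, mu, sigma, k=0.5, h=4.0, shewhart_z=3.0):
    """Flag each control-sample score that signals a deviation.
    Standard two-sided tabular CUSUM (reference value k, decision
    interval h, both in sigma units) combined with a Shewhart rule
    on single standardized scores.  Returns one boolean per score."""
    c_plus = c_minus = 0.0
    flags = []
    for x in scores:
        z = (x - mu) / sigma
        c_plus = max(0.0, c_plus + z - k)    # upward drift accumulator
        c_minus = max(0.0, c_minus - z - k)  # downward drift accumulator
        flags.append(c_plus > h or c_minus > h or abs(z) > shewhart_z)
    return flags
```

The CUSUM part catches slow drifts such as the kit-lot-related shifts described above, while the Shewhart rule catches single gross outliers.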
Abstract:
BACKGROUND HIV-1 RNA viral load (VL) testing is recommended to monitor antiretroviral therapy (ART) but is not available in many resource-limited settings. We developed and validated CD4-based risk charts to guide targeted VL testing. METHODS We modeled the probability of virologic failure up to 5 years of ART based on current and baseline CD4 counts, developed decision rules for targeted VL testing of 10%, 20%, or 40% of patients in seven cohorts of patients starting ART in South Africa, and plotted cut-offs for VL testing on colour-coded risk charts. We assessed the accuracy of risk-chart-guided VL testing to detect virologic failure in validation cohorts from South Africa, Zambia, and the Asia-Pacific. FINDINGS 31,450 adult patients were included in the derivation cohorts and 25,294 patients in the validation cohorts. Positive predictive values increased with the percentage of patients tested: from 79% (10% tested) to 98% (40% tested) in the South African, from 64% to 93% in the Zambian, and from 73% to 96% in the Asia-Pacific cohorts. Corresponding increases in sensitivity were from 35% to 68% in South Africa, from 55% to 82% in Zambia, and from 37% to 71% in the Asia-Pacific. The area under the receiver operating characteristic curve increased from 0.75 to 0.91 in South Africa, from 0.76 to 0.91 in Zambia, and from 0.77 to 0.92 in the Asia-Pacific. INTERPRETATION CD4-based risk charts with optimal cut-offs for targeted VL testing may be useful to monitor ART in settings where VL capacity is limited.
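The two accuracy measures reported above can be computed from the confusion matrix of a targeted-testing rule; the counts used in the example are illustrative, not the study's data:

```python
def ppv_and_sensitivity(tp, fp, fn):
    """Positive predictive value and sensitivity of a targeted VL
    testing rule: tp = flagged for testing and truly failing,
    fp = flagged but not failing, fn = failing but not flagged."""
    ppv = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    return ppv, sensitivity

# Illustrative counts (not the study's data):
ppv, sens = ppv_and_sensitivity(tp=80, fp=20, fn=120)  # 0.8, 0.4
```

Testing a larger share of patients flags more of the true failures (raising sensitivity) while, as in the cohorts above, the flagged group can simultaneously become more enriched for failure (raising PPV).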
Abstract:
This investigation compares two different methodologies for calculating the national cost of epilepsy: the provider-based survey method (PBSM) and the patient-based medical charts and billing method (PBMC&BM). The PBSM uses the National Hospital Discharge Survey (NHDS), the National Hospital Ambulatory Medical Care Survey (NHAMCS), and the National Ambulatory Medical Care Survey (NAMCS) as the sources of utilization. The PBMC&BM uses patient data, charts and billings, to determine utilization rates for specific components of hospital, physician, and drug prescriptions.

The 1995 hospital and physician cost of epilepsy is estimated to be $722 million using the PBSM and $1,058 million using the PBMC&BM. The difference of $336 million results from a $136 million difference in utilization and a $200 million difference in unit cost.

Utilization. The utilization difference of $136 million is composed of an inpatient variation of $129 million ($100 million hospital and $29 million physician) and an ambulatory variation of $7 million. The $100 million hospital variance is attributed to the inclusion of febrile seizures in the PBSM, $−79 million, and the exclusion of admissions attributed to epilepsy, $179 million. The former suggests that the diagnostic codes used in the NHDS may not properly match the current definition of epilepsy as used in the PBMC&BM. The latter suggests NHDS errors in the attribution of an admission to the principal diagnosis.

The $29 million variance in inpatient physician utilization is the result of different per-day-of-care physician visit rates, 1.3 for the PBMC&BM versus 1.0 for the PBSM. The absence of visit-frequency measures in the NHDS affects the internal validity of the PBSM estimate and requires the investigator to make conservative assumptions.

The remaining ambulatory resource utilization variance is $7 million. Of this amount, $22 million is the result of an underestimate of ancillaries in the NHAMCS and NAMCS extrapolations using the patient visit weight.

Unit cost. The resource cost variation is $200 million: inpatient is $22 million and ambulatory is $178 million. The inpatient variation of $22 million is composed of $19 million in hospital per-day rates, due to a higher cost per day in the PBMC&BM, and $3 million in physician visit rates, due to a higher cost per visit in the PBMC&BM.

The ambulatory cost variance is $178 million, composed of higher per-physician-visit costs of $97 million and higher per-ancillary costs of $81 million. Both are attributed to the PBMC&BM's precise identification of resource utilization, which permits accurate valuation.

Conclusion. Both methods have specific limitations. The PBSM's strengths are its sample designs, which lead to nationally representative estimates and permit statistical point and confidence-interval estimation for certain variables under investigation. However, the findings of this investigation suggest that the internal validity of the derived estimates is questionable and that important additional information required to precisely estimate the cost of an illness is absent.

The PBMC&BM is a superior method for identifying the resources utilized in the physician encounter with the patient, permitting more accurate valuation. However, the PBMC&BM does not have the statistical reliability of the PBSM; it relies on synthesized national prevalence estimates to extrapolate a national cost estimate. While precision is important, the ability to generalize to the nation may be limited due to the small number of patients that are followed.
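The dollar-figure decomposition above is internally consistent, as a quick arithmetic check shows (all figures in $ millions, taken from the text):

```python
# All figures in $ millions, taken from the abstract above.
pbsm_total, pbmcbm_total = 722, 1058
total_diff = pbmcbm_total - pbsm_total        # 336

# Utilization difference: inpatient (hospital + physician) + ambulatory.
hospital = -79 + 179                          # febrile seizures / attribution
inpatient = hospital + 29                     # 129
utilization = inpatient + 7                   # 136

# Unit-cost difference: inpatient (per-day + per-visit) + ambulatory.
inpatient_cost = 19 + 3                       # 22
ambulatory_cost = 97 + 81                     # 178
unit_cost = inpatient_cost + ambulatory_cost  # 200

assert utilization + unit_cost == total_diff  # 136 + 200 == 336
```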
Abstract:
Delineating the interrelationships between tectonics, sedimentation, and geotechnical properties is particularly important for areas subjected to the dynamic effects of convergence. DSDP Leg 66 drilling within the Middle America Trench complex provided a unique opportunity to investigate these interrelationships along a transect of eight drill sites beginning on the trench outer slope and traversing the trench, trench inner slope, and upper continental slope. Investigations of other convergent margins suggest that deformation occurs most rapidly along the lower trench inner slope and is reflected in the geotechnical properties (Carson, 1977; Seely, 1977; von Huene, 1979). This study focuses on the geotechnical properties of Middle America Trench sediments and the possible effects of convergence on these properties.
Abstract:
Sediment composition and rate of deposition are the primary factors responsible for determining the spatial distribution of geotechnical properties on the Vøring Plateau. Grain size and depth of burial have no significant influence. Vertical and lateral changes in geotechnical properties are associated with vertical and lateral composition changes, in which biogenic silica is the most important variable. Anomalous trends of decreasing density and increasing porosity and water content with depth are associated with increasing silica content downsection. Void ratios, inferred in-situ permeability, and the change in void ratio during consolidation testing are relatively high in siliceous sediments and tend to increase as the biogenic silica content increases. Portions of the section are overconsolidated, probably as a result of changes in sediment accumulation rates. However, the higher permeabilities of siliceous sediments may also be a factor influencing consolidation state.
Abstract:
The geotechnical characteristics of 22 sediment samples from Leg 84 sites were studied in an effort to associate these with processes active along the Middle America slope and with sedimentation mechanisms. Geotechnical properties measured include water content, porosity, bulk density, Atterberg limits, consolidation characteristics, permeability, and vane shear strength. A majority of samples obtained from Sites 565, 568, and 570 show significant disturbance resulting from degassing. This disturbance apparently results in underconsolidation, although other mechanisms such as excess pore pressures generated from the subduction process can also contribute to this state. Overconsolidated sediments were found at Sites 565, 566, and 569. The overconsolidated sediments at Sites 565 and 569 may result from downslope transport mechanisms rearranging and stressing the sediment mass under consideration. The sediment condition at Site 566 is probably a result of eroded overburden: an estimated 87 m of overlying sediments may have been removed. Geotechnical and permeability relationships with depth are consistent with those found for other hemipelagic sediments of silty clay to clayey silt textures.
Abstract:
The hydraulic piston coring device (HPC-15) allows recovery of deep-ocean sediments with minimal disturbance. The device was used during Leg 72 of the Deep Sea Drilling Project (DSDP) aboard the Glomar Challenger. Core samples were recovered from bore holes in the Rio Grande Rise in the southwest Atlantic Ocean. Relatively undisturbed sediment cores were obtained from Holes 515A, 516, 517, and 518. The results of shipboard physical property measurements and on-shore geotechnical laboratory tests on these cores are presented in this chapter. A limited number of 0.3 m cores were obtained and used in a series of geotechnical tests, including one-dimensional consolidation, direct shear, Atterberg limit, particle-size analysis, and specific gravity tests. Throughout the testing program, attention was focused on assessment of the sample disturbance associated with the HPC-15 coring device. The HPC-15 device limits sample disturbance reasonably well in terrigenous muds (clays). However, the disturbance associated with coring calcareous sediments (nannofossil-foraminifer oozes) is severe: because of their noncohesive, granular behavior, the calcareous sediments were vulnerable to severe disturbance with the sampling-head design in use at the time of Leg 72. A number of modifications to the sampling head design are recommended and discussed in this chapter. The modifications will improve sample quality for testing purposes and provide longer unbroken core samples by reducing friction between the sediment column and the sampling tool.
Abstract:
This paper presents a geotechnical characterization of the glacigenic sediments in Prydz Bay, East Antarctica, based on the shipboard physical properties data obtained during Leg 119, combined with the results of land-based analyses of 24 whole-round core samples. Main emphasis is placed on the land-based studies, which included oedometer consolidation tests, triaxial and simple shear tests for undrained shear strength, permeability tests in the oedometer and triaxial cell, Atterberg limits, and grain-size analyses. The bulk of the tested sediments comprise overconsolidated diamictites of a relatively uniform lithology. The overconsolidation results from a combination of glacial loading and sediment overburden subsequently removed by extensive glacial erosion of the shelf. This leads to downhole profiles of physical properties that have been observed not to change as a function of the thickness of the present overburden. A number of fluctuations in the parameters show a relatively systematic trend and most likely result from changes in the proximity to the ice-sheet grounding line in response to variations in the glacial regime. Very low permeabilities mainly result from high preconsolidation stresses (Pc'). Pc' values up to 10,000 kPa were estimated from the oedometer tests, and empirical estimates based on undrained shear strengths (up to 2500 kPa) indicate that the oedometer results are conservative. The diamictites generally classify as inactive and of low to medium plasticity, and they consolidate with little deformation, even when subjected to great stresses. This is the first report of geotechnical data from deep boreholes on the Antarctic continental shelf, but material of similar character can also be expected in other areas around the Antarctic.
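The overconsolidation and the strength-based check described above can be sketched as follows; the overconsolidation ratio is the standard definition, while the su/Pc' ratio of 0.22 is an assumed Mesri-type value, not necessarily the one used in this study:

```python
def ocr(pc_kpa, sigma_v0_kpa):
    """Overconsolidation ratio: preconsolidation stress Pc' divided by
    the present in-situ vertical effective stress.  OCR > 1 means part
    of the former load (here, glacial loading and eroded overburden)
    has been removed."""
    return pc_kpa / sigma_v0_kpa

def pc_from_su(su_kpa, ratio=0.22):
    """Empirical preconsolidation stress from undrained shear strength,
    using su ≈ ratio * Pc' (the Mesri-type ratio 0.22 is an assumption,
    not the correlation used in this study)."""
    return su_kpa / ratio

# With su up to 2500 kPa, this kind of empirical estimate exceeds the
# ~10,000 kPa oedometer values, consistent with the statement that the
# oedometer results are conservative:
pc_estimate = pc_from_su(2500.0)
```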