163 results for structured data


Relevance: 20.00%

Abstract:

Seven hundred and nineteen samples from throughout the Cainozoic section in CRP-3 were analysed by a Malvern Mastersizer laser particle analyser, in order to derive a stratigraphic distribution of grain-size parameters downhole. Entropy analysis of these data (using the method of Woolfe and Michibayashi, 1995) allowed recognition of four groups of samples, each group characterised by a distinctive grain-size distribution. Group 1, which shows a multi-modal distribution, corresponds to mudrocks, interbedded mudrock/sandstone facies, muddy sandstones and diamictites. Group 2, with a sand-grade mode but showing wide dispersion of particle size, corresponds to muddy sandstones, a few cleaner sandstones and some conglomerates. Group 3 and Group 4 are also sand-dominated, with better grain-size sorting, and correspond to clean, well-washed sandstones of varying mean grain-size (medium and fine modes, respectively). The downhole disappearance of Group 1 and the dominance of Groups 3 and 4 reflect a concomitant change from mudrock- and diamictite-rich lithology to a section dominated by clean, well-washed sandstones with minor conglomerates. Progressive downhole increases in percentage sand and principal mode also reflect these changes. Significant shifts in grain-size parameters and entropy group membership were noted across sequence boundaries and seismic reflectors, as recognised in other studies.
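
A minimal sketch of the kind of workflow the abstract describes: compute an entropy measure for each grain-size distribution and group samples by the shape of their distributions. The synthetic data, the 32 size classes, and the use of k-means as the grouping step are assumptions for illustration only; they are not the Woolfe and Michibayashi (1995) procedure or the CRP-3 dataset.

```python
# Hypothetical sketch: entropy-based grouping of grain-size distributions.
# 719 samples matches the abstract; the 32 size classes, random data and
# k-means grouping are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_samples, n_size_classes = 719, 32
raw = rng.random((n_samples, n_size_classes))
props = raw / raw.sum(axis=1, keepdims=True)  # each row: proportion in each grain-size class

def shannon_entropy(p, eps=1e-12):
    """Entropy of one grain-size distribution; low values indicate well-sorted sediment."""
    p = np.clip(p, eps, None)
    return -(p * np.log(p)).sum()

entropies = np.apply_along_axis(shannon_entropy, 1, props)

# Group samples by the shape of their distributions (four groups, as in the abstract).
groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(props)
for g in range(4):
    print(f"Group {g + 1}: n={np.sum(groups == g)}, mean entropy={entropies[groups == g].mean():.2f}")
```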

Relevance: 20.00%

Abstract:

Objective. The diagnostic value of tests for antimyeloperoxidase antibodies (anti-MPO) for systemic vasculitis is less established than that for cytoplasmic antineutrophil cytoplasmic antibody (cANCA)/antiproteinase 3 antibodies (anti-PR3). Controversy exists regarding the optimal utilization of indirect immunofluorescence (IIF) ANCA testing versus antigen-specific ANCA testing. To summarize the pertinent data, we conducted a metaanalysis examining the diagnostic value of ANCA testing systems that include assays for anti-MPO. Methods. We performed a structured Medline search and reference list review. Target articles in the search strategy were those reporting the diagnostic value of immunoassays for anti-MPO for the spectrum of systemic necrotizing vasculitides that includes Wegener's granulomatosis, microscopic polyangiitis, the Churg-Strauss syndrome, and isolated pauci-immune necrotizing or crescentic glomerulonephritis, regardless of other types of ANCA tests. Inclusion criteria required specification of a consecutive or random patient selection method and the use of acceptable criteria for the diagnosis of vasculitis exclusive of ANCA test results. Weighted pooled summary estimates of sensitivity and specificity were calculated for anti-MPO alone, anti-MPO + perinuclear ANCA (pANCA), and anti-MPO/pANCA + anti-PR3/cANCA. Results. Of 457 articles reviewed, only 7 met the selection criteria. Summary estimates of sensitivity and specificity (against disease controls only) of assays for anti-MPO for the diagnosis of systemic necrotizing vasculitides were 37.1% (confidence interval 26.6% to 47.6%) and 96.3% (CI 94.1% to 98.5%), respectively. When the pANCA pattern by IIF was combined with anti-MPO testing, the specificity improved to 99.4%, with a lower sensitivity, 31.5%. The combined ANCA testing system (anti-PR3/cANCA + anti-MPO/pANCA) increased the sensitivity to 85.5% with a specificity of 98.6%. Conclusion. These results suggest that while anti-MPO is relatively specific for the diagnosis of systemic vasculitis, the combination system of immunoassays for anti-MPO and IIF for pANCA is highly specific and both tests should be used together given the high diagnostic precision required for these conditions. Because patients with ANCA associated vasculitis have either anti-MPO with pANCA or anti-PR3 with cANCA, and rarely both, a combined ANCA testing system including anti-PR3/cANCA and anti-MPO/pANCA is recommended to optimize the diagnostic performance of ANCA testing. (J Rheumatol 2001;28:1584-90)
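
A hedged sketch of the weighted-pooling step behind summary sensitivity and specificity estimates like those reported above. The per-study counts below are invented placeholders (not the seven included studies), and simple sample-size weighting with a normal-approximation confidence interval is just one common pooling choice.

```python
# Sketch of sample-size-weighted pooling of sensitivity and specificity across studies.
# The per-study figures are invented placeholders, not the studies in the metaanalysis.
import math

# tp/n_dis drive sensitivity; tn/n_ctrl drive specificity
studies = [
    {"tp": 12, "n_dis": 30, "tn": 95, "n_ctrl": 100},
    {"tp": 20, "n_dis": 55, "tn": 140, "n_ctrl": 145},
    {"tp": 9,  "n_dis": 28, "tn": 60,  "n_ctrl": 62},
]

def pooled(numerators, denominators):
    """Weighted pooled proportion with a simple normal-approximation 95% CI."""
    n = sum(denominators)
    p = sum(numerators) / n
    se = math.sqrt(p * (1 - p) / n)
    return p, (p - 1.96 * se, p + 1.96 * se)

sens, sens_ci = pooled([s["tp"] for s in studies], [s["n_dis"] for s in studies])
spec, spec_ci = pooled([s["tn"] for s in studies], [s["n_ctrl"] for s in studies])
print(f"pooled sensitivity {sens:.1%} (95% CI {sens_ci[0]:.1%} to {sens_ci[1]:.1%})")
print(f"pooled specificity {spec:.1%} (95% CI {spec_ci[0]:.1%} to {spec_ci[1]:.1%})")
```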

Relevance: 20.00%

Abstract:

Using NONMEM, the population pharmacokinetics of perhexiline were studied in 88 patients (34 F, 54 M) who were being treated for refractory angina. Their mean +/- SD (range) age was 75 +/- 9.9 years (46-92), and the length of perhexiline treatment was 56 +/- 77 weeks (0.3-416). The sampling time after a dose was 14.1 +/- 21.4 hours (0.5-200), and the perhexiline plasma concentrations were 0.39 +/- 0.32 mg/L (0.03-1.56). A one-compartment model with first-order absorption was fitted to the data using the first-order (FO) approximation. The best model contained 2 subpopulations (obtained via the $MIXTURE subroutine) of 77 subjects (subgroup A) and 11 subjects (subgroup B) that had typical values for clearance (CL/F) of 21.8 L/h and 2.06 L/h, respectively. The volumes of distribution (V/F) were 1470 L and 260 L, respectively, which suggested a reduction in presystemic metabolism in subgroup B. The interindividual variability (CV%) was modeled logarithmically and for CL/F ranged from 69.1% (subgroup A) to 86.3% (subgroup B). The interindividual variability in V/F was 111%. The residual variability unexplained by the population model was 28.2%. These results confirm and extend the existing pharmacokinetic data on perhexiline, especially the bimodal distribution of CL/F manifested via an inherited deficiency in hepatic and extrahepatic CYP2D6 activity.
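
For readers unfamiliar with the model structure, the following sketch evaluates a one-compartment, first-order-absorption concentration curve for the two reported subpopulations. The CL/F and V/F values come from the abstract; the dose, absorption rate constant (ka) and sampling times are illustrative assumptions, and this is not the NONMEM $MIXTURE analysis itself.

```python
# Sketch of the one-compartment, first-order-absorption concentration curve for the two
# reported subpopulations. Dose, ka and times are assumed; CL/F and V/F are from the abstract.
import math

def conc(t_h, dose_mg, ka, cl_f, v_f):
    """Plasma concentration (mg/L) after a single oral dose, one-compartment model."""
    ke = cl_f / v_f
    return (dose_mg * ka) / (v_f * (ka - ke)) * (math.exp(-ke * t_h) - math.exp(-ka * t_h))

dose, ka = 100.0, 0.5  # mg and 1/h -- illustrative assumptions
subpops = {"subgroup A": (21.8, 1470.0),   # CL/F (L/h), V/F (L) from the abstract
           "subgroup B": (2.06, 260.0)}

for label, (cl_f, v_f) in subpops.items():
    profile = [conc(t, dose, ka, cl_f, v_f) for t in (2, 8, 24, 48)]
    print(label, [f"{c:.3f}" for c in profile], "mg/L at 2, 8, 24, 48 h")
```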

Relevance: 20.00%

Abstract:

When the data consist of certain attributes measured on the same set of items in different situations, they can be described as a three-mode three-way array. A mixture likelihood approach can be implemented to cluster the items (i.e., one of the modes) on the basis of both of the other modes simultaneously (i.e., the attributes measured in different situations). In this paper, it is shown that this approach can be extended to handle three-mode three-way arrays where some of the data values are missing at random in the sense of Little and Rubin (1987). The methodology is illustrated by clustering the genotypes in a three-way soybean data set where various attributes were measured on genotypes grown in several environments.
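
A rough sketch of the general idea: flatten the attribute-by-situation measurements for each item into one feature vector, fit a normal mixture model, and handle values that are missing at random. The paper incorporates the missing data directly into the EM algorithm; the iterate-impute-and-refit loop, the three components and the synthetic data below are simplifying assumptions.

```python
# Rough sketch of mixture-model clustering of a three-way (item x attribute x situation) array
# with values missing at random. The paper folds missingness into EM itself; here that is
# approximated by a simple iterate-impute-and-refit loop on synthetic data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
n_items, n_attr, n_sit = 60, 4, 3
X = rng.normal(size=(n_items, n_attr, n_sit)).reshape(n_items, -1)  # flatten attributes x situations
X[rng.random(X.shape) < 0.1] = np.nan                               # ~10% missing at random

filled = np.where(np.isnan(X), np.nanmean(X, axis=0), X)            # start from column means
for _ in range(10):
    gmm = GaussianMixture(n_components=3, covariance_type="diag", random_state=0).fit(filled)
    labels = gmm.predict(filled)
    # re-impute each missing cell with its cluster's mean, then refit on the next pass
    for k in range(3):
        rows = labels == k
        filled[rows] = np.where(np.isnan(X[rows]), gmm.means_[k], X[rows])

print(np.bincount(labels))  # cluster sizes
```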

Relevance: 20.00%

Abstract:

Regional planners, policy makers and policing agencies all recognize the importance of better understanding the dynamics of crime. Theoretical and application-oriented approaches which provide insights into why and where crimes take place are much sought after. Geographic information systems and spatial analysis techniques, in particular, are proving to be essential for studying criminal activity. However, the capabilities of these quantitative methods continue to evolve. This paper explores the use of geographic information systems and spatial analysis approaches for examining crime occurrence in Brisbane, Australia. The analysis highlights novel capabilities for the analysis of crime in urban regions.
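
One spatial-analysis technique commonly used for this kind of crime mapping is kernel density ("hotspot") estimation; the sketch below shows the idea on synthetic point data. The coordinates, bandwidth and grid are assumptions and do not reproduce the Brisbane analysis or the GIS workflow used in the paper.

```python
# Illustrative sketch of kernel density "hotspot" mapping of crime point data.
# The coordinates are synthetic; this is not the paper's Brisbane dataset.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
# pretend eastings/northings (km) of recorded offences, with a denser inner-city cluster
pts = np.vstack([rng.normal([5, 5], 0.5, size=(300, 2)),
                 rng.normal([12, 8], 2.0, size=(200, 2))]).T  # shape (2, n), as gaussian_kde expects

kde = gaussian_kde(pts)
xg, yg = np.meshgrid(np.linspace(0, 20, 50), np.linspace(0, 20, 50))
density = kde(np.vstack([xg.ravel(), yg.ravel()])).reshape(xg.shape)

peak = np.unravel_index(density.argmax(), density.shape)
print("highest-density cell near", xg[peak], yg[peak], "km")
```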

Relevance: 20.00%

Abstract:

The present paper addresses two major concerns that were identified when developing neural network based prediction models and which can limit their wider applicability in the industry. The first problem is that it appears neural network models are not readily available to a corrosion engineer. Therefore the first part of this paper describes a neural network model of CO2 corrosion which was created using a standard commercial software package and simple modelling strategies. It was found that such a model was able to capture practically all of the trends noticed in the experimental data with acceptable accuracy. This exercise has proven that a corrosion engineer could readily develop a neural network model such as the one described below for any problem at hand, given that sufficient experimental data exist. This applies even in the cases when the understanding of the underlying processes is poor. The second problem arises in cases when all the required inputs for a model are not known or can only be estimated with a limited degree of accuracy. It seems advantageous to have models that can take as input a range rather than a single value. One such model, based on the so-called Monte Carlo approach, is presented. A number of comparisons are shown which illustrate how a corrosion engineer might use this approach to rapidly test the sensitivity of a model to the uncertainties associated with the input parameters. (C) 2001 Elsevier Science Ltd. All rights reserved.
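
The two ideas in the abstract can be illustrated with a short sketch: fit a neural network to corrosion-rate data with an off-the-shelf package, then propagate input ranges through it by Monte Carlo sampling. The training data, input variables, ranges and network size below are all invented for illustration; they are not the model described in the paper.

```python
# Sketch: (1) a neural network fitted to corrosion-rate data with a standard package,
# (2) Monte Carlo propagation of uncertain inputs through that model. All numbers invented.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
# pretend experimental inputs: temperature (C), pH, CO2 partial pressure (bar)
X = rng.uniform([20, 3.5, 0.1], [90, 6.5, 10.0], size=(400, 3))
y = 0.05 * X[:, 0] - 2.0 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(0, 0.3, 400)  # synthetic "corrosion rate"

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X, y)

# Monte Carlo: the engineer only knows ranges, so sample inputs and inspect the spread of predictions.
samples = rng.uniform([55, 4.5, 1.0], [65, 5.5, 3.0], size=(2000, 3))
pred = model.predict(samples)
print(f"predicted rate: median {np.median(pred):.2f}, 5th-95th percentile "
      f"{np.percentile(pred, 5):.2f} to {np.percentile(pred, 95):.2f}")
```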

Relevance: 20.00%

Abstract:

The cost and risk associated with mineral exploration in Australia increase significantly as companies move into deeper regolith-covered terrain. The ability to map the bedrock and the depth of weathering within an area has the potential to decrease this risk and increase the effectiveness of exploration programs. This paper is the second in a trilogy concerning the Grant's Patch area of the Eastern Goldfields. The recent development of the VPmg potential field inversion program, in conjunction with the acquisition of high-resolution gravity data over an area with extensive drilling, provided an opportunity to evaluate three-dimensional gravity inversion as a bedrock and regolith mapping tool. An apparent density model of the study area was constructed, with the ground represented as adjoining 200 m by 200 m vertical rectangular prisms. During inversion VPmg incrementally adjusted the density of each prism until the free-air gravity response of the model replicated the observed data. For the Grant's Patch study area, this image of the apparent density values proved easier to interpret than the Bouguer gravity image. A regolith layer was introduced into the model and realistic fresh-rock densities assigned to each basement prism according to its interpreted lithology. With the basement and regolith densities fixed, the VPmg inversion algorithm adjusted the depth to fresh basement until the misfit between the calculated and observed gravity response was minimised. The resulting geometry of the bedrock/regolith contact largely replicated the base of weathering indicated by drilling, with predicted depth-of-weathering values from gravity inversion typically within 15% of those logged during RAB and RC drilling.
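
The iterative "adjust prism densities until the calculated gravity matches the observed data" idea can be caricatured as follows. A Bouguer slab approximation stands in for a proper prism forward model, and all numbers are invented, so this is only a schematic of the apparent-density stage, not VPmg.

```python
# Very simplified sketch of iterative density adjustment against observed gravity.
# A Bouguer slab approximation replaces the real prism forward model; numbers are invented.
import numpy as np

G = 6.674e-11          # gravitational constant, SI units
thickness = 400.0      # m, assumed prism height
rng = np.random.default_rng(4)

def slab(rho):
    """Gravity (mGal) directly above an infinite slab of density rho and fixed thickness."""
    return 2 * np.pi * G * rho * thickness * 1e5   # 1 m/s^2 = 1e5 mGal

true_density = rng.uniform(2200, 2900, size=50)    # kg/m^3, one value per prism (synthetic)
observed = slab(true_density)

density = np.full(50, 2670.0)                      # start from a standard crustal density
for _ in range(100):
    misfit = observed - slab(density)              # mGal
    density += misfit / (2 * np.pi * G * thickness * 1e5)   # increment density to reduce local misfit

print(f"max density error after inversion: {np.max(np.abs(density - true_density)):.3f} kg/m^3")
```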

Relevance: 20.00%

Abstract:

The magnitude of genotype-by-management (G x M) interactions for grain yield and grain protein concentration was examined in a multi-environment trial (MET) involving a diverse set of 272 advanced breeding lines from the Queensland wheat breeding program. The MET was structured as a series of management-regimes imposed at 3 sites for 2 years. The management-regimes were generated at each site-year as separate trials in which planting time, N fertiliser application rate, cropping history, and irrigation were manipulated. Irrigation was used to simulate different rainfall regimes. From the combined analysis of variance, the G x M interaction variance components were found to be the largest source of G x E interaction variation for both grain yield (0.117 +/- 0.005 t² ha⁻²; 49% of total G x E, 0.238 +/- 0.028 t² ha⁻²) and grain protein concentration (0.445 +/- 0.020 %²; 82% of total G x E, 0.546 +/- 0.057 %²), and in both cases this source of variation was larger than the genotypic variance component (grain yield 0.068 +/- 0.014 t² ha⁻² and grain protein 0.203 +/- 0.026 %²). The genotypic correlation between the traits varied considerably with management-regime, ranging from -0.98 to -0.31, with an estimate of 0.0 for one trial. Pattern analysis identified advanced breeding lines with improved grain yield and grain protein concentration relative to the cultivars Hartog, Sunco and Meteor. It is likely that a large component of the previously documented G x E interactions for grain yield of wheat in the northern grains region is in part a result of G x M interactions. The implications of the strong influence of G x M interactions for the conduct of wheat breeding METs in the northern region are discussed. (C) 2001 Elsevier Science B.V. All rights reserved.
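
The headline proportions in the abstract follow directly from the reported variance components, as the short calculation below shows (the component values are taken from the abstract; the output labels are mine).

```python
# Working the variance-component arithmetic reported in the abstract: the share of total G x E
# variation attributable to G x M, and how each G x M component compares with the genotypic one.
components = {
    "grain yield (t^2/ha^2)": {"GxM": 0.117, "total GxE": 0.238, "genotypic": 0.068},
    "grain protein (%^2)":    {"GxM": 0.445, "total GxE": 0.546, "genotypic": 0.203},
}
for trait, c in components.items():
    share = c["GxM"] / c["total GxE"]
    ratio = c["GxM"] / c["genotypic"]
    print(f"{trait}: G x M is {share:.0%} of G x E and {ratio:.1f}x the genotypic variance")
```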

Relevance: 20.00%

Abstract:

Performance indicators in the public sector have often been criticised for being inadequate and not conducive to analysing efficiency. The main objective of this study is to use data envelopment analysis (DEA) to examine the relative efficiency of Australian universities. Three performance models are developed, namely, overall performance, performance on delivery of educational services, and performance on fee-paying enrolments. The findings based on 1995 data show that the university sector was performing well on technical and scale efficiency but there was room for improving performance on fee-paying enrolments. There were also small slacks in input utilisation. More universities were operating at decreasing returns to scale, indicating a potential to downsize. DEA helps in identifying the reference sets for inefficient institutions and objectively determines productivity improvements. As such, it can be a valuable benchmarking tool for educational administrators and assist in more efficient allocation of scarce resources. In the absence of market mechanisms to price educational outputs, which renders traditional production or cost functions inappropriate, universities are particularly obliged to seek alternative efficiency analysis methods such as DEA.
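
For readers unfamiliar with DEA, the sketch below solves the input-oriented CCR envelopment problem as a linear program for a handful of hypothetical universities. The inputs, outputs and figures are invented placeholders rather than the 1995 data, and the study's three specific performance models are not reproduced.

```python
# Minimal sketch of an input-oriented CCR DEA model solved as a linear program.
# Inputs/outputs and all figures are invented placeholders, not the study's 1995 data.
import numpy as np
from scipy.optimize import linprog

inputs = np.array([[400, 1200,  90],    # e.g. staff, expenditure, floor space -- one row per university
                   [300,  900,  70],
                   [500, 1500, 120],
                   [250,  700,  50]], dtype=float)
outputs = np.array([[5000, 300],        # e.g. student load, research output
                    [4200, 250],
                    [5500, 280],
                    [3000, 220]], dtype=float)

def ccr_efficiency(o):
    """Efficiency of university o: minimise theta such that some weighted peer mix uses
    at most theta times its inputs while producing at least its outputs."""
    n = inputs.shape[0]
    c = np.r_[1.0, np.zeros(n)]                                   # minimise theta
    A_in = np.hstack([-inputs[[o]].T, inputs.T])                  # sum_j lam_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((outputs.shape[1], 1)), -outputs.T])  # -sum_j lam_j y_rj <= -y_ro
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(inputs.shape[1]), -outputs[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

for o in range(inputs.shape[0]):
    print(f"university {o}: efficiency {ccr_efficiency(o):.3f}")
```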