985 results for partial discharge measurement
Abstract:
From 1974 to 1982, repeated tracer tests using fluorescent dyes were carried out in the highly glaciated drainage basin of Vernagtbach. These tests enabled the quantitative determination of the runoff in the forefield of the Vernagtferner, the calculation of travel times of the stream water, and estimates of the relative contributions of individual streams to the total runoff. In addition, tracer tests were carried out in the firn area of the glacier, yielding data on the storage and travel time of meltwater inside the glacier.
Abstract:
When we study variables that affect survival time, we usually estimate their effects with the Cox regression model. In biomedical research, the effects of covariates are often modified by a biomarker variable, which leads to covariate-biomarker interactions. Here, the biomarker is an objective measurement of patient characteristics at baseline. Liu et al. (2015) built a local partial likelihood bootstrap model to estimate and test this interaction effect between covariates and the biomarker, but the R code developed by Liu et al. (2015) can only handle one variable and one interaction term and cannot fit the model with adjustment for nuisance variables. In this project, we expand the model to allow adjustment for nuisance variables, extend the R code to accept any chosen interaction terms, and expose many parameters that users can customize for their research. We also build an R package called "lplb" that integrates the complex computations into a simple interface. We conduct numerical simulations to show that the new method has excellent finite-sample properties under both the null and alternative hypotheses. We also apply the method to analyze data from a prostate cancer clinical trial with the acid phosphatase (AP) biomarker.
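The covariate-biomarker interaction described above can be illustrated, outside the "lplb" package itself, with an ordinary Cox fit in which the interaction enters as a product term. The sketch below uses Python with the lifelines library (not the authors' R code) and entirely simulated data with hypothetical column names; lplb instead estimates this effect locally along the biomarker and tests it with a bootstrap.

    # Minimal sketch (not the "lplb" package): a Cox model with an explicit
    # covariate-by-biomarker interaction, fitted with the lifelines library.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "treatment": rng.integers(0, 2, n),      # covariate of interest
        "biomarker": rng.normal(size=n),         # baseline biomarker (e.g. AP)
        "age": rng.normal(65, 10, n),            # nuisance variable
    })
    # Simulated event times whose treatment effect depends on the biomarker.
    hazard = np.exp(0.3 * df["treatment"] + 0.5 * df["treatment"] * df["biomarker"])
    df["time"] = rng.exponential(1.0 / hazard)
    df["event"] = 1                              # no censoring in this toy example

    # Encode the interaction as an extra column and fit a standard Cox model.
    df["treatment_x_biomarker"] = df["treatment"] * df["biomarker"]
    cph = CoxPHFitter()
    cph.fit(df, duration_col="time", event_col="event")
    cph.print_summary()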
Abstract:
The growing interest in quantifying the cultural and creative industries and in making visible the economic contribution of culture-related activities demands, first of all, the construction of internationally comparable frameworks of analysis. Currently there are three major frameworks addressing this issue, and their comparative study is the focus of this article: the UNESCO Framework for Cultural Statistics (FCS-2009), the European Framework for Cultural Statistics (ESSnet-Culture 2012) and the methodological resource of the “Convenio Andrés Bello” group for working with the Satellite Accounts on Culture in Ibero-America (CAB-2015). Cultural sector measurements provide the information necessary for the correct planning of cultural policies, which in turn sustains these industries and promotes cultural diversity. The text identifies the differences among the three models at three levels of analysis: the sectors, the cultural activities, and the criteria each one uses to determine the distribution of activities by sector. The end result is that cultural statistics from countries that implement different frameworks cannot be compared.
Abstract:
The widespread efforts to incorporate the economic values of oceans into national income accounts have reached a stage where coordination of national efforts is desirable. A symposium held in 2015 began this process by bringing together representatives from ten countries. The symposium concluded that a definition of core ocean industries is possible, but beyond that core the definition of ocean industries remains in flux. Better coordination of ocean income accounts will require addressing issues of aggregation, geography, partial ocean industries, confidentiality, and imputation. Beyond the standard national income accounts, a need was identified to incorporate environmental resource and ecosystem service values in order to gain a complete picture of the economic role of the oceans. The U.N. System of Environmental and Economic Accounts and the Experimental Ecosystem Service Accounts provide frameworks for this expansion. This will require the development of physical accounts of environmental assets linked to the economic accounts, as well as the adaptation of transaction- and welfare-based economic valuation methods to environmental resources and ecosystem services. The future development of ocean economic data is most likely to require cooperative efforts to develop metadata standards and to use the multiple platforms of opportunity created by policy analysis, economic development, and conservation projects, both to collect new economic data and to sustain ocean economy data collection into the future by building capacity in economic data collection and use.
Abstract:
The partial collapse of a building in Colombia caused severe damage to its structural components. An implosion was carried out to induce the collapse of 50% of the deteriorated building. To evaluate the influence of the implosion on the remaining structure, a monitoring survey was performed using triaxial accelerometers. Time signals associated with ambient, seismic and forced vibration were obtained, and the records were studied in both the time and the frequency domain. The analysis of this information allowed the determination of structural properties that were useful for calibrating the analytical model of the structure.
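The frequency-domain analysis mentioned above typically amounts to estimating the structure's dominant (modal) frequencies from the vibration records. The sketch below shows that step for an assumed plain-text record with three acceleration channels and a hypothetical sampling rate; it illustrates the idea rather than the processing actually used in the study.

    # Estimate dominant structural frequencies from a triaxial accelerometer
    # record using Welch's method. File name, sampling rate and channel layout
    # are assumptions for illustration only.
    import numpy as np
    from scipy.signal import welch

    fs = 200.0                                           # assumed sampling rate [Hz]
    acc = np.loadtxt("ambient_vibration_record.csv", delimiter=",")  # (n, 3) array

    for axis, label in enumerate("xyz"):
        f, pxx = welch(acc[:, axis], fs=fs, nperseg=4096)
        peak = f[np.argmax(pxx)]                         # frequency of the strongest peak
        print(f"{label}-axis dominant frequency: {peak:.2f} Hz")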
Abstract:
An experimental and numerical study of turbulent fire suppression is presented. For this work, a novel and canonical facility has been developed, featuring a buoyant, turbulent, methane or propane-fueled diffusion flame suppressed via either nitrogen dilution of the oxidizer or application of a fine water mist. Flames are stabilized on a slot burner surrounded by a co-flowing oxidizer, which allows controlled delivery of either suppressant to achieve a range of conditions from complete combustion through partial and total flame quenching. A minimal supply of pure oxygen is optionally applied along the burner to provide a strengthened flame base that resists liftoff extinction and permits the study of substantially weakened turbulent flames. The carefully designed facility features well-characterized inlet and boundary conditions that are especially amenable to numerical simulation. Non-intrusive diagnostics provide detailed measurements of suppression behavior, yielding insight into the governing suppression processes, and aiding the development and validation of advanced suppression models. Diagnostics include oxidizer composition analysis to determine suppression potential, flame imaging to quantify visible flame structure, luminous and radiative emissions measurements to assess sooting propensity and heat losses, and species-based calorimetry to evaluate global heat release and combustion efficiency. The studied flames experience notable suppression effects, including transition in color from bright yellow to dim blue, expansion in flame height and structural intermittency, and reduction in radiative heat emissions. Still, measurements indicate that the combustion efficiency remains close to unity, and only near the extinction limit do the flames experience an abrupt transition from nearly complete combustion to total extinguishment. Measurements are compared with large eddy simulation results obtained using the Fire Dynamics Simulator, an open-source computational fluid dynamics software package. Comparisons of experimental and simulated results are used to evaluate the performance of available models in predicting fire suppression. Simulations in the present configuration highlight the issue of spurious reignition that is permitted by the classical eddy-dissipation concept for modeling turbulent combustion. To address this issue, simple treatments to prevent spurious reignition are developed and implemented. Simulations incorporating these treatments are shown to produce excellent agreement with the experimentally measured data, including the global combustion efficiency.
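One common form of the species-based calorimetry mentioned above is oxygen-consumption calorimetry: heat release is nearly proportional to the mass of oxygen consumed (roughly 13.1 MJ per kg of O2), so the global combustion efficiency can be expressed as the measured heat release divided by the ideal heat release of the supplied fuel. The sketch below illustrates that bookkeeping with made-up flow rates for a methane flame; it is not the facility's actual data reduction.

    # Rough sketch of combustion efficiency from oxygen-consumption calorimetry.
    # The constants are standard reference values; the flow rates are illustrative.
    E_O2 = 13.1e3          # kJ released per kg of O2 consumed (Huggett's constant)
    DH_C_METHANE = 50.0e3  # lower heating value of methane [kJ/kg]

    def combustion_efficiency(m_dot_fuel, m_dot_o2_consumed):
        """Measured heat release (from O2 consumption) divided by the ideal
        heat release if all supplied fuel burned completely."""
        q_measured = E_O2 * m_dot_o2_consumed        # kW
        q_ideal = DH_C_METHANE * m_dot_fuel          # kW
        return q_measured / q_ideal

    # Example: 1 g/s of methane with 3.8 g/s of oxygen consumed gives ~0.995.
    print(combustion_efficiency(1.0e-3, 3.8e-3))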
Abstract:
Background: Post-discharge mortality is a frequent but poorly recognized contributor to child mortality in resource-limited countries. The identification of children at high risk of post-discharge mortality is a critically important first step in addressing this problem. Objectives: The objective of this project was to determine the variables most likely to be associated with post-discharge mortality, to be included in a prediction modelling study. Methods: A two-round modified Delphi process was completed for the review of a priori selected variables and the selection of new variables. Variables were evaluated on relevance according to (1) prediction, (2) availability, (3) cost, and (4) time required for measurement. Participants included experts in a variety of relevant fields. Results: During the first round of the modified Delphi process, 23 experts evaluated 17 variables. Forty further variables were suggested and were reviewed during the second round by 12 experts. During the second round, 16 additional variables were evaluated. Thirty unique variables were compiled for use in the prediction modelling study. Conclusion: A systematic approach was used to generate an optimal list of candidate predictor variables for incorporation into a study on the prediction of pediatric post-discharge mortality in a resource-poor setting.
Abstract:
Fluvial sediment transport is controlled by hydraulics, sediment properties and arrangement, and flow history across a range of time scales. This physical complexity has led to ambiguous definition of the reference frame (Lagrangian or Eulerian) in which sediment transport is analysed. A general Eulerian-Lagrangian approach accounts for the inertial characteristics of particles in a Lagrangian (particle-fixed) frame, and for the hydrodynamics in an independent Eulerian frame. The necessary Eulerian-Lagrangian transformations are simplified under the assumption of an ideal Inertial Measurement Unit (IMU), rigidly attached at the centre of mass of a sediment particle. Real, commercially available IMU sensors can provide high-frequency data on the accelerations and angular velocities (hence forces and energy) experienced by grains during entrainment and motion, if adequately customized. IMUs are subject to significant error accumulation, but they can be used for statistical parametrisation of an Eulerian-Lagrangian model for coarse sediment particles and over the temporal scale of individual entrainment events. In this thesis an Eulerian-Lagrangian model is introduced and evaluated experimentally. Absolute inertial accelerations were recorded at a 4 Hz frequency from a spherical instrumented particle (111 mm diameter and 2383 kg/m3 density) in a series of entrainment threshold experiments on a fixed idealised bed. The grain-top inertial acceleration entrainment threshold was approximated at 44 and 51 mg for slopes of 0.026 and 0.037 respectively. The saddle inertial acceleration entrainment threshold was at 32 and 25 mg for slopes of 0.044 and 0.057 respectively. For the evaluation of the complete Eulerian-Lagrangian model two prototype sensors are presented: an idealised (spherical) one with a diameter of 90 mm and an ellipsoidal one with axes of 100, 70 and 30 mm. Both are instrumented with a complete IMU, capable of sampling 3D inertial accelerations and 3D angular velocities at 50 Hz. After signal analysis, the results can be used to parametrize sediment movement, but they do not contain positional information. The two sensors (spherical and ellipsoidal) were tested in a series of entrainment experiments, similar to the evaluation of the 111 mm prototype, for a slope of 0.02. The spherical sensor entrained at discharges of 24.8 ± 1.8 l/s, while the same threshold for the ellipsoidal sensor was 45.2 ± 2.2 l/s. Kinetic energy calculations were used to quantify the particle-bed energy exchange under fluvial (discharge at 30 l/s) and non-fluvial conditions. All the experiments suggest that the effect of the inertial characteristics of coarse sediments on their motion is comparable to the effect of hydrodynamic forces. The coupling of IMU sensors with advanced telemetric systems can lead to the tracking of Lagrangian particle trajectories at a frequency and accuracy that will permit the testing of diffusion/dispersion models across the range of particle diameters.
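The kinetic-energy calculation in the abstract combines translational energy, from velocities obtained by integrating the gravity-compensated inertial accelerations, with rotational energy from the measured angular velocities and the sphere's moment of inertia. The sketch below shows these relations using the 111 mm particle's size and density for the mass and the 50 Hz rate of the later prototypes, with a deliberately crude integration that ignores the drift correction the text warns about; it is an illustration, not the thesis's processing chain.

    # Kinetic energy of a spherical instrumented particle from IMU data:
    # E = 1/2 m v^2 + 1/2 I w^2, with I = 2/5 m r^2 for a solid sphere.
    import numpy as np

    RHO, D = 2383.0, 0.111                        # density [kg/m^3], diameter [m]
    M = RHO * (4.0 / 3.0) * np.pi * (D / 2) ** 3  # particle mass [kg]
    I = 0.4 * M * (D / 2) ** 2                    # moment of inertia [kg m^2]
    FS = 50.0                                     # assumed IMU sampling rate [Hz]

    def kinetic_energy(acc, omega):
        """acc: (n, 3) linear accelerations [m/s^2] with gravity removed;
        omega: (n, 3) angular velocities [rad/s]. Returns energy per sample [J].
        The cumulative-sum integration drifts in practice and is only a sketch."""
        vel = np.cumsum(acc, axis=0) / FS
        e_trans = 0.5 * M * np.sum(vel ** 2, axis=1)
        e_rot = 0.5 * I * np.sum(omega ** 2, axis=1)
        return e_trans + e_rot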
Abstract:
A range of influences, technical and organizational, has encouraged the widespread adoption of Enterprise Systems (ES). Nevertheless, there is a growing consensus that Enterprise Systems have in many cases failed to provide the expected benefits to organizations. This paper presents ongoing research that analyzes the benefits realization approach of the Queensland Government, an approach that applies a modified Balanced Scorecard. First, the history and background of the Queensland Government’s Enterprise Systems initiative are introduced. Second, the most common reasons for ES underperformance are reviewed. Third, relevant performance measurement models, and the Balanced Scorecard in particular, are discussed. Finally, the Queensland Government initiative is evaluated in light of this overview of current work in the area. In current and future work, the authors aim to use their active involvement in the Queensland Government’s benefits realization initiative for an Action Research based project investigating the appropriateness of the Balanced Scorecard for the purposes of Enterprise Systems benefits realization.