942 results for Variable Sampling Interval Control Charts
Abstract:
This study approaches urban history through the control of individuals. Taking the modernization process carried out in Cartagena between 1903 and 1927, it examines the ways in which so-called vagrant subjects were defined, located, and subjected to attempts at discipline. Particular attention is therefore paid to the dynamics of the city's modernization in those aspects that involved the government of the population. The first chapter, departing from the early historiography's view that the modernizing developments revolved around the results of the actions of former president Rafael Núñez, gives an account of the formation of the surveillance regime inaugurated to guarantee the sanitation of the port. Beyond the spatial transformations the city underwent, this section focuses on the creation of the sanitary, maritime and land police: the surveillance and control body charged with suppressing disorder in the city, turning first to the control of alcoholism and prostitution. The second chapter emphasizes one variable deployed in the attempts to modernize the city: a kind of practical hygienism that we call social hygienism. Here attention centres on the definition, location and numerical estimation of vagrant subjects. Finally, the third chapter presents the disciplinary measures that the project of a modern city entailed. This part attaches special importance both to police mechanisms and to the national measures of confinement in penal and agricultural colonies implemented to turn vagrant subjects into beings that were productive and functional for the intended modern order. It concludes with the problems encountered in realizing this project.
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one with a balanced four-stage design, and two with unbalanced designs to economize where there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
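As a rough illustration of the accumulation step this abstract describes (not the paper's Fortran or REML code), the sketch below turns variance components from a hypothetical balanced four-stage nested design into a first-approximation variogram; the stage spacings and component values are invented for the example.

```python
# Sketch: accumulate variance components from a balanced nested design
# into a rough variogram. Spacings and components are illustrative.

# Separating distances per stage, in metres, in geometric progression
# (assumption: a four-stage design, coarsest stage first).
spacings = [600.0, 190.0, 60.0, 19.0]
components = [0.92, 0.54, 0.31, 0.18]   # variance component per stage

def rough_variogram(spacings, components):
    """Semivariance at the lag of stage k is the sum of the components
    of stage k and all finer stages (accumulating from the shortest lag)."""
    pairs = []
    for k in range(len(spacings)):
        gamma = sum(components[k:])     # this stage plus all finer stages
        pairs.append((spacings[k], gamma))
    return sorted(pairs)                # ascending lag distance

for lag, gamma in rough_variogram(spacings, components):
    print(f"lag {lag:6.1f} m  ->  semivariance {gamma:.2f}")
```

The semivariance rises with lag because each coarser stage contributes an extra component, which is exactly the "rough variogram for modest effort" the abstract mentions.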
Abstract:
It has been generally accepted that the method of moments (MoM) variogram, which has been widely applied in soil science, requires about 100 sites at an appropriate interval apart to describe the variation adequately. This sample size is often larger than can be afforded for soil surveys of agricultural fields or contaminated sites. Furthermore, it might be a much larger sample size than is needed where the scale of variation is large. A possible alternative in such situations is the residual maximum likelihood (REML) variogram, because fewer data appear to be required. The REML method is parametric and is considered reliable where there is trend in the data because it is based on generalized increments that filter trend out, and only the covariance parameters are estimated. Previous research has suggested that fewer data are needed to compute a reliable variogram using a maximum likelihood approach such as REML; however, the results can vary according to the nature of the spatial variation. There remain issues to examine: how many fewer data can be used, how should the sampling sites be distributed over the site of interest, and how do different degrees of spatial variation affect the data requirements? The soil of four field sites of different size, physiography, parent material and soil type was sampled intensively, and MoM and REML variograms were calculated for clay content. The data were then sub-sampled to give different sample sizes and distributions of sites, and the variograms were computed again. The model parameters for the sets of variograms for each site were used for cross-validation. Predictions based on REML variograms were generally more accurate than those from MoM variograms with fewer than 100 sampling sites. A sample size of around 50 sites at an appropriate distance apart, possibly determined from variograms of ancillary data, appears adequate to compute REML variograms for kriging soil properties for precision agriculture and contaminated sites. (C) 2007 Elsevier B.V. All rights reserved.
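For comparison, a minimal numpy sketch of the method-of-moments (Matheron) estimator that this abstract contrasts with REML; the coordinates and clay values are synthetic, and the lag bins are arbitrary.

```python
import numpy as np

# Method-of-moments (Matheron) variogram: average half-squared
# differences over site pairs in each lag bin. Data are synthetic.

rng = np.random.default_rng(42)
coords = rng.uniform(0, 100, size=(60, 2))   # 60 sites in a 100 m square
clay = rng.normal(25, 4, size=60)            # illustrative clay content (%)

def mom_variogram(coords, values, bin_edges):
    """Average 0.5*(z_i - z_j)^2 over site pairs falling in each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    iu = np.triu_indices(len(values), k=1)   # each pair counted once
    dist = d[iu]
    sq = 0.5 * (values[iu[0]] - values[iu[1]]) ** 2
    gammas = []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        mask = (dist >= lo) & (dist < hi)
        gammas.append(sq[mask].mean() if mask.any() else np.nan)
    return np.asarray(gammas)

gamma = mom_variogram(coords, clay, bin_edges=np.arange(0, 60, 10))
print(np.round(gamma, 2))
```

With only 60 sites some bins hold few pairs, which is the instability that motivates either ~100 sites for MoM or the switch to REML discussed above.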
Abstract:
Long-term monitoring of forest soils as part of a pan-European network to detect environmental change depends on an accurate determination of the mean of the soil properties at each monitoring event. Forest soil is known to be very variable spatially, however. A study was undertaken to explore and quantify this variability at three forest monitoring plots in Britain. Detailed soil sampling was carried out, and the data from the chemical analyses were analysed by classical statistics and geostatistics. An analysis of variance showed that there were no consistent effects from the sample sites in relation to the position of the trees. The variogram analysis showed that there was spatial dependence at each site for several variables and some varied in an apparently periodic way. An optimal sampling analysis based on the multivariate variogram for each site suggested that a bulked sample from 36 cores would reduce error to an acceptable level. Future sampling should be designed so that it neither targets nor avoids trees and disturbed ground. This can be achieved best by using a stratified random sampling design.
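The bulking recommendation rests on simple standard-error arithmetic: the mean of n independent cores has standard error sigma/sqrt(n). The between-core standard deviation and tolerable error below are illustrative values chosen so the answer matches the abstract's 36 cores; they are not figures from the study.

```python
import math

# How many cores to bulk so that the standard error of the plot mean
# falls below a tolerance: smallest n with sigma/sqrt(n) <= tolerance.

def cores_needed(sigma, tolerable_se):
    """Smallest n satisfying sigma / sqrt(n) <= tolerable_se."""
    return math.ceil((sigma / tolerable_se) ** 2)

sigma = 3.0        # between-core standard deviation (illustrative units)
tolerable = 0.5    # acceptable standard error of the plot mean
n = cores_needed(sigma, tolerable)
print(n)           # -> 36
```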
Abstract:
Asynchronous Optical Sampling (ASOPS) [1,2] and frequency comb spectrometry [3] based on dual Ti:sapphire resonators operated in a master/slave mode have the potential to improve the signal-to-noise ratio in THz transient and IR spectrometry. The multimode Brownian oscillator time-domain response function described by state-space models is a mathematically robust framework that can be used to describe the dispersive phenomena governed by Lorentzian, Debye and Drude responses. In addition, the optical properties of an arbitrary medium can be expressed as a linear combination of simple multimode Brownian oscillator functions. The suitability of a range of signal processing schemes adopted from the Systems Identification and Control Theory community for further processing the recorded THz transients in the time or frequency domain will be outlined [4,5]. Since a femtosecond duration pulse is capable of persistent excitation of the medium within which it propagates, such an approach is perfectly justifiable. Several de-noising routines based on system identification will be shown. Furthermore, specifically developed apodization structures will be discussed; these are necessary because, owing to dispersion, the time-domain background and sample interferograms are asymmetrical [6-8]. These procedures can lead to a more precise estimation of the complex insertion loss function. The algorithms are applicable to femtosecond spectroscopies across the EM spectrum. Finally, a methodology for femtosecond pulse shaping using genetic algorithms, aiming to map and control molecular relaxation processes, will be mentioned.
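As a small illustration of the idea of expressing an optical response as a linear combination of simple oscillator terms, the sketch below sums a Lorentzian and a Debye contribution to a complex permittivity; all parameters are illustrative, not fitted values.

```python
import numpy as np

# Complex permittivity built as a background constant plus a Lorentzian
# and a Debye term. All parameter values are illustrative.

def lorentzian(w, dchi, w0, gamma):
    """Lorentzian oscillator contribution to the susceptibility."""
    return dchi * w0 ** 2 / (w0 ** 2 - w ** 2 - 1j * gamma * w)

def debye(w, dchi, tau):
    """Debye relaxation contribution to the susceptibility."""
    return dchi / (1.0 + 1j * w * tau)

w = np.linspace(0.1, 10.0, 500)          # angular frequency, arbitrary units
eps = 2.25 + lorentzian(w, 1.0, 5.0, 0.5) + debye(w, 0.5, 1.0)
print(np.round(eps[:3], 3))
```

At low frequency both terms contribute their full strength, so the real permittivity approaches 2.25 + 1.0 + 0.5; near the Lorentzian resonance the response is dominated by that single term.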
Abstract:
Twenty-eight field experiments on sandy-loam soils in the UK (1982-2003) are reviewed by relating the extension of the green area duration of the flag leaf (GLADF) by fungicides to effects on yield and quality of winter wheat. Over all experiments mean grain yield was 8.85 t ha(-1) at 85% DM. With regard to quality, mean values were: thousand grain weight (TGW) = 44.5 g; specific weight (SWT) = 76.9 kg hl(-1); crude protein concentration (CP (N x 5.7)) = 12.5% DM; Hagberg falling number (HFN) = 285 s; and sodium dodecyl sulphate (SDS)-sedimentation volume = 69 ml. For each day (d) that fungicides increased GLADF there were associated average increases in yield (0.144 t ha(-1) d(-1), se 0.0049, df = 333), TGW (0.56 g d(-1), se 0.017) and SWT (0.22 kg hl(-1) d(-1), se 0.011). Some curvature was evident in all these relationships. When GLADF was delayed beyond 700 degrees Cd after anthesis, as was possible in cool wet seasons, responses were curtailed or less reliable. Despite this apparent terminal sink limitation, fungicide effects on sink size, e.g. endosperm cell numbers or maximum water mass per grain, were not prerequisites for large effects on grain yield, TGW or SWT. Fungicide effects on CP were variable. Although the average response of CP was negative (-0.029% DM d(-1), se 0.00338), this depended on cultivar and the disease controlled. Controlling biotrophs such as rusts (Puccinia spp.) tended to increase CP, whereas controlling a more necrotrophic pathogen (Septoria tritici) usually reduced CP. Irrespective of the pathogen controlled, delaying senescence of the flag leaf was associated with increased nitrogen yields in the grain (averaging 2.24 kg N ha(-1) d(-1), se 0.0848), due both to increased N uptake into the above-ground crop and to more efficient remobilisation of N from leaf laminas. When sulphur availability appeared to be adequate, fungicide x cultivar interactions were similar for S as for CP, although N:S ratios tended to decline (i.e. improve for bread making) when S. tritici was controlled. On average, SDS-sedimentation volume declined (-0.18 ml d(-1), se 0.027) with increased GLADF, broadly commensurate with the average effect on CP. Hagberg falling number decreased as fungicide increased GLADF (-2.73 s d(-1), se 0.178), indicating an increase in alpha-amylase activity.
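A back-of-envelope reading of the mean per-day responses quoted in this abstract: multiplying each slope by a hypothetical 10-day extension of GLADF. The 10-day scenario is illustrative, not a reported trial.

```python
# Mean per-day responses to extended flag-leaf green area duration
# (values from the abstract); the 10-day extension is hypothetical.

per_day = {
    "yield_t_ha": 0.144,    # t ha^-1 per day of GLADF
    "TGW_g": 0.56,          # thousand grain weight, g per day
    "SWT_kg_hl": 0.22,      # specific weight, kg hl^-1 per day
    "CP_pct_DM": -0.029,    # crude protein, % DM per day (declines)
}
extension_days = 10
for name, slope in per_day.items():
    print(f"{name}: {slope * extension_days:+.2f} over {extension_days} d")
```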
Abstract:
Objective: To determine the risk of lung cancer associated with exposure at home to the radioactive disintegration products of naturally occurring radon gas. Design: Collaborative analysis of individual data from 13 case-control studies of residential radon and lung cancer. Setting: Nine European countries. Subjects: 7148 cases of lung cancer and 14 208 controls. Main outcome measures: Relative risks of lung cancer and radon gas concentrations in homes inhabited during the previous 5-34 years, measured in becquerels (radon disintegrations per second) per cubic metre (Bq/m(3)) of household air. Results: The mean measured radon concentration in homes of people in the control group was 97 Bq/m(3), with 11% measuring > 200 and 4% measuring > 400 Bq/m(3). For cases of lung cancer the mean concentration was 104 Bq/m(3). The risk of lung cancer increased by 8.4% (95% confidence interval 3.0% to 15.8%) per 100 Bq/m(3) increase in measured radon (P = 0.0007). This corresponds to an increase of 16% (5% to 31%) per 100 Bq/m(3) increase in usual radon, that is, after correction for the dilution caused by random uncertainties in measuring radon concentrations. The dose-response relation seemed to be linear with no threshold and remained significant (P = 0.04) in analyses limited to individuals from homes with measured radon < 200 Bq/m(3). The proportionate excess risk did not differ significantly with study, age, sex, or smoking. In the absence of other causes of death, the absolute risks of lung cancer by age 75 years at usual radon concentrations of 0, 100, and 400 Bq/m(3) would be about 0.4%, 0.5%, and 0.7%, respectively, for lifelong non-smokers, and about 25 times greater (10%, 12%, and 16%) for cigarette smokers. Conclusions: Collectively, though not separately, these studies show appreciable hazards from residential radon, particularly for smokers and recent ex-smokers, and indicate that it is responsible for about 2% of all deaths from cancer in Europe.
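The linear no-threshold relation reported here is easy to apply directly; the sketch below reproduces the abstract's round absolute-risk figures for lifelong non-smokers from the 16%-per-100-Bq/m(3) "usual radon" coefficient.

```python
# Linear no-threshold excess relative risk: 8.4% per 100 Bq/m^3 of
# measured radon, or 16% per 100 Bq/m^3 of "usual" radon after
# measurement-error correction (coefficients from the abstract).

def relative_risk(concentration_bq_m3, per_100=0.084):
    """Relative risk under a linear dose-response with no threshold."""
    return 1.0 + per_100 * concentration_bq_m3 / 100.0

# Absolute risk by age 75 = baseline (0 Bq/m^3) risk times relative risk.
baseline_nonsmoker = 0.004   # 0.4% for lifelong non-smokers (abstract)
for c in (0, 100, 400):
    rr = relative_risk(c, per_100=0.16)   # "usual radon" coefficient
    print(f"{c:3d} Bq/m3: RR = {rr:.2f}, absolute risk ~ {baseline_nonsmoker * rr:.4f}")
```

At 100 and 400 Bq/m(3) this gives roughly 0.5% and 0.7%, matching the figures quoted for non-smokers.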
Abstract:
An adaptive tuned vibration absorber (ATVA) with a smart variable stiffness element is capable of retuning itself in response to a time-varying excitation frequency, enabling effective vibration control over a range of frequencies. This paper discusses novel methods of achieving variable stiffness in an ATVA by changing shape, as inspired by biological paradigms. It is shown that considerable variation in the tuned frequency can be achieved by actuating a shape change, provided that this is within the limits of the actuator. A feasible design for such an ATVA is one in which the device offers low resistance to the required shape-change actuation while not being restricted to low values of the effective stiffness of the vibration absorber. Three such original designs are identified: (i) a pinned-pinned arch beam with a fixed profile of slight curvature and variable preload through an adjustable natural curvature; (ii) a vibration absorber with a stiffness element formed from parallel curved beams of adjustable curvature vibrating longitudinally; (iii) a vibration absorber with a variable-geometry linkage as the stiffness element. The experimental results from demonstrators based on two of these designs show good correlation with the theory.
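The retuning principle follows from the single-degree-of-freedom absorber formula f = sqrt(k/m)/(2*pi): varying the stiffness shifts the tuned frequency. The mass and stiffness range below are assumed for illustration and are not the paper's values.

```python
import math

# Tuned frequency of a mass-spring absorber as its stiffness element
# is varied, e.g. by a shape change. Mass and stiffnesses are assumed.

def tuned_frequency(k, m):
    """Natural frequency (Hz) of a single-DOF absorber: sqrt(k/m)/(2*pi)."""
    return math.sqrt(k / m) / (2.0 * math.pi)

m = 0.1                          # absorber mass, kg (illustrative)
for k in (1e3, 4e3, 1.6e4):      # stiffness varied by the shape change
    print(f"k = {k:8.0f} N/m -> tuned at {tuned_frequency(k, m):6.1f} Hz")
```

Quadrupling the stiffness doubles the tuned frequency, which is why a modest shape-induced stiffness range already gives a useful tuning band.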
Abstract:
Eye movements have long been considered a problem when trying to understand the visual control of locomotion. They transform the retinal image from a simple expanding pattern of moving texture elements (pure optic flow) into a complex combination of translation and rotation components (retinal flow). In this article we investigate whether there are measurable advantages to having an active free gaze, over a static gaze or tracking gaze, when steering along a winding path. We also examine patterns of free-gaze behavior to determine preferred gaze strategies during active locomotion. Participants were asked to steer along a computer-simulated textured roadway with free gaze, fixed gaze, or gaze tracking the center of the roadway. Deviation of position from the center of the road was recorded along with their point of gaze. It was found that visually tracking the middle of the road produced smaller steering errors than fixed gaze. Participants performed best at the steering task when allowed to sample naturally from the road ahead with free gaze. There was some variation in the gaze strategies used, but sampling was predominantly of areas proximal to the center of the road. These results diverge from traditional models of flow analysis.
Abstract:
This paper describes the SIMULINK implementation of a constrained predictive control algorithm based on quadratic programming and linear state space models, and its application to a laboratory-scale 3D crane system. The algorithm is compatible with Real-Time Windows Target and, in the case of the crane system, it can be executed with a sampling period of 0.01 s and a prediction horizon of up to 300 samples, using a linear state space model with 3 inputs, 5 outputs and 13 states.
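A sketch of the condensed constrained-MPC formulation behind such an implementation, written in Python rather than SIMULINK: the plant here is a double integrator (not the paper's 13-state crane model), and the box-constrained QP is solved by a simple projected-gradient loop instead of a production QP solver.

```python
import numpy as np

# Condensed MPC: stack predictions Y = F x0 + G U over the horizon and
# minimize ||Y - r||^2 + rho*||U||^2 subject to |u_k| <= umax.
# Plant, horizon and weights are illustrative.

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete double integrator, Ts = 0.1 s
B = np.array([[0.005], [0.1]])
C = np.array([[1.0, 0.0]])
N, rho, umax = 20, 0.01, 1.0             # horizon, input weight, input bound

F = np.vstack([C @ np.linalg.matrix_power(A, i + 1) for i in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B).item()

def mpc_control(x0, r):
    """First input of the box-constrained QP solution (projected gradient)."""
    H = 2.0 * (G.T @ G + rho * np.eye(N))         # QP Hessian
    f = 2.0 * G.T @ (F @ x0 - r)                  # QP linear term
    step = 1.0 / np.linalg.norm(H, 2)             # 1 / Lipschitz constant
    U = np.zeros(N)
    for _ in range(2000):
        U = np.clip(U - step * (H @ U + f), -umax, umax)
    return U[0]

x0 = np.array([0.0, 0.0])
r = np.ones(N)                                    # step reference on position
u0 = mpc_control(x0, r)
print(round(u0, 3))
```

In a real-time setting this optimization is re-solved every sampling period with the measured state as x0, applying only the first input each time.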
Abstract:
National food control systems are a key element in the protection of consumers from unsafe foods and from other fraudulent practices. International guidance is available and provides a framework for enhancing national systems. However, it is recognized that before reaching decisions on the necessary improvements to a national system, an analysis is required of the current state of key elements in the present system. This paper provides such an analysis for the State of Kuwait. The fragmented nature of the food control system is described. Four key elements of the Kuwaiti system are analyzed: the legal framework, the administrative structures, the enforcement activity and the provision of education and training. It is noted that the country has a dependence on imported foods and that the present national food control system is largely based on an historic approach to food sampling at the point of import and is unsustainable. The paper recommends a more coordinated approach to food safety control in Kuwait with a significant increase in the use of risk analysis methods to target enforcement.
Abstract:
An input variable selection procedure is introduced for the identification and construction of multi-input multi-output (MIMO) neurofuzzy operating point dependent models. The algorithm is an extension of a forward modified Gram-Schmidt orthogonal least squares procedure for a linear model structure which is modified to accommodate nonlinear system modeling by incorporating piecewise locally linear model fitting. The proposed input nodes selection procedure effectively tackles the problem of the curse of dimensionality associated with lattice-based modeling algorithms such as radial basis function neurofuzzy networks, enabling the resulting neurofuzzy operating point dependent model to be widely applied in control and estimation. Some numerical examples are given to demonstrate the effectiveness of the proposed construction algorithm.
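The linear core of such a procedure, forward orthogonal least squares with Gram-Schmidt orthogonalization, can be sketched as follows; the data are synthetic, with columns 1 and 4 as the true inputs (this is the classical forward-OLS step, not the paper's full neurofuzzy construction).

```python
import numpy as np

# Forward orthogonal least squares: greedily pick the candidate regressor
# whose component, orthogonalized against those already chosen, explains
# the most output variance. Synthetic data; true inputs are columns 1, 4.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))                    # 6 candidate input variables
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + 0.1 * rng.normal(size=200)

def forward_ols(X, y, n_select):
    """Indices of regressors chosen in order of error reduction."""
    selected, Q = [], []
    for _ in range(n_select):
        best, best_err = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            q = X[:, j].copy()
            for qk in Q:                         # Gram-Schmidt vs chosen ones
                q -= (qk @ q) / (qk @ qk) * qk
            err = (q @ y) ** 2 / (q @ q)         # variance of y explained
            if err > best_err:
                best, best_err = j, err
        selected.append(best)
        q = X[:, best].copy()
        for qk in Q:
            q -= (qk @ q) / (qk @ qk) * qk
        Q.append(q)
    return selected

print(forward_ols(X, y, 2))   # should recover columns 1 and 4
```

Because each candidate is scored only through its orthogonal component, the search avoids re-fitting full models at every step, which is what makes the forward procedure cheap enough to fight the curse of dimensionality in lattice-based models.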
Abstract:
The purpose of this paper is to design a control law for continuous systems with Boolean inputs that allows the output to track a desired trajectory. Such systems are controlled by switching devices. Systems of this type, with Boolean inputs, have found increasing use in the electrical industry. Power supplies include such systems, and a power converter is one example. For instance, in power electronics the control variable is the switching OFF and ON of components such as thyristors or transistors. In this paper, a method is proposed for designing a state-space control law for such systems. This approach is implemented in simulation for the control of an electronic circuit.
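A minimal sketch of the idea of steering a continuous system through a Boolean input: a first-order RC stage whose input can only be 0 V or the supply voltage, switched by a simple comparator rule. The circuit values are illustrative, and the comparator law stands in for the paper's state-space design method.

```python
# On/off (Boolean-input) control of a continuous first-order system:
# an RC stage driven by either 0 V or V_s. Values are illustrative.

def simulate(v_ref, v_supply=10.0, tau=0.5, dt=0.001, t_end=3.0):
    """Relay (comparator) switching: input ON while output is below v_ref."""
    v, trace = 0.0, []
    for _ in range(int(t_end / dt)):
        u = v_supply if v < v_ref else 0.0   # the only admissible inputs
        v += dt * (u - v) / tau              # RC first-order dynamics
        trace.append(v)
    return trace

trace = simulate(v_ref=5.0)
print(round(trace[-1], 2))   # chatters in a narrow band about the 5 V setpoint
```

The output cannot settle exactly: it chatters within a band set by the switching rate, which is why practical designs regulate the ON/OFF duty cycle rather than switching on a raw comparison.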
Abstract:
A technique is derived for solving a non-linear optimal control problem by iterating on a sequence of simplified problems in linear quadratic form. The technique is designed to achieve the correct solution of the original non-linear optimal control problem in spite of these simplifications. A mixed approach with a discrete performance index and continuous state variable system description is used as the basis of the design, and it is shown how the global problem can be decomposed into local sub-system problems and a co-ordinator within a hierarchical framework. An analysis of the optimality and convergence properties of the algorithm is presented and the effectiveness of the technique is demonstrated using a simulation example with a non-separable performance index.
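The iterate-on-simplified-LQ idea can be sketched for a scalar nonlinear system: linearize about the trajectory produced by the current controls, solve the resulting time-varying LQ problem by a backward Riccati recursion, roll the new feedback law forward, and repeat. The dynamics, weights and horizon below are illustrative, and this single-system sketch omits the paper's hierarchical decomposition and co-ordinator.

```python
# Iterating on simplified LQ problems to control a scalar nonlinear
# system x+ = x + dt*(-x^3 + u). All parameter values are illustrative.

dt, N = 0.05, 60
Q, R, QN = 1.0, 0.1, 10.0          # state, input and terminal weights

def f(x, u):
    return x + dt * (-x ** 3 + u)  # nonlinear discrete-time dynamics

def lq_iteration(x0, us):
    """Solve one LQ problem linearized about the trajectory driven by us."""
    xs = [x0]
    for u in us:
        xs.append(f(xs[-1], u))
    A = [1.0 - 3.0 * dt * x ** 2 for x in xs[:-1]]   # df/dx along trajectory
    B = [dt] * N                                     # df/du (constant here)
    # Backward Riccati recursion for the time-varying LQ regulator.
    P, K = QN, [0.0] * N
    for k in reversed(range(N)):
        K[k] = (B[k] * P * A[k]) / (R + B[k] * P * B[k])
        P = Q + A[k] * P * (A[k] - B[k] * K[k])
    # Forward pass with the new feedback law.
    x, new_us = x0, []
    for k in range(N):
        u = -K[k] * x
        new_us.append(u)
        x = f(x, u)
    return new_us, abs(x)

us = [0.0] * N
for _ in range(10):                # iterate on the simplified problems
    us, final_dev = lq_iteration(2.0, us)
print(round(final_dev, 3))         # terminal state driven toward the origin
```

Each pass solves only a linear-quadratic problem, yet the sequence of solutions converges toward a control sequence for the original nonlinear problem, which is the essence of the technique described above.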