891 results for sampling error


Relevance: 20.00%

Abstract:

This study uses several measures derived from the error matrix to compare two thematic maps generated from the same sample set. The reference map was generated with all the sample elements, and the model map was generated without the two points detected as influential by local influence diagnostics. The data refer to wheat productivity in a 13.55 ha agricultural area sampled on a 50 x 50 m grid comprising 50 georeferenced sample elements. The comparison measures derived from the error matrix indicated that, despite some similarity, the maps are different. The difference between the production estimated by the reference map and the actual production was 350 kg, whereas for the model map the difference was 50 kg, indicating that the study of influential points is of fundamental importance for obtaining a more reliable estimate and that measures obtained from the error matrix are a good option for comparing thematic maps.
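
The abstract does not say which error-matrix measures were computed; as a hedged illustration only, the sketch below derives two common map-comparison measures, overall accuracy and the kappa coefficient, from a confusion (error) matrix built over the same grid cells. The class labels are synthetic stand-ins, not the wheat-productivity data.

```python
import numpy as np

def error_matrix(reference, model, n_classes):
    """Cross-tabulate class labels of two thematic maps over the same cells."""
    m = np.zeros((n_classes, n_classes), dtype=int)
    for r, c in zip(reference, model):
        m[r, c] += 1
    return m

def overall_accuracy(m):
    return np.trace(m) / m.sum()

def kappa(m):
    n = m.sum()
    po = np.trace(m) / n                                # observed agreement
    pe = (m.sum(axis=0) * m.sum(axis=1)).sum() / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical productivity classes at the same 50 grid cells for both maps.
rng = np.random.default_rng(0)
ref = rng.integers(0, 4, 50)
mod = ref.copy()
mod[rng.choice(50, 8, replace=False)] = rng.integers(0, 4, 8)   # 8 cells disagree

m = error_matrix(ref, mod, 4)
print(m)
print("overall accuracy:", overall_accuracy(m), " kappa:", kappa(m))
```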

Relevance: 20.00%

Abstract:

Taking into account that the sampling intensity of soil attributes is a determining factor for applying the concepts of precision agriculture, this study aims to determine the spatial distribution pattern of soil attributes and corn yield at four soil sampling intensities and to verify how sampling intensity affects the cause-effect relationship between soil attributes and corn yield. A grid of 100 georeferenced sample points was imposed on the experimental site; each sampling cell encompassed an area of 45 m², was composed of five 10-m long crop rows, and had its georeferenced point at the center of the cell. Samples were taken at depths of 0 to 0.1 m and 0.1 to 0.2 m. Soil chemical attributes and clay content were evaluated. Sampling intensities were established from the initial 100-point sampling, resulting in data sets of 100, 75, 50 and 25 points. The data were submitted to descriptive statistical and geostatistical analyses. The sampling intensity required to describe the spatial distribution pattern depended on the soil attribute being studied. P and K+ contents showed higher spatial variability, while clay content, Ca2+, Mg2+ and base saturation (V) showed less. The spatial distribution patterns of clay content and V obtained with the 100-point sampling best explained the spatial distribution pattern of corn yield.

Relevance: 20.00%

Abstract:

This study aimed to compare thematic maps of soybean yield for different sampling grids, using geostatistical methods (the semivariance function and kriging). The analysis was performed with soybean yield data in t ha-1 from a commercial area, using regular grids with point spacings of 25x25 m, 50x50 m, 75x75 m and 100x100 m (549, 188, 66 and 44 sampling points, respectively), as well as data obtained by yield monitors. Optimized sampling schemes were also generated with the Simulated Annealing algorithm, using maximization of the overall accuracy as the optimization criterion. The results showed that sample size and sample density influenced the description of the spatial distribution of soybean yield. As sample size increased, the thematic maps described the spatial variability of soybean yield more efficiently (higher accuracy indices and lower sums of squared estimation errors). In addition, more accurate maps were obtained, especially with the optimized sample configurations of 188 and 549 sample points.
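
As a rough illustration of the geostatistical ingredient named above, the sketch below computes a classical (Matheron) empirical semivariogram, the quantity a semivariance model would then be fitted to before kriging. The grid spacing, lag bins and synthetic yield values are assumptions, not the study's data.

```python
import numpy as np

def empirical_semivariogram(coords, values, lags, tol):
    """Matheron estimator: half the mean squared difference between point pairs
    whose separation distance falls within each lag bin."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    dv2 = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # count each pair once
    d, dv2 = d[iu], dv2[iu]
    gamma = []
    for h in lags:
        sel = np.abs(d - h) <= tol
        gamma.append(0.5 * dv2[sel].mean() if sel.any() else np.nan)
    return np.array(gamma)

# Hypothetical 25x25 m grid; a smooth trend plus noise stands in for yield (t/ha).
rng = np.random.default_rng(1)
xy = np.array([(i * 25.0, j * 25.0) for i in range(10) for j in range(10)])
z = 3.0 + 0.002 * xy[:, 0] + rng.normal(0, 0.2, len(xy))

print(empirical_semivariogram(xy, z, lags=np.arange(25, 201, 25), tol=12.5))
```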

Relevance: 20.00%

Abstract:

Objective: to assess the impact of the admission shift on the in-hospital mortality of trauma patients who underwent surgery. Methods: a retrospective observational cohort study from November 2011 to March 2012, with data collected from electronic medical records. The following variables were analyzed statistically: age, gender, city of origin, marital status, risk classification at admission (based on the Manchester Protocol), degree of contamination, admission time/shift, admission day, and hospital outcome. Results: during the study period, 563 injured patients underwent surgery, with a mean age of 35.5 years (±20.7); 422 (75%) were male, 276 (49.9%) were admitted during the night shift, and 205 (36.4%) on weekends. Patients admitted at night and on weekends had higher mortality [19 (6.9%) vs. 6 (2.2%), p=0.014, and 11 (5.4%) vs. 14 (3.9%), p=0.014, respectively]. In the multivariate analysis, the independent predictors of mortality were night admission (OR 3.15), red risk classification (OR 4.87), and age (OR 1.17). Conclusion: patients admitted during the night shift and on weekends were more severely injured and had a higher mortality rate. Night-shift admission was an independent predictor of surgical mortality in trauma patients, along with red risk classification and age.
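
The abstract does not state how the adjusted odds ratios were obtained; a minimal sketch of the usual route, a multivariate logistic regression whose exponentiated coefficients are the ORs, is shown below on synthetic data. The variable names, effect sizes and the use of statsmodels are all assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in data: night admission, red risk class, age (in decades), death.
rng = np.random.default_rng(2)
n = 563
night = rng.integers(0, 2, n)
red = rng.integers(0, 2, n)
age = rng.normal(3.5, 2.0, n)                    # age expressed in decades
logit = -4.0 + 1.1 * night + 1.5 * red + 0.15 * age
death = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([night, red, age]))
fit = sm.Logit(death, X).fit(disp=False)
odds_ratios = np.exp(fit.params[1:])             # exponentiated coefficients = adjusted ORs
print(dict(zip(["night admission", "red classification", "age (per decade)"], odds_ratios)))
```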

Relevance: 20.00%

Abstract:

To obtain the desirable accuracy of a robot, there are two techniques available. The first option would be to make the robot match the nominal mathematic model. In other words, the manufacturing and assembling tolerances of every part would be extremely tight so that all of the various parameters would match the “design” or “nominal” values as closely as possible. This method can satisfy most of the accuracy requirements, but the cost would increase dramatically as the accuracy requirement increases. Alternatively, a more cost-effective solution is to build a manipulator with relaxed manufacturing and assembling tolerances. By modifying the mathematical model in the controller, the actual errors of the robot can be compensated. This is the essence of robot calibration. Simply put, robot calibration is the process of defining an appropriate error model and then identifying the various parameter errors that make the error model match the robot as closely as possible. This work focuses on kinematic calibration of a 10 degree-of-freedom (DOF) redundant serial-parallel hybrid robot. The robot consists of a 4-DOF serial mechanism and a 6-DOF hexapod parallel manipulator. The redundant 4-DOF serial structure is used to enlarge workspace and the 6-DOF hexapod manipulator is used to provide high load capabilities and stiffness for the whole structure. The main objective of the study is to develop a suitable calibration method to improve the accuracy of the redundant serial-parallel hybrid robot. To this end, a Denavit–Hartenberg (DH) hybrid error model and a Product-of-Exponential (POE) error model are developed for error modeling of the proposed robot. Furthermore, two kinds of global optimization methods, i.e. the differential-evolution (DE) algorithm and the Markov Chain Monte Carlo (MCMC) algorithm, are employed to identify the parameter errors of the derived error model. A measurement method based on a 3-2-1 wire-based pose estimation system is proposed and implemented in a Solidworks environment to simulate the real experimental validations. Numerical simulations and Solidworks prototype-model validations are carried out on the hybrid robot to verify the effectiveness, accuracy and robustness of the calibration algorithms.
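
As a toy illustration of the identification step only, not of the DH or POE error models of the 10-DOF hybrid robot, the sketch below recovers the link-length errors of a planar two-link arm by minimizing the gap between measured and predicted tool positions with SciPy's differential evolution. The arm, the injected errors and the measurement poses are assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

def forward(lengths, q):
    """Tool position of a planar 2-link arm for joint angles q = (q1, q2)."""
    l1, l2 = lengths
    x = l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1])
    y = l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])
    return np.array([x, y])

nominal = np.array([0.50, 0.40])               # nominal link lengths (m)
true = nominal + np.array([0.004, -0.003])     # "real" robot with parameter errors
rng = np.random.default_rng(3)
poses = rng.uniform(-np.pi, np.pi, size=(30, 2))
measured = np.array([forward(true, q) for q in poses])   # simulated pose measurements

def cost(delta):
    """Sum of squared position residuals for a candidate set of parameter errors."""
    pred = np.array([forward(nominal + delta, q) for q in poses])
    return np.sum((pred - measured) ** 2)

result = differential_evolution(cost, bounds=[(-0.01, 0.01)] * 2, seed=3)
print("identified parameter errors:", result.x)   # should approach [0.004, -0.003]
```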

Relevance: 20.00%

Abstract:

The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80ºC and analyzed by the Bradford method. Results were analyzed by Student's t test. The mean protein concentration and standard deviation of tears collected with microcapillary tubes were 4.45 mg/mL ±0.35 and 4.52 mg/mL ±0.29 for the right and left eyes, respectively. The mean protein concentration and standard deviation of tears collected with STT strips were 54.5 mg/mL ±0.63 and 54.15 mg/mL ±0.65 for the right and left eyes, respectively. A statistically significant difference (p<0.001) was found between the methods. Under the conditions of this study, the protein concentrations obtained with the Bradford test from tear samples collected by STT strip were higher than those obtained with microcapillary tubes. Reference values for tear protein concentration should therefore be interpreted according to the method used to collect the samples.
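
A minimal sketch of the reported comparison, a Student's t test on protein concentrations from the two collection methods, using made-up values rather than the study's measurements. Since each eye was sampled by both methods, a paired test is shown alongside the unpaired variant; which variant the authors used is an assumption left open.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_eyes = 58
# Hypothetical protein concentrations (mg/mL) per eye for the two sampling methods.
microcapillary = rng.normal(4.5, 0.3, n_eyes)
stt_strip = rng.normal(54.3, 0.6, n_eyes)

t_paired, p_paired = stats.ttest_rel(stt_strip, microcapillary)   # same eyes, both methods
t_indep, p_indep = stats.ttest_ind(stt_strip, microcapillary)     # unpaired alternative
print(f"paired: t={t_paired:.2f}, p={p_paired:.3g}")
print(f"independent: t={t_indep:.2f}, p={p_indep:.3g}")
```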

Relevance: 20.00%

Abstract:

Pulse Response Based Control (PRBC) is a recently developed minimum-time control method for flexible structures. The flexible behavior of the structure is represented by a set of discrete-time sequences, which are the responses of the structure to rectangular force pulses applied by the actuators that control it. The set of pulse responses, the desired outputs, and the force bounds form a numerical optimization problem whose solution is a minimum-time, piecewise-constant control sequence for driving the system to a desired final state. The method was developed for driving positive semi-definite systems; when the system is positive definite, some final states may not be reachable. Necessary conditions for reachability of the final states are derived for systems with a finite number of degrees of freedom, and numerical results are presented that confirm the derived analytical conditions. Numerical simulations of maneuvers of distributed parameter systems have shown a relationship between the error in the estimated minimum control time and the sampling interval.
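
For a fixed horizon, the optimization described above can be posed as a feasibility problem in the piecewise-constant forces: the pulse responses stack into a linear map from the force sequence to the final state, and the minimum time is the smallest horizon for which that map admits a bounded solution. The sketch below illustrates this search on a discretized double integrator with scipy.optimize.linprog; the system, force bound and target state are assumptions, not the flexible structures treated in the paper.

```python
import numpy as np
from scipy.optimize import linprog

dt, u_max = 0.1, 1.0
A = np.array([[1.0, dt], [0.0, 1.0]])          # discretized double integrator
B = np.array([[0.5 * dt**2], [dt]])
x0 = np.array([0.0, 0.0])
x_target = np.array([1.0, 0.0])                # rest-to-rest maneuver of 1 m

def reachable_in(N):
    """True if some force sequence with |u_k| <= u_max drives x0 to x_target in N samples."""
    G = np.hstack([np.linalg.matrix_power(A, N - 1 - k) @ B for k in range(N)])
    rhs = x_target - np.linalg.matrix_power(A, N) @ x0
    res = linprog(c=np.zeros(N), A_eq=G, b_eq=rhs, bounds=[(-u_max, u_max)] * N)
    return res.status == 0                     # 0 means a feasible solution was found

N = 1
while not reachable_in(N):                     # smallest feasible horizon = minimum time
    N += 1
print("minimum control time:", N * dt, "s")
```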

Relevance: 20.00%

Abstract:

This article deals with a contour error controller (CEC) applied to a high-speed biaxial table. It works simultaneously with the controllers of the table axes, assisting them. In the early stages of the investigation, it was observed that the main problem of this controller is imprecision when tracking non-linear contours at high speeds. The objectives of this work are to show that this problem is caused by the lack of exactness of the contour error mathematical model and to propose modifications to it. An additional term is included, resulting in a more accurate value of the contour error and enabling the use of this type of motion controller at higher feedrates. Results from simulated and experimental tests are compared with those of a conventional PID controller and of the uncorrected CEC in order to analyse the effectiveness of the proposed controller. The main conclusions are that the proposed contour error mathematical model is simple, accurate and almost insensitive to the feedrate, and that a 20:1 reduction of the integral absolute contour error is possible.
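
The paper's additional correction term is not reproduced here; as context, the sketch below shows only the common first-order contour error estimate, which projects the axis tracking errors onto the normal of the desired path, and compares it with the exact error for a made-up circular contour.

```python
import numpy as np

def linear_contour_error(desired, actual, tangent_angle):
    """First-order estimate: project the axis tracking errors onto the path normal.
    eps ~ -ex*sin(theta) + ey*cos(theta), with theta the tangent direction of the path."""
    ex, ey = actual[0] - desired[0], actual[1] - desired[1]
    return -ex * np.sin(tangent_angle) + ey * np.cos(tangent_angle)

# Hypothetical point on a circular contour of radius R traversed counter-clockwise.
R, phi = 50.0, np.deg2rad(30.0)
desired = np.array([R * np.cos(phi), R * np.sin(phi)])
theta = phi + np.pi / 2                        # tangent direction at that point
actual = desired + np.array([0.03, -0.02])     # axis tracking errors (mm)

eps_linear = linear_contour_error(desired, actual, theta)
eps_exact = R - np.hypot(actual[0], actual[1])   # exact signed error for a circle
print(eps_linear, eps_exact)                     # the gap is what correction terms address
```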

Relevance: 20.00%

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of combined longitudinal survey and register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey and register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
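
A compact sketch of the IPCW idea in a survival-estimation setting: observations are weighted by the inverse of their estimated probability of remaining uncensored, and the Kaplan-Meier product is formed with those weights. This is a simplified fixed-weight variant on synthetic data (the full method uses time-varying weights and an estimated censoring model), not the FI ECHP analysis.

```python
import numpy as np

def weighted_kaplan_meier(time, event, weights):
    """Kaplan-Meier survival estimate with subject-level weights (e.g. IPCW weights)."""
    order = np.argsort(time)
    time, event, weights = time[order], event[order], weights[order]
    surv, curve = 1.0, []
    for t in np.unique(time[event == 1]):
        at_risk = weights[time >= t].sum()
        failed = weights[(time == t) & (event == 1)].sum()
        surv *= 1.0 - failed / at_risk
        curve.append((t, surv))
    return curve

rng = np.random.default_rng(5)
n = 500
x = rng.integers(0, 2, n)                           # covariate driving dependent censoring
t_spell = rng.exponential(np.where(x == 1, 8, 16))  # spell lengths differ by group (months)
t_cens = rng.exponential(np.where(x == 1, 6, 30))   # censoring is heavier where spells are short
time = np.minimum(t_spell, t_cens)
event = (t_spell <= t_cens).astype(int)

# Simplified IPCW weights from the (here, known) censoring survival function G(t | x).
rate = np.where(x == 1, 1 / 6, 1 / 30)
G = np.exp(-rate * time)                            # P(censoring time > observed time | x)
ipcw = 1.0 / np.clip(G, 0.05, None)                 # truncate extreme weights

naive = weighted_kaplan_meier(time, event, np.ones(n))
corrected = weighted_kaplan_meier(time, event, ipcw)
print("S(12 months), unweighted vs. IPCW-weighted:",
      [s for t, s in naive if t <= 12][-1], [s for t, s in corrected if t <= 12][-1])
```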

Relevance: 20.00%

Abstract:

Country of publication: Taiwan (TW, TWN).

Relevance: 20.00%

Abstract:

This work aimed to develop allometric equations for tree biomass estimation and to determine the site biomass in different "cerrado" ecosystems. Destructive sampling in a "campo cerrado" (open savanna) was carried out at the Biological Reserve of Moji-Guaçu, State of São Paulo, southeastern Brazil. This "campo cerrado" grows under a tropical climate and on acid, nutrient-poor soils. Sixty woody plants were cut at ground level and measurements of diameter, height and weight of leaves and stems were taken. We selected the best equations among the most commonly used mathematical relations according to R² values, significance and standard error. Both diameter (D) and height (H) showed a good relationship with plant biomass, but using the two parameters together (DH and D²H) provided the best predictor variables. The best equations were linear, but power and exponential equations also showed high R² and significance. The applicability of these equations is discussed and the biomass estimates are compared with those of other types of tropical savannas. Mineral mass was also estimated. "Cerrados" proved to be very important carbon reservoirs due to their great extent. In addition, the intense land-use change currently taking place in the "cerrado" biome may significantly affect the global carbon cycle.
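
As an illustration of the kind of allometric fit described above, the sketch below fits a power-law biomass equation on the combined predictor D²H by log-log least squares and reports its R²; the tree measurements are simulated stand-ins for the 60 harvested plants, and the power-law form is just one of the candidate relations mentioned.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 60
D = rng.uniform(2.0, 15.0, n)                       # stem diameter (cm)
H = 0.8 * D ** 0.7 * rng.lognormal(0, 0.1, n)       # height (m), loosely tied to D
biomass = 0.05 * (D**2 * H) ** 0.95 * rng.lognormal(0, 0.2, n)   # dry mass (kg)

# Power-law model biomass = a * (D^2 H)^b, fitted as a straight line in log-log space.
x, y = np.log(D**2 * H), np.log(biomass)
b, log_a = np.polyfit(x, y, 1)
a = np.exp(log_a)
pred = a * (D**2 * H) ** b

r2 = 1 - np.sum((biomass - pred) ** 2) / np.sum((biomass - biomass.mean()) ** 2)
print(f"biomass = {a:.3f} * (D^2 H)^{b:.3f},  R^2 = {r2:.3f}")
```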

Relevance: 20.00%

Abstract:

Bioanalytical data from a bioequivalence study were used to develop limited-sampling strategy (LSS) models for estimating the area under the plasma concentration versus time curve (AUC) and the peak plasma concentration (Cmax) of 4-methylaminoantipyrine (MAA), an active metabolite of dipyrone. Twelve healthy adult male volunteers received single 600 mg oral doses of dipyrone in two formulations at a 7-day interval in a randomized, crossover protocol. Plasma concentrations of MAA (N = 336), measured by HPLC, were used to develop LSS models. Linear regression analysis and a "jack-knife" validation procedure revealed that the AUC0-∞ and the Cmax of MAA can be accurately predicted (R²>0.95, bias <1.5%, precision between 3.1 and 8.3%) by LSS models based on two sampling times. Validation tests indicate that the most informative 2-point LSS models developed for one formulation provide good estimates (R²>0.85) of the AUC0-∞ or Cmax for the other formulation. LSS models based on three sampling points (1.5, 4 and 24 h), but using different coefficients for AUC0-∞ and Cmax, predicted the individual values of both parameters for the enrolled volunteers (R²>0.88, bias = -0.65 and -0.37%, precision = 4.3 and 7.4%) as well as for plasma concentration data sets generated by simulation (R²>0.88, bias = -1.9 and 8.5%, precision = 5.2 and 8.7%). Bioequivalence assessment of the dipyrone formulations based on the 90% confidence interval of log-transformed AUC0-∞ and Cmax provided similar results when either the best-estimated or the LSS-derived metrics were used.
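
A minimal sketch of how a two-point LSS model of this kind is typically built: regress a reference AUC (here a simple trapezoidal AUC over the sampled interval, omitting the extrapolation to infinity) on the concentrations at two candidate times. The one-compartment profiles, parameter values and chosen time points are assumptions, not the MAA data.

```python
import numpy as np

rng = np.random.default_rng(7)
t = np.array([0.5, 1.0, 1.5, 2.0, 4.0, 6.0, 8.0, 12.0, 24.0])   # sampling times (h)
n_subj = 12

# Simulated one-compartment oral profiles with between-subject variability.
ka = rng.lognormal(np.log(1.5), 0.2, n_subj)        # absorption rate constants (1/h)
ke = rng.lognormal(np.log(0.10), 0.2, n_subj)       # elimination rate constants (1/h)
scale = rng.lognormal(np.log(20.0), 0.2, n_subj)    # F*dose/V term (hypothetical units)
C = scale[:, None] * ka[:, None] / (ka - ke)[:, None] * \
    (np.exp(-ke[:, None] * t) - np.exp(-ka[:, None] * t))

# Reference AUC by the trapezoidal rule over all samples.
auc_full = np.sum(0.5 * (C[:, 1:] + C[:, :-1]) * np.diff(t), axis=1)

# Two-point LSS model: AUC ~ b0 + b1*C(1.5 h) + b2*C(4 h).
X = np.column_stack([np.ones(n_subj), C[:, 2], C[:, 4]])
coef, *_ = np.linalg.lstsq(X, auc_full, rcond=None)
auc_lss = X @ coef

r2 = 1 - np.sum((auc_full - auc_lss) ** 2) / np.sum((auc_full - auc_full.mean()) ** 2)
print("coefficients:", coef.round(3), " R^2 =", round(r2, 3))
```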

Relevance: 20.00%

Abstract:

"La Niora" is a red pepper variety cultivated in Tadla Region (Morocco) which is used for manufacturing paprika after sun drying. The paprika quality (nutritional, chemical and microbiological) was evaluated immediately after milling, from September to December. Sampling time mainly affected paprika color and the total capsaicinoid and vitamin C contents. The commercial quality was acceptable and no aflatoxins were found, but the microbial load sometimes exceeded permitted levels.