991 results for sampling cost


Relevance:

70.00%

Publisher:

Abstract:

The objective of this work was to evaluate the use of a low-cost trap to capture Cerambycidae in different seasons in planted forests in Brazil. Thirty polyethylene terephthalate (PET) bottle traps per hectare were used, placed at 50-m intervals. The traps were painted red and contained glass flasks with a mixture of ethanol, methanol, and benzaldehyde. Soap and water were placed at the bottom of each trap. The traps were checked biweekly for beetle presence. Sampling required one minute per sample, and the traps were easy to use. The total sampling cost, including materials and labor, was US$ 13.46 per sample. Six Cerambycidae species were captured during both the dry and rainy seasons.

Relevance:

70.00%

Publisher:

Abstract:

Mobile sensor networks have unique advantages compared with wireless sensor networks. Mobility enables mobile sensors to flexibly reconfigure themselves to meet sensing requirements. In this dissertation, an adaptive sampling method for mobile sensor networks is presented. In consideration of sensing resource constraints, computing abilities, and onboard energy limitations, the adaptive sampling method follows a down-sampling scheme, which reduces the total number of measurements and lowers sampling cost. Compressive sensing is a recently developed down-sampling method that uses a small number of randomly distributed measurements for signal reconstruction. However, original signals cannot be reconstructed from such condensed measurements in the conventional manner described by the Shannon sampling theorem. The measurements have to be processed in a sparse domain, and convex optimization methods must be applied to reconstruct the original signals. The restricted isometry property guarantees that signals can be recovered with little information loss. While compressive sensing can effectively lower sampling cost, signal reconstruction remains a great research challenge. Compressive sensing always collects random measurements, whose information content cannot be determined a priori. If each measurement is instead optimized to be the most informative one, reconstruction performance can be much better. Based on these considerations, this dissertation focuses on an adaptive sampling approach that finds the most informative measurements in unknown environments and reconstructs the original signals. With mobile sensors, measurements are collected sequentially, giving the chance to optimize each of them individually. When a mobile sensor is about to collect a new measurement from the surrounding environment, existing information is shared among the networked sensors so that each sensor has a global view of the entire environment. The shared information is analyzed in the Haar wavelet domain, in which most natural signals appear sparse, to infer a model of the environment. The most informative measurements can then be determined by optimizing the model parameters. As a result, all measurements collected by the mobile sensor network are the most informative ones given the existing information, and a perfect reconstruction would be expected. To present the adaptive sampling method, a series of research issues will be addressed, including measurement evaluation and collection, mobile network establishment, data fusion, sensor motion, and signal reconstruction. A two-dimensional scalar field will be reconstructed using the proposed method. Both single mobile sensors and mobile sensor networks will be deployed in the environment, and the reconstruction performance of both will be compared. In addition, a particular mobile sensor, a quadrotor UAV, is developed so that the adaptive sampling method can be used in three-dimensional scenarios.
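A minimal sketch of the compressive-sensing reconstruction idea this abstract describes, assuming an orthonormal Haar basis and using ISTA (iterative soft thresholding) as a simple stand-in for a generic convex solver. All sizes and parameters are illustrative, not the dissertation's actual implementation:

```python
import numpy as np

def haar_matrix(n):
    """Orthonormal Haar transform matrix for n = 2^k (rows are basis vectors)."""
    if n == 1:
        return np.array([[1.0]])
    h = haar_matrix(n // 2)
    top = np.kron(h, [1.0, 1.0])                 # averaging (scaling) rows
    bot = np.kron(np.eye(n // 2), [1.0, -1.0])   # differencing (wavelet) rows
    return np.vstack([top, bot]) / np.sqrt(2.0)

rng = np.random.default_rng(0)
n, m, k = 256, 64, 8                  # signal length, measurements (m << n), sparsity

H = haar_matrix(n)                    # s = H @ x is the Haar-domain representation
s_true = np.zeros(n)
s_true[rng.choice(n, k, replace=False)] = rng.normal(0, 5, k)
x_true = H.T @ s_true                 # Haar-sparse "natural" signal

Phi = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random measurement matrix
y = Phi @ x_true                              # condensed measurements

# ISTA: minimize 0.5 * ||y - Phi H^T s||^2 + lam * ||s||_1
A = Phi @ H.T
L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the gradient
lam = 0.05
s_hat = np.zeros(n)
for _ in range(500):
    z = s_hat - A.T @ (A @ s_hat - y) / L
    s_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

x_hat = H.T @ s_hat
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```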

Relevance:

40.00%

Publisher:

Abstract:

In soil surveys, several sampling systems can be used to define the most representative sites for sample collection and description of soil profiles. In recent years, the conditioned Latin hypercube sampling system has gained prominence for soil surveys. In Brazil, most soil maps are at small scales and in paper format, which hinders their refinement. The objectives of this work were: (i) to compare two sampling systems based on the conditioned Latin hypercube to map soil classes and soil properties; (ii) to retrieve information from a detailed-scale soil map of a pilot watershed for its refinement, comparing two data mining tools, and to validate the new soil map; and (iii) to create and validate a soil map of a much larger, similar area by extrapolating the information extracted from the existing soil map. Two sampling schemes were created, one by the conditioned Latin hypercube and one by the cost-constrained conditioned Latin hypercube. At each prospection site, soil classification and measurement of the A-horizon thickness were performed. Maps were generated and validated for each sampling system, and the efficiency of the two methods was compared. The conditioned Latin hypercube captured greater variability of soils and properties than the cost-constrained conditioned Latin hypercube, although the former made field work more difficult. The conditioned Latin hypercube can capture greater soil variability, while the cost-constrained conditioned Latin hypercube presents great potential for use in soil surveys, especially in areas of difficult access. From an existing detailed-scale soil map of a pilot watershed, topographic information for each soil class was extracted from a digital elevation model and its derivatives by two data mining tools. Maps were generated using each tool. The more accurate of the tools was used to extrapolate soil information to a much larger, similar area, and the generated map was validated. It was possible to retrieve the existing soil map information and apply it to a larger area containing similar soil-forming factors, at much lower financial cost. The KnowledgeMiner data mining tool and ArcSIE, used to create the soil map, presented better results and enabled the use of the existing soil map to extract soil information and apply it to similar, larger areas at reduced cost, which is especially important in developing countries with limited financial resources for such activities, such as Brazil.
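A rough sketch of the conditioned Latin hypercube idea, with an optional access-cost penalty mimicking the cost-constrained variant mentioned above. This is a generic simulated-annealing formulation over invented data, not the software used in the study:

```python
import numpy as np

def clhs(X, n, site_cost=None, cost_weight=0.0, iters=20000, seed=0):
    """Select n of N candidate sites so that, for every covariate, each of the
    n marginal quantile strata contains (ideally) exactly one sampled site."""
    rng = np.random.default_rng(seed)
    N, p = X.shape
    edges = np.quantile(X, np.linspace(0, 1, n + 1), axis=0)   # (n+1, p) strata
    idx = rng.choice(N, n, replace=False)

    def objective(sel):
        obj = 0.0
        for j in range(p):  # deviation from "one sample per stratum"
            counts, _ = np.histogram(X[sel, j], bins=edges[:, j])
            obj += np.abs(counts - 1).sum()
        if site_cost is not None:  # cost-constrained variant: penalize access cost
            obj += cost_weight * site_cost[sel].sum()
        return obj

    cur, temp = objective(idx), 1.0
    for _ in range(iters):
        cand = idx.copy()   # swap one selected site for one unselected site
        cand[rng.integers(n)] = rng.choice(np.setdiff1d(np.arange(N), idx))
        c = objective(cand)
        if c < cur or rng.random() < np.exp((cur - c) / temp):  # anneal
            idx, cur = cand, c
        temp *= 0.9997
    return idx, cur

# Illustrative use with invented data: 1000 candidate sites, 3 terrain covariates.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 3))
access_cost = rng.uniform(1, 10, size=1000)   # e.g., distance from roads
sample, obj = clhs(X, 20, site_cost=access_cost, cost_weight=0.05)
print("selected sites:", sorted(sample.tolist()), "objective:", round(obj, 2))
```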

Relevance:

30.00%

Publisher:

Abstract:

Total particulate matter (TPM) was passively collected inside two classrooms in each of five elementary schools in Lisbon, Portugal. TPM was collected on 47-mm-diameter polycarbonate filters placed inside uncovered plastic Petri dishes. The sampling period ran from 19 May to 22 June 2009 (35 days of exposure), and the collected TPM masses varied between 0.2 mg and 0.8 mg. The major elements were Ca, Fe, Na, K, and Zn, at the µg level, while the others were at the ng level. Pearson's correlation coefficients above 0.75 (a high degree of correlation) were found between several elements. Soil-related sources, soil resuspension by traffic, and anthropogenic emission sources could be identified. Blackboard chalk was also identified, through the large presence of Ca. Some of the determined chemical elements are potentially carcinogenic. Quality control showed good agreement of the results, as confirmed by application of the u-score test.
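The screening reported here (flagging element pairs with Pearson's r above 0.75 as candidates for a common source) is straightforward to reproduce; a small sketch with invented concentration data, not the study's measurements:

```python
import numpy as np

elements = ["Ca", "Fe", "Na", "K", "Zn"]
rng = np.random.default_rng(0)
conc = rng.lognormal(mean=1.0, sigma=0.5, size=(10, len(elements)))  # 10 filters

r = np.corrcoef(conc, rowvar=False)      # Pearson correlations between elements
i, j = np.triu_indices(len(elements), k=1)
for a, b, rv in zip(i, j, r[i, j]):
    if rv > 0.75:                        # threshold used in the abstract
        print(f"{elements[a]}-{elements[b]}: r = {rv:.2f} (possible common source)")
```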

Relevance:

30.00%

Publisher:

Abstract:

One of the most important measures to prevent wild forest fires is the use of prescribed and controlled burning, as it reduces the availability of fuel mass. The impact of these management activities on soil physical and chemical properties varies according to the type of both soil and vegetation. Decisions in forest management plans are often based on the results of soil-monitoring campaigns, which tend to be labor-intensive and expensive. In this paper we have successfully used the multivariate statistical technique robust principal component analysis (ROBPCA) to investigate the effectiveness of the sampling procedure for two different methodologies, in order to assess the possibility of simplifying and reducing the sample collection process and its auxiliary laboratory analysis work, towards a cost-effective and competent forest soil characterization.
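ROBPCA proper (Hubert et al.) is implemented in R's rrcov package; as a loose stand-in conveying the same idea, a robust covariance estimate (MCD) followed by an eigendecomposition yields principal components that are not driven by outlying samples. All data and variable names below are invented:

```python
import numpy as np
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(0)
# Invented soil-property matrix: 60 plots x 6 measured variables
# (e.g., pH, organic matter, N, P, K, moisture), with a few gross outliers.
X = rng.normal(size=(60, 6)) @ rng.normal(size=(6, 6))
X[:5] += 15 * rng.normal(size=(5, 6))        # contaminated samples

mcd = MinCovDet(random_state=0).fit(X)       # robust location/scatter (MCD)
evals, evecs = np.linalg.eigh(mcd.covariance_)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

explained = evals / evals.sum()
print("robust explained-variance ratios:", np.round(explained, 3))
# Loadings on the leading components hint at which lab variables are nearly
# redundant and might be dropped to cut sampling and analysis cost.
print("PC1 loadings:", np.round(evecs[:, 0], 2))
```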

Relevance:

30.00%

Publisher:

Abstract:

An improved device for detecting peridomestic Triatoma infestans, consisting of one-liter recycled Tetra Brik milk boxes with a central structure, was tested using a matched-pair study design in two rural areas of Argentina. In Olta (La Rioja), the boxes were installed beneath the thatched roofs and on the vertical wooden posts of each peridomestic structure. After a 5-month exposure, at least one of the recovered boxes detected infestation in 88% of the 24 sites positive for T. infestans, and in 86% of the 7 sites negative, by timed manual collections at baseline. In Amamá (Santiago del Estero), the boxes were paired with the best-performing prototype tested previously (a shelter unit). After 3 months, some evidence of infestation was detected in 89% (boxes) and 79% (shelters) of the 18-19 sites positive by timed collections, whereas 19% and 16% of the 32 negative sites were positive, respectively. The two devices did not differ significantly in the qualitative or quantitative collection of any sign of infestation. The installation site did not significantly modify the boxes' sampling efficiency in either study area. As each box cost half as much as each shelter unit, the boxes are the most cost-effective and easy-to-use tool currently available for detecting peridomestic T. infestans.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Blood sampling is a frequent medical procedure, very often considered a stressful experience by children. Local anesthetics have been developed, but they are expensive and not reimbursed by insurance companies in our country. We wanted to assess parents' willingness to pay (WTP) for this kind of drug. PATIENTS AND METHODS: Over 6 months, all parents of children presenting for a general visit (GV) or specialized visit (SV) with blood sampling were surveyed. WTP was assessed through three scenarios [avoiding blood sampling (ABS), using the drug on prescription (PD), or over the counter (OTC)], with a payment-card system randomized to ascending or descending order of prices (AO or DO). RESULTS: Fifty-six responses were collected (34 GV, 22 SV; 27 AO and 29 DO), for a response rate of 40%. The response distribution was wide, with median WTP of 40 for ABS, 25 for PD, and 10 for OTC, which is close to the drug's real price. Responses were similar for GV and SV. Median WTP amounted to 0.71%, 0.67%, and 0.20% of respondents' monthly income for the three scenarios, respectively, with a maximum of 10%. CONCLUSIONS: Assessing parents' WTP in an outpatient setting is difficult, with a wide distribution of results, but the median WTP is close to the real drug price. This finding could be used to promote insurance coverage for this drug.

Relevance:

30.00%

Publisher:

Abstract:

A modified magnesium hydrogen breath test, using end-expiratory breath sampling, is described for investigating achlorhydria. The efficacy of this test in the diagnostic investigation of pernicious anaemia was compared with that of serum pepsinogen I. Twenty-one patients with pernicious anaemia--that is, patients with achlorhydria--and 22 with healed duodenal ulcer and normal chlorhydria were studied. The magnesium hydrogen breath test, serum pepsinogen I, serum gastrin, and standard gastric acid secretory tests were performed in all subjects. The mean (SEM) hydrogen peak value was lower in patients with pernicious anaemia than in the duodenal ulcer group (21.7 (1.9) v 71.3 (5.2) ppm; p = 0.00005). The hydrogen peak value had 95.2% sensitivity and 100% specificity for detecting pentagastrin-resistant achlorhydria. Mean serum pepsinogen I concentrations were also significantly lower in patients with pernicious anaemia than in the duodenal ulcer group (10.7 (2.7) v 123.6 (11.8) micrograms/l; p = 0.00005). Sensitivity and specificity for detecting pernicious anaemia were both 100% for pepsinogen I. It is concluded that this modified magnesium hydrogen breath test is a simple, noninvasive, cost-effective, and accurate method for assessing achlorhydria, and it may be useful in the diagnostic investigation of patients with suspected pernicious anaemia.
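For the record, the reported accuracy figures follow directly from the group sizes (21 patients with pernicious anaemia, 22 duodenal-ulcer controls): 95.2% sensitivity corresponds to 20 of 21 true positives. The counts below are inferred from the abstract, not stated in it:

```python
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# Hydrogen peak value: 20/21 pernicious-anaemia patients flagged (1 missed),
# all 22 duodenal-ulcer controls correctly negative.
print(f"sensitivity: {sensitivity(20, 1):.1%}")   # -> 95.2%
print(f"specificity: {specificity(22, 0):.1%}")   # -> 100.0%
```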

Relevance:

30.00%

Publisher:

Abstract:

Because data on rare species are usually sparse, it is important to have efficient ways to sample additional data. Traditional sampling approaches are of limited value for rare species because the species is unlikely to occur at the vast majority of randomly chosen sampling sites. For such species, spatial predictions from niche-based distribution models can be used to stratify the sampling and increase sampling efficiency. The newly sampled data are then used to improve the initial model. Applying this approach repeatedly is an adaptive process that may increase the number of new occurrences found. We illustrate the approach with a case study of a rare and endangered plant species in Switzerland and with a simulation experiment. Our field survey confirmed that the method helps in discovering new populations of the target species in remote areas where the predicted habitat suitability is high. In our simulations, the model-based approach provided a significant improvement (by a factor of 1.8 to 4, depending on the measure) over simple random sampling. In terms of cost, this approach may save up to 70% of the time spent in the field.
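A toy simulation of the comparison reported above, under invented numbers: a rare species whose occurrence probability rises with modelled habitat suitability is sampled either at random or preferentially where predicted suitability is high:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n = 20000, 200                      # candidate sites, field budget

suitability = rng.beta(2, 5, N)        # model-predicted habitat suitability
# Rare species: occurrence probability grows with suitability (~1% prevalence).
p_occ = 0.08 * suitability ** 2
occupied = rng.random(N) < p_occ
print("true prevalence:", occupied.mean())

# Simple random sampling of n sites.
srs = rng.choice(N, n, replace=False)

# Model-based stratification: sample only where suitability is in the top decile.
candidates = np.argsort(suitability)[-N // 10:]
strat = rng.choice(candidates, n, replace=False)

found_srs = occupied[srs].sum()
found_strat = occupied[strat].sum()
print(f"random sampling found {found_srs}, model-based found {found_strat}, "
      f"improvement factor ~{found_strat / max(found_srs, 1):.1f}")
```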

Relevance:

30.00%

Publisher:

Abstract:

Helping behavior is any intentional behavior that benefits another living being or group (Hogg & Vaughan, 2010). People tend to underestimate the probability that others will comply with their direct requests for help (Flynn & Lake, 2008). This implies that when people need help, they will assess the probability of getting it (De Paulo, 1982, cited in Flynn & Lake, 2008) and will tend to estimate one that is actually lower than the real chance, so they may not even consider it worth asking. Existing explanations for this phenomenon attribute it to a mistaken cost computation by the help seeker, who emphasizes the instrumental cost of “saying yes” while ignoring that the potential helper also has to take into account the social cost of saying “no”. And the truth is that, especially in face-to-face interactions, the discomfort caused by refusing to help can be very high. In short, help seekers tend to fail to realize that it might be more costly to refuse a help request than to accept it. A similar effect has been observed in estimates of other people's trustworthiness: Fetchenhauer and Dunning (2010) showed that people tend to underestimate it as well. This bias is reduced when, instead of asymmetric feedback (received only when one decides to trust the other person), symmetric feedback (always given) is provided. The same explanation could apply to help seeking, as people only receive feedback when they actually make a request, but not otherwise. Fazio, Shook, and Eiser (2004) studied something that could be reinforcing these outcomes: learning asymmetries. By means of a computer game called BeanFest, they showed that people learn better about negatively valenced objects (beans, in this case) than about positively valenced ones. This learning asymmetry stemmed from “information gain being contingent on approach behavior” (p. 293), which can be identified with what Fetchenhauer and Dunning call ‘asymmetric feedback’, and hence also with help requests. Fazio et al. also found a generalization asymmetry in favor of negative attitudes over positive ones, which they attributed to a negativity bias that “weights resemblance to a known negative more heavily than resemblance to a positive” (p. 300). Applied to help-seeking scenarios, this would mean that when facing an unknown situation, people tend to generalize and infer that a negative outcome is more likely than a positive one; so, together with the mechanisms described above, they will be more inclined to think that they will get a “no” when requesting help. Denrell and Le Mens (2011) present a different perspective for explaining judgment biases in general. They deviate from the classical account based on inappropriate information processing (described, among others, by Fiske & Taylor, 2007, and Tversky & Kahneman, 1974) and explain such biases in terms of ‘adaptive sampling’. Adaptive sampling is a sampling mechanism in which the selection of sample items is conditioned by the previously observed values of the variable of interest (Thompson, 2011). Sampling adaptively allows individuals to safeguard themselves from experiences they went through once and that turned out to yield negative outcomes. However, it also prevents them from giving those experiences a second chance to produce an updated outcome that could turn out to be positive, more positive, or simply one that regresses to the mean, in whatever direction that implies.
That, as Denrell and Le Mens (2011) explained, makes sense: if you go to a restaurant and you do not like the food, you do not choose that restaurant again. This is what we think could be happening when asking for help: when we get a “no”, we stop asking. Here, we want to provide a complementary explanation, based on adaptive sampling, for the underestimation of the probability that others will comply with our direct help requests. First, we will develop and explain a model that represents the theory. Later on, we will test it empirically by means of experiments and will elaborate on the analysis of the results.
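To make the mechanism concrete before the formal model, a hedged toy simulation of the claimed bias: an asker who stops approaching a helper after the first “no” (asymmetric feedback) ends up with a systematically lower estimate of others' willingness to help than an asker who observes a fixed number of outcomes. All parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
p_yes = 0.7          # true probability that a helper complies
helpers = 20000      # potential helpers encountered
max_asks = 20        # cap on interactions per helper

adaptive_est, fixed_est = [], []
for _ in range(helpers):
    outcomes = rng.random(max_asks) < p_yes
    # Adaptive sampling: stop asking this person after the first refusal.
    first_no = np.argmax(~outcomes) if not outcomes.all() else max_asks - 1
    sampled = outcomes[: first_no + 1]
    adaptive_est.append(sampled.mean())
    # Symmetric feedback: always observe all max_asks outcomes.
    fixed_est.append(outcomes.mean())

print(f"true p(yes) = {p_yes}")
print(f"mean estimate, adaptive sampling: {np.mean(adaptive_est):.3f}")  # biased low
print(f"mean estimate, fixed sampling:    {np.mean(fixed_est):.3f}")     # ~unbiased
```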

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Dried blood spot (DBS) sampling has gained popularity in the bioanalytical community as an alternative to conventional plasma sampling, as it provides numerous benefits in terms of sample collection and logistics. The aim of this work was to show that these advantages can be coupled with a simple and cost-effective sample pretreatment, followed by rapid LC-MS/MS analysis, for the quantitation of 15 benzodiazepines, six metabolites, and three Z-drugs. For this purpose, a simplified offline procedure was developed that consisted of letting a 5-µl DBS infuse directly into 100 µl of MeOH in a conventional LC vial. RESULTS: The parameters related to the DBS pretreatment, such as extraction time and internal standard addition, were investigated and optimized, demonstrating that passive infusion in a regular LC vial was sufficient to quantitatively extract the analytes of interest. The method was validated according to international criteria over the therapeutic concentration ranges of the selected compounds. CONCLUSION: The presented strategy proved efficient for the rapid analysis of the selected drugs. Indeed, the offline sample preparation was reduced to a minimum, using a small amount of organic solvent and consumables, without affecting the accuracy of the method. This approach thus enables simple and rapid DBS analysis, even with a non-DBS-dedicated autosampler, while lowering costs and environmental impact.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To examine the effects of the world's most challenging mountain ultra-marathon (Tor des Géants® 2012) on the energy cost of three types of locomotion (cycling, level running, and uphill running) and on running kinematics. METHODS: Before (pre-) and immediately after (post-) the competition, a group of ten male experienced ultra-marathon runners performed, in random order, three submaximal 4-min exercise trials: cycling at a power of 1.5 W·kg⁻¹ body mass; level running at 9 km·h⁻¹; and uphill running at 6 km·h⁻¹ at an inclination of +15% on a motorized treadmill. Two video cameras recorded running mechanics at different sampling rates. RESULTS: Between pre- and post-, the uphill-running energy cost decreased by 13.8% (P = 0.004); no change was noted in the energy cost of level running or cycling (NS). There was an increase in contact time (+10.3%, P = 0.019) and duty factor (+8.1%, P = 0.001) and a decrease in swing time (-6.4%, P = 0.008) in the uphill-running condition. CONCLUSION: After this extreme mountain ultra-marathon, the subjects modified only their uphill-running patterns, toward a more economical step mechanics.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality-control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of the plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of two cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a university hospital. INCLUSION CRITERIA: receiving intravenous micronutrient supplements and/or having a TE sampling during the ICU stay. TE samplings were triggered by a computerized reminder in 2006, versus guided by nutritionists in 2009. RESULTS: During the two periods, 636 patients out of 2406 consecutive admissions met the inclusion criteria, representing 29.7% and 24.9%, respectively, of the periods' admissions. The 2009 patients had higher SAPS II scores (p = 0.02) and lower BMI (p = 0.007) compared with 2006. The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of that cohort, resulting in a 55% cost reduction. CONCLUSIONS: Monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This control led to a cost reduction compared with an automated sampling prescription.

Relevance:

30.00%

Publisher:

Abstract:

Background: An important challenge in conducting social research of specific relevance to harm reduction programs is locating hidden populations of consumers of substances like cannabis who typically report few adverse or unwanted consequences of their use. Much of the deviant, pathologized perception of drug users is historically derived from, and empirically supported by, a research emphasis on gaining ready access to users in drug treatment or in prison populations, which have a higher incidence of problems of dependence and misuse. Because they are less visible, responsible recreational users of illicit drugs have been more difficult to study. Methods: This article investigates respondent-driven sampling (RDS) as a method of recruiting experienced marijuana users representative of users in the general population. Based on sampling conducted in a multi-city study (Halifax, Montreal, Toronto, and Vancouver), and on comparison with samples gathered using other research methods, we assess the strengths and weaknesses of RDS recruitment as a means of gaining access to illicit substance users who experience few harmful consequences of their use. Demographic characteristics of the Toronto sample are compared with those of users in a recent household survey and in a pilot study of Toronto, the latter of which relied on nonrandom self-selection of respondents. Results: A modified approach to RDS was necessary to attain the target sample size in all four cities (i.e., 40 'users' from each site). The final sample in Toronto was nevertheless largely similar to the marijuana users in a random household survey carried out in the same city. Whereas well-educated, married, white, and female respondents were all somewhat overrepresented in the survey, the two samples were, overall, more alike than different with respect to economic status and employment. Furthermore, comparison with a self-selected sample suggests that (even modified) RDS recruitment is a cost-effective way of gathering respondents who are more representative of users in the general population than nonrandom methods of recruitment ordinarily produce. Conclusions: Research on marijuana use, and on other forms of drug use hidden in the general adult population, is important for informing and extending harm reduction beyond its current emphasis on 'at-risk' populations. Expanding harm reduction in a normalizing context, through innovative research on users who are often overlooked, further challenges assumptions about reducing harm through prohibition of drug use, and urges consideration of alternative policies such as decriminalization and legal regulation.

Relevance:

30.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence for every lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper.
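A minimal numerical sketch of the balanced case, under invented stage variances and lags (the actual survey designs and the REML code are in the paper): simulate a three-stage balanced nested design, estimate the variance components from the hierarchical ANOVA mean squares, and accumulate them from the shortest lag upward into a rough variogram:

```python
import numpy as np

rng = np.random.default_rng(0)
g, m, r = 9, 4, 3                  # stage sizes: groups, subgroups, replicates
lags = [500.0, 50.0, 5.0]          # separating distances (m), geometric progression
sig2 = [4.0, 2.0, 1.0]             # true variance components per stage (invented)

# y[i, j, k] = A_i + B_ij + E_ijk
A = rng.normal(0, np.sqrt(sig2[0]), (g, 1, 1))
B = rng.normal(0, np.sqrt(sig2[1]), (g, m, 1))
E = rng.normal(0, np.sqrt(sig2[2]), (g, m, r))
y = A + B + E

# Hierarchical ANOVA mean squares for the balanced design.
gbar = y.mean()
ibar = y.mean(axis=(1, 2))         # group means
ijbar = y.mean(axis=2)             # subgroup means
ms_a = m * r * ((ibar - gbar) ** 2).sum() / (g - 1)
ms_b = r * ((ijbar - ibar[:, None]) ** 2).sum() / (g * (m - 1))
ms_e = ((y - ijbar[:, :, None]) ** 2).sum() / (g * m * (r - 1))

# Method-of-moments variance components.
s2_e = ms_e
s2_b = (ms_b - ms_e) / r
s2_a = (ms_a - ms_b) / (m * r)

# Rough variogram: accumulate components starting from the shortest lag.
gamma = {lags[2]: s2_e, lags[1]: s2_e + s2_b, lags[0]: s2_e + s2_b + s2_a}
for lag in sorted(gamma):
    print(f"lag {lag:>6.1f} m: semivariance ~ {gamma[lag]:.2f}")
```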