916 results for Sampling (Statistics)
Abstract:
Nicotine in smoky indoor air can be determined using graphitized carbon black as a solid sorbent in quartz tubes. The temperature stability, high purity, and heat-absorption characteristics of the sorbent, together with the permeability of the quartz tubes to microwaves, enable thermal desorption by microwaves after active sampling. Permeation and dynamic dilution procedures for generating nicotine in the vapor phase at low and high concentrations are used to evaluate the performance of the sampler. Tube preparation is described and the microwave desorption temperature is measured. The breakthrough volume is determined to allow sampling at 0.1-1 L/min for defined periods of time. The procedure is tested for the determination of gas- and particulate-phase nicotine in sidestream smoke produced in an experimental chamber.
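The constraint that the breakthrough volume places on sampling time can be sketched as a back-of-the-envelope calculation; the safety factor and the helper name below are illustrative assumptions, not values from the study:

```python
# The sampled air volume (flow x time) must stay below the sorbent's
# breakthrough volume, or analyte starts escaping the tube. The 2/3
# safety factor here is a common rule of thumb, assumed for illustration.
def max_sampling_minutes(breakthrough_volume_l, flow_l_per_min, safety_factor=0.67):
    """Longest sampling time keeping the sampled volume safely below breakthrough."""
    return safety_factor * breakthrough_volume_l / flow_l_per_min
```

At the abstract's 0.1-1 L/min flow range, a tenfold lower flow simply buys a tenfold longer permissible sampling period for the same (hypothetical) breakthrough volume.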
Abstract:
Statistics has become an indispensable tool in biomedical research. Thanks in particular to computing, researchers have easy access to elementary "classical" procedures. These are often "confirmatory" in nature: their aim is to test hypotheses (for example, the efficacy of a treatment) formulated prior to experimentation. However, doctors often use them in situations more complex than foreseen, to discover interesting data structures and formulate hypotheses. This inverse process can lead to misuse, which inflates the number of "statistically proven" results in medical publications. The help of a professional statistician thus becomes necessary. Moreover, good, simple "exploratory" techniques are now available. In addition, medical data contain quite a high percentage of outliers (data that deviate from the majority). With classical methods it is often very difficult (even for a statistician!) to detect them, and the reliability of results becomes questionable. Reliable ("robust") procedures have been the subject of research for the past two decades. Their introduction into practice is one of the activities of the Statistics and Data Processing Department of the University of Social and Preventive Medicine, Lausanne.
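As a minimal illustration (not from the cited work) of why robust procedures matter, compare the classical mean with the median and the median absolute deviation (MAD) when a single gross outlier enters the data:

```python
# One gross outlier drags the mean far from the bulk of the data, while the
# median and the MAD (a robust analogue of the standard deviation) barely move.
import statistics

clean = [4.8, 5.1, 4.9, 5.0, 5.2, 4.7, 5.3]
with_outlier = clean + [50.0]  # e.g. a transcription error

def mad(xs):
    """Median absolute deviation from the median."""
    m = statistics.median(xs)
    return statistics.median([abs(x - m) for x in xs])

location_shift = statistics.mean(with_outlier) - statistics.mean(clean)   # large
robust_shift = statistics.median(with_outlier) - statistics.median(clean) # tiny
```

The classical estimate shifts by several units; the robust ones by a few hundredths, which is exactly the behaviour that makes outliers detectable as large deviations from a robust fit.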
Abstract:
Coraebus undatus is the main insect pest of cork oak worldwide. The larvae tunnel in the cortical cambium, filling the bark with galleries and causing the cork to break at harvest. The first objective of this study was to test the effectiveness of purple traps for attracting C. undatus, because this colour is attractive to other buprestid beetles. The second objective was to develop a diet on which field-collected larvae could be reared to adulthood. Pairs of purple and clear (control) sticky traps were placed in a cork oak forest in Girona, Spain, in the summer of 2008.
Abstract:
This paper aims at detecting spatio-temporal clustering in fire sequences using space-time scan statistics, a powerful statistical framework for the analysis of point processes. The methodology is applied to active fires in the state of Florida (US) identified by MODIS (Moderate Resolution Imaging Spectroradiometer) during the period 2003-06. Results show that statistically significant clusters can be detected and localized in specific areas and periods of the year. Three of the five most likely clusters detected for the entire study period are localized in the north of the state and cover forest areas; the other two cover a large zone in the south, corresponding to agricultural land and the prairies of the Everglades. To analyze whether wildfires recur each year during the same period, the analyses were performed separately for the four years: clusters of forest fires emerge as more frequent in the hot seasons (spring and summer), while in the southern areas they are present throughout the whole year. The recognition of event overdensities and the ability to locate them in space and time can help support fire management and focus prevention measures.
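The abstract does not spell out the statistic itself; a standard choice for space-time scan analyses is Kulldorff's Poisson log-likelihood ratio for a candidate space-time cylinder, sketched here assuming expected counts are scaled to sum to the observed total (indirect standardization):

```python
# Kulldorff-style Poisson log-likelihood ratio for one candidate cylinder.
# Assumes expected counts (mu) are standardized so they sum to `total`.
import math

def poisson_llr(n, mu, total):
    """Score a candidate space-time cylinder against the null of no clustering.

    n:     observed events inside the cylinder
    mu:    expected events inside it under the null
    total: all observed events in the study region and period
    Only excesses (n > mu) count as candidate clusters.
    """
    if n <= mu:
        return 0.0
    return n * math.log(n / mu) + (total - n) * math.log((total - n) / (total - mu))
```

In a full scan this ratio is maximized over all candidate cylinders (locations, radii, and time windows), and significance of the most likely cluster is assessed by Monte Carlo replication.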
Abstract:
BACKGROUND: The need to contextualise wastewater-based figures on illicit drug consumption by comparing them with other indicators has been stressed by numerous studies. The objective of the present study was to further investigate the possibility of combining wastewater data with conventional statistics to assess the reliability of the former method and obtain a more balanced picture of illicit drug consumption in the investigated area. METHODS: Wastewater samples were collected between October 2013 and July 2014 in the metropolitan area of Lausanne (226,000 inhabitants), Switzerland. Loads of methadone, its metabolite 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine (EDDP), the exclusive metabolite of heroin, 6-monoacetylmorphine (6-MAM), and morphine were used to estimate the amounts of methadone and heroin consumed. RESULTS: Methadone consumption estimated from EDDP was in agreement with expectations. Heroin estimates based on 6-MAM loads were inconsistent. Estimates obtained from morphine loads, combined with prescription/sales data, agreed with figures derived from syringe distribution data and general population surveys. CONCLUSIONS: The results obtained for methadone made it possible to assess the reliability of the selected sampling strategy, supporting its ability to capture the consumption of a small cohort (i.e., 743 patients). Using morphine as a marker, in combination with prescription/sales data, estimates in accordance with other indicators of heroin use were obtained. Combining different sources of data strengthened the results and suggested that the different indicators (i.e., administration route, average dosage and number of consumers) combine to depict a realistic representation of the phenomenon in the investigated area. Heroin consumption was estimated at approximately 13 g day(-1) (118 g day(-1) at street level).
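The back-calculation behind such estimates can be sketched as follows; the excretion fraction and molar masses below are illustrative approximations, not the study's actual parameters:

```python
# Standard wastewater-based epidemiology back-calculation: scale the measured
# metabolite load by the excreted fraction and by the metabolite-to-parent
# molar-mass ratio to recover parent-drug consumption.
def consumed_grams_per_day(load_g_per_day, excretion_fraction,
                           mw_parent, mw_metabolite):
    """Back-calculate parent-drug consumption from a measured metabolite load.

    load_g_per_day:     metabolite mass in 24-h composite samples (g/day)
    excretion_fraction: fraction of a dose excreted as this metabolite (assumed)
    mw_parent, mw_metabolite: molar masses for the metabolite-to-parent correction
    """
    return load_g_per_day / excretion_fraction * (mw_parent / mw_metabolite)

# E.g. a hypothetical 1 g/day EDDP load, assuming ~25% of a methadone dose is
# excreted as EDDP and molar masses of ~309.4 (methadone) and ~277.4 (EDDP):
methadone_estimate = consumed_grams_per_day(1.0, 0.25, 309.4, 277.4)
```

Dividing the result by the catchment population then gives the per-capita figure usually reported alongside conventional indicators.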
Abstract:
Decision situations are often characterized by uncertainty: we do not know the values of the different options on all attributes and have to rely on information stored in memory to decide. Several strategies have been proposed to describe how people make inferences based on knowledge used as cues. The present research shows how the declarative memory of ACT-R models could be populated from internet statistics. This makes it possible to simulate the performance of decision strategies operating on declarative knowledge based on occurrences and co-occurrences of objects and cues in the environment.
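A minimal sketch of the idea, using hypothetical web hit counts (not data from the paper) as environmental frequencies and log frequency as a stand-in for ACT-R base-level activation:

```python
# Populating ACT-R-style declarative memory from web frequency statistics:
# base-level activation is approximated by log frequency, and an inference
# strategy retrieves the chunk with the highest activation.
import math
import random

# Hypothetical hit counts standing in for occurrences in the environment.
hits = {"Berlin": 2_100_000, "Heidelberg": 310_000, "Herne": 42_000}

def activation(obj, noise_sd=0.0):
    """Base-level activation ~ log environmental frequency, plus optional
    Gaussian retrieval noise."""
    return math.log(hits[obj]) + random.gauss(0.0, noise_sd)

def infer_larger(a, b):
    """Cue-based inference: choose the object whose chunk is more active."""
    return a if activation(a) > activation(b) else b
```

With noise enabled, repeated runs reproduce the graded accuracy of frequency-based strategies rather than a deterministic choice.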
Abstract:
The objective of this work was to combine the advantages of the dried blood spot (DBS) sampling process with the highly sensitive and selective negative-ion chemical ionization tandem mass spectrometry (NICI-MS-MS) to analyze for recent antidepressants including fluoxetine, norfluoxetine, reboxetine, and paroxetine from micro whole blood samples (i.e., 10 microL). Before analysis, DBS samples were punched out, and antidepressants were simultaneously extracted and derivatized in a single step by use of pentafluoropropionic acid anhydride and 0.02% triethylamine in butyl chloride for 30 min at 60 degrees C under ultrasonication. Derivatives were then separated on a gas chromatograph coupled with a triple-quadrupole mass spectrometer operating in negative selected reaction monitoring mode for a total run time of 5 min. To establish the validity of the method, trueness, precision, and selectivity were determined on the basis of the guidelines of the "Société Française des Sciences et des Techniques Pharmaceutiques" (SFSTP). The assay was found to be linear in the concentration ranges 1 to 500 ng mL(-1) for fluoxetine and norfluoxetine and 20 to 500 ng mL(-1) for reboxetine and paroxetine. Despite the small sampling volume, the limit of detection was estimated at 20 pg mL(-1) for all the analytes. The stability of DBS was also evaluated at -20 degrees C, 4 degrees C, 25 degrees C, and 40 degrees C for up to 30 days. Furthermore, the method was successfully applied to a pharmacokinetic investigation performed on a healthy volunteer after oral administration of a single 40-mg dose of fluoxetine. Thus, this validated DBS method combines an extractive-derivative single step with a fast and sensitive GC-NICI-MS-MS technique. Using microliter blood samples, this procedure offers a patient-friendly tool in many biomedical fields such as checking treatment adherence, therapeutic drug monitoring, toxicological analyses, or pharmacokinetic studies.
Abstract:
BACKGROUND & AIMS: Trace elements (TE) are involved in the immune and antioxidant defences, which are of particular importance during critical illness. Determining plasma TE levels is costly. The present quality-control study aimed at assessing the economic impact of computer-reminded blood sampling versus risk-guided, on-demand monitoring of plasma concentrations of selenium, copper, and zinc. METHODS: Retrospective analysis of 2 cohorts of patients admitted during 6-month periods in 2006 and 2009 to the ICU of a university hospital. INCLUSION CRITERIA: receiving intravenous micronutrient supplements and/or having a TE sampling during the ICU stay. TE samplings were triggered by computerized reminder in 2006 versus guided by nutritionists in 2009. RESULTS: During the 2 periods, 636 patients met the inclusion criteria out of 2406 consecutive admissions, representing 29.7% and 24.9%, respectively, of the periods' admissions. The 2009 patients had higher SAPS2 scores (p = 0.02) and lower BMI (p = 0.007) compared to 2006. The number of laboratory determinations was drastically reduced in 2009, particularly during the first week, despite the higher severity of the cohort, resulting in a 55% cost reduction. CONCLUSIONS: Monitoring of TE concentrations guided by a nutritionist reduced the sampling frequency and targeted the sickest, high-risk patients requiring adaptation of the nutritional prescription. This control led to a cost reduction compared with an automated sampling prescription.
Abstract:
This article presents the fusion of a stochastic metaheuristic, Simulated Annealing (SA), with classical convergence criteria for Blind Separation of Sources (BSS). Although BSS by means of various techniques, including ICA, PCA, and neural networks, has been amply discussed in the literature, to date the possibility of using simulated annealing algorithms has not been seriously explored. Drawing on experimental results, this paper demonstrates the possible benefits offered by SA in combination with higher-order statistical and mutual-information criteria for BSS, such as robustness against local minima and a high degree of flexibility in the energy function.
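A toy sketch of the approach under simplifying assumptions (two uniform sources, orthogonal mixing so the data are already white, and a kurtosis-based higher-order contrast as the energy function); this is an illustration of the technique, not the authors' implementation:

```python
# SA-driven BSS on a two-channel mixture: with unit-variance sources and an
# orthogonal (rotation) mixing matrix, unmixing reduces to finding a single
# rotation angle that minimizes a kurtosis-based contrast.
import math
import random

random.seed(1)

N = 2000
s1 = [random.uniform(-1.0, 1.0) for _ in range(N)]  # sub-Gaussian sources
s2 = [random.uniform(-1.0, 1.0) for _ in range(N)]

phi = 0.7  # "unknown" mixing angle
x1 = [math.cos(phi) * a - math.sin(phi) * b for a, b in zip(s1, s2)]
x2 = [math.sin(phi) * a + math.cos(phi) * b for a, b in zip(s1, s2)]

def kurtosis(y):
    """Excess kurtosis; far from 0 for non-Gaussian (separated) signals."""
    m = sum(y) / len(y)
    v = sum((t - m) ** 2 for t in y) / len(y)
    return sum((t - m) ** 4 for t in y) / (len(y) * v * v) - 3.0

def energy(theta):
    """Negative summed squared kurtoses of the candidate unmixed pair."""
    c, s = math.cos(theta), math.sin(theta)
    y1 = [c * a + s * b for a, b in zip(x1, x2)]
    y2 = [-s * a + c * b for a, b in zip(x1, x2)]
    return -(kurtosis(y1) ** 2 + kurtosis(y2) ** 2)

# Plain simulated annealing over the rotation angle.
theta = random.uniform(0.0, math.pi / 2)
e, T = energy(theta), 1.0
for _ in range(250):
    cand = theta + random.gauss(0.0, 0.1)
    ec = energy(cand)
    if ec < e or random.random() < math.exp((e - ec) / T):
        theta, e = cand, ec
    T *= 0.98

# Deterministic local polish around the annealed angle.
theta = min((theta + 0.02 * k for k in range(-40, 41)), key=energy)
```

Because of the usual BSS permutation/sign ambiguity, the recovered angle matches the mixing angle only modulo pi/2; the early high-temperature phase is what provides the robustness against local minima highlighted in the abstract.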
Abstract:
Conference organized by the Escola Politècnica Superior, Universitat de Vic, in collaboration with the Servei d'Estadística of the Universitat Autònoma de Barcelona and CosmoCaixa Barcelona. Held 18-22 June 2012 in Barcelona.