990 results for sampling methodology


Relevance:

100.00%

Publisher:

Abstract:

Plasma etch is a key process in modern semiconductor manufacturing facilities as it offers process simplification and yet tighter dimensional tolerances compared to wet chemical etch technology. The main challenge of operating plasma etchers is to maintain a consistent etch rate spatially and temporally for a given wafer and for successive wafers processed in the same etch tool. Etch rate measurements require expensive metrology steps and therefore, in general, only limited sampling is performed. Furthermore, the results of measurements are not accessible in real time, limiting the options for run-to-run control. This paper investigates a Virtual Metrology (VM) enabled Dynamic Sampling (DS) methodology as an alternative paradigm for balancing the need to reduce costly metrology with the need to measure more frequently and in a timely fashion to enable wafer-to-wafer control. Using a Gaussian Process Regression (GPR) VM model for etch rate estimation of a plasma etch process, the proposed dynamic sampling methodology is demonstrated and evaluated for a number of different predictive dynamic sampling rules. © 2013 IEEE.
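
As a concrete illustration of how such a VM-enabled dynamic sampling scheme can work, the sketch below pairs a GPR etch-rate model with an uncertainty-triggered metrology request. It is a minimal sketch, not the paper's implementation: the feature set, the synthetic training data, and the uncertainty-threshold rule (sigma_max) are all illustrative assumptions.

```python
# Minimal sketch of VM-enabled dynamic sampling (not the authors' implementation):
# a Gaussian Process Regression model estimates etch rate from process/sensor data,
# and a wafer is sent to metrology only when predictive uncertainty exceeds a threshold.
# Feature names, threshold value and training data are illustrative assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
X_measured = rng.normal(size=(40, 5))          # process/sensor features of measured wafers
y_measured = X_measured @ rng.normal(size=5)   # measured etch rates (placeholder data)

gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gpr.fit(X_measured, y_measured)

def needs_metrology(x_new, sigma_max=0.5):
    """Dynamic sampling rule: request a real measurement when the VM
    prediction is too uncertain (assumed uncertainty-threshold rule)."""
    mean, std = gpr.predict(x_new.reshape(1, -1), return_std=True)
    return std[0] > sigma_max, mean[0], std[0]

sample, eta_hat, eta_std = needs_metrology(rng.normal(size=5))
print(f"VM etch-rate estimate = {eta_hat:.2f} +/- {eta_std:.2f}, measure = {sample}")
```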

Relevance:

100.00%

Publisher:

Abstract:

Indoor and ambient air organic pollutants have been gaining attention because they have been measured at levels with possible health effects. Studies have shown that most airborne polychlorinated biphenyls (PCBs), pesticides and many polycyclic aromatic hydrocarbons (PAHs) are present in the free vapor state. The purpose of this research was to extend recent investigative work with polyurethane foam (PUF) as a collection medium for semivolatile compounds. Open-porous flexible PUFs with different chemical makeup and physical properties were evaluated as to their collection affinities/efficiencies for various classes of compounds and the degree of sample recovery. Filtered air samples were pulled through plugs of PUF spiked with various semivolatiles under different simulated environmental conditions (temperature and humidity) and sampling parameters (flow rate and sample volume) in order to measure their effects on the sample breakthrough volume (V_B). PUF was also evaluated in the passive mode using organophosphorus pesticides. Another major goal was to improve the overall analytical methodology; PUF is inexpensive, easy to handle in the field and has excellent airflow characteristics (low pressure drop). It was confirmed that the PUF collection apparatus behaves as if it were a gas-solid chromatographic system, in that V_B was related to temperature and sample volume. Breakthrough volumes were essentially the same using both polyether- and polyester-type PUF. Also, little change was observed in V_B after coating PUF with common chromatographic liquid phases. Open-cell (reticulated) foams gave better recoveries than closed-cell foams. There was a slight increase in V_B with an increase in the number of cells/pores per inch. The high-density polyester PUF was found to be an excellent passive and active collection adsorbent. Good recoveries could be obtained using just solvent elution. A gas chromatograph equipped with a photoionization detector gave excellent sensitivities and selectivities for the various classes of compounds investigated.

Relevance:

100.00%

Publisher:

Abstract:

"February 1990."

Relevance:

70.00%

Publisher:

Abstract:

Lack of time to implement pharmaceutical care has been cited as a barrier to the routine provision of this extended patient-care service. Using self-reported work sampling methodology, this study investigated how community pharmacists utilise their time. Pharmacists working in community pharmacies in the Greater Belfast area were found to spend approximately 49% of their time engaged in professional activities, 29% in semi-professional activities and 22% involved in non-professional activities. The activity to which pharmacists devoted the majority of their time was product assembly and labelling, this being a task which can be performed by trained technical staff. Only 9.5% of community pharmacists' time was devoted to counselling patients on their prescription medicines. Wide variation in the amount of time apportioned to each activity was observed between the participating community pharmacists (n=30). Staffing levels within the community pharmacy were found to significantly influence pharmacists' involvement in a number of activities, with pharmacists who worked in pharmacies employing multiple pharmacists devoting more time to the assembly and labelling of products and less time to administrative tasks, non-professional encounters and to miscellaneous professional activities. Pharmacists working in pharmacies with a high prescription turnover were found to devote significantly less time to counselling patients regarding OTC products and in responding to patient symptoms.
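
Because work sampling estimates activity proportions from a finite number of observation instances, the precision of figures such as the 49%/29%/22% split depends on how many observations are taken. The sketch below shows the standard work-sampling sample-size calculation; the confidence level and target precision are illustrative choices, not values taken from this study.

```python
# Illustrative work-sampling calculation (not part of the study): number of random
# observations needed to estimate an activity proportion p within +/- e at a given
# confidence level, n = z^2 * p * (1 - p) / e^2.
from math import ceil
from statistics import NormalDist

def observations_needed(p, e=0.05, confidence=0.95):
    z = NormalDist().inv_cdf(0.5 + confidence / 2)   # two-sided critical value
    return ceil(z ** 2 * p * (1 - p) / e ** 2)

# e.g. estimating a proportion around 0.49 (professional activities) within +/- 5 points
print(observations_needed(0.49))   # roughly 385 observation instances
```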

Relevance:

70.00%

Publisher:

Abstract:

One of the fundamental machine learning tasks is predictive classification. Given that organisations collect an ever-increasing amount of data, predictive classification methods must be able to handle large amounts of data effectively and efficiently. However, present requirements push existing algorithms to, and sometimes beyond, their limits, since many classification prediction algorithms were designed when currently common data set sizes were beyond imagination. This has led to a significant amount of research into ways of making classification learning algorithms more effective and efficient. Although substantial progress has been made, a number of key questions remain unanswered. This dissertation investigates two of them.

The first is whether algorithms of a different type to those currently employed are required when using large data sets. This is answered by analysing how the bias-plus-variance decomposition of predictive classification error changes as training set size is increased. Experiments find that larger training sets require different types of algorithms to those currently used. Some insight into the characteristics of suitable algorithms is provided, which may give direction for the development of future classification prediction algorithms specifically designed for use with large data sets.

The second question is the role of sampling in machine learning with large data sets. Sampling has long been used to avoid the need to scale up algorithms to suit the size of the data set, by scaling down the size of the data set to suit the algorithm. However, the costs of performing sampling have not been widely explored. Two popular sampling methods are compared with learning from all available data in terms of predictive accuracy, model complexity, and execution time. The comparison shows that sub-sampling generally produces models with accuracy close to, and sometimes greater than, that obtainable from learning with all available data. This suggests that it may be possible to develop algorithms that take advantage of sub-sampling to reduce the time required to infer a model while sacrificing little, if any, accuracy. Methods of improving effective and efficient learning via sampling are also investigated, and new sampling methodologies are proposed. These include using a varying proportion of instances to determine the next inference step, and using a statistical calculation at each inference step to determine a sufficient sample size. Experiments show that a statistical calculation of sample size can substantially reduce execution time with only a small loss, and occasional gain, in accuracy.

One of the common uses of sampling is in the construction of learning curves, which are often used to determine the optimal training set size that maximally reduces execution time without being detrimental to accuracy. The performance of methods for detecting convergence of learning curves is analysed, focusing on methods that calculate the gradient of the tangent to the curve. Given that such methods can be susceptible to local accuracy plateaus, the frequency of local plateaus is also investigated. It is shown that local accuracy plateaus are a common occurrence, and that ensuring a small loss of accuracy often results in greater computational cost than learning from all available data. These results cast doubt on the applicability of gradient-of-tangent methods for detecting convergence, and on the viability of learning curves for reducing execution time in general.
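
A minimal sketch of the two ideas discussed above, progressive sub-sampling and gradient-of-tangent convergence detection on a learning curve, is given below. The classifier, the step schedule and the convergence tolerance are illustrative assumptions rather than the dissertation's actual choices; note how a flat gradient can also arise from a local accuracy plateau.

```python
# Minimal sketch (not the dissertation's code) of progressive sub-sampling with a
# gradient-of-tangent convergence check on the learning curve.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

sizes, scores = [], []
for n in range(500, len(X_train) + 1, 500):       # geometric schedules are also common
    clf = GaussianNB().fit(X_train[:n], y_train[:n])
    sizes.append(n)
    scores.append(clf.score(X_test, y_test))
    if len(scores) >= 3:
        # slope of the tangent over the last two curve points; a near-zero slope is
        # taken as convergence, which can be fooled by a local accuracy plateau
        slope = (scores[-1] - scores[-2]) / (sizes[-1] - sizes[-2])
        if abs(slope) < 1e-6:
            break

print(f"stopped at n={sizes[-1]} with accuracy={scores[-1]:.3f}")
```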

Relevance:

70.00%

Publisher:

Abstract:

Gambling prevalence studies are typically conducted within a single (landline) telephone sampling frame. This practice continues despite emerging evidence that significant differences exist between landline and mobile (cell) phone-only households. This study utilised a dual-frame (landline and mobile) telephone sampling methodology to shed light on the extent of differences across groups of respondents with respect to demographic, health, and gambling characteristics.

A total of 2,014 participants from across Australian states and 
territories ranging in age from 18 to 96 years participated. Interviews were conducted using computer-assisted telephone interviewing technology, with 1,012 respondents from the landline sampling frame and 1,002 from the mobile phone sampling frame completing a questionnaire about gambling and other health behaviours. Responses across the landline sampling frame, the mobile phone sampling frame, and the subset of the mobile phone sampling frame that possessed a mobile phone only (MPO) were contrasted.

The findings 
revealed that although respondents in the landline sample (62.7 %) did not significantly differ from respondents in the mobile phone sample (59.2 %) in gambling participation in the previous 12 months, they were significantly more likely to have gambled in the previous 12 months than the MPO sample (56.4 %). There were no significant differences in internet gambling participation over the previous 12 months in the landline sample (4.7 %), mobile phone sample (4.7 %) and the MPO sample (5.0 %). However, endorsement of lifetime problem gambling on the NODS-CLiP was significantly higher within the mobile sample (10.7 %) and the MPO sample (14.8 %) than the landline sample (6.6 %).

Our research 
supports previous findings that reliance on a traditional landline telephone sampling approach effectively excludes distinct subgroups of the population from being represented in research findings. Consequently, we suggest that research best practice necessitates the use of a dual-frame sampling methodology. Despite inherent logistical and cost issues, this approach needs to become the norm in gambling survey research.

Relevance:

70.00%

Publisher:

Abstract:

Despite the growing popularity of experience sampling methodology (ESM) for evaluations of state-based components of body image, there have been concerns that frequent repeated measurement might encourage problematic responding, resulting in low data quantity and/or quality. Using a sample of 105 women (mean age = 24.84), this study used multilevel modelling to investigate (a) whether there were changes in compliance or response variability across a 7-day period, and (b) whether such changes are explained by participant characteristics. Present findings suggest that the demands of the ESM protocol undermine the quantity more than the quality of the obtained data. The decline in procedural compliance across the testing period correlated with BMI and body shame, whereas reduced variability in state-based assessments did not adversely impact the strength of association between state body satisfaction ratings and other variables in the dataset. The authors make several recommendations for ensuring the quality of ESM-based data in future studies.

Relevance:

70.00%

Publisher:

Abstract:

IBAMar (http://www.ba.ieo.es/ibamar) is a regional database that brings together all physical and biochemical data obtained by multiparametric probes (CTDs equipped with different sensors) during the cruises managed by the Balearic Center of the Spanish Institute of Oceanography (COB-IEO). It has recently been extended to include data obtained with classical hydro casts using oceanographic Niskin or Nansen bottles. The result is a database with a main core of hydrographic data: temperature (T), salinity (S), dissolved oxygen (DO), fluorescence and turbidity; complemented by biochemical data: dissolved inorganic nutrients (phosphate, nitrate, nitrite and silicate) and chlorophyll-a. Different technologies and methodologies were used by different teams over the four decades of data sampling at the COB-IEO. Despite this, data have been reprocessed using the same protocols, and a standard QC has been applied to each variable, so IBAMar provides a regional database of homogeneous, good-quality data. Data acquisition and quality control (QC): 94% of the data come from SBE911 and SBE25 CTDs. S and DO were calibrated on board using water samples whenever a rosette was available (70% of the cases). All data from Sea-Bird CTDs were reviewed and post-processed with the software provided by Sea-Bird Electronics, and averaged to a 1 dbar vertical resolution. The general sampling methodology and pre-processing are described at https://ibamardatabase.wordpress.com/home/. Manual QC includes visual checks of metadata, duplicate data and outliers. Automatic QC includes range checks of variables by area (north of the Balearic Islands, south of the Balearic Islands, and the Alboran Sea) and depth (27 standard levels), checks for spikes, and checks for density inversions. Nutrient QC includes a preliminary control and a range check on the observed level of the data to detect outliers around objectively analyzed data fields. A quality flag is assigned as an integer number, depending on the result of each QC check.
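
To make the automatic QC step more concrete, the sketch below applies the three checks named above (regional range, spikes, density inversions) to a toy 1 dbar profile and assigns integer quality flags. The thresholds, the flag convention (1 = good, 4 = bad) and the regional range are illustrative assumptions, not the COB-IEO protocol.

```python
# Minimal sketch of the kind of automatic QC described above, not the IBAMar code.
import numpy as np

def qc_profile(temp, density, valid_range=(10.0, 30.0), spike_tol=0.5):
    """Assign integer QC flags (assumed convention: 1 = good, 4 = bad) to one profile,
    ordered from the surface downwards."""
    flags = np.ones(len(temp), dtype=int)
    # regional range check (illustrative limits in degrees C)
    flags[(temp < valid_range[0]) | (temp > valid_range[1])] = 4
    # spike check: point deviating strongly from the mean of its two neighbours
    spike = np.abs(temp[1:-1] - 0.5 * (temp[:-2] + temp[2:])) > spike_tol
    flags[1:-1][spike] = 4
    # density inversion check: density should not decrease with depth
    flags[1:][np.diff(density) < 0] = 4
    return flags

temp = np.array([24.0, 23.9, 23.8, 29.9, 23.6, 23.5, 23.4, 23.3, 23.2, 23.1])
density = np.array([1025.0, 1025.1, 1025.2, 1025.3, 1025.2, 1025.5,
                    1025.6, 1025.7, 1025.8, 1025.9])
print(qc_profile(temp, density))   # output: [1 1 4 4 4 1 1 1 1 1]
```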

Relevance:

70.00%

Publisher:

Abstract:

Fundación Ciudad de la Energía (CIUDEN) is carrying out a project of geological storage of CO2, in which CO2 injection tests are planned in saline aquifers at a depth of 1500 m for scientific objectives and project demonstration. Before any CO2 is stored, it is necessary to determine the baseline flux of CO2 in order to detect potential leakage during injection and post-injection monitoring. In November 2009, diffuse flux measurements of CO2 using an accumulation chamber were made in the area selected by CIUDEN for geological storage, located in Hontomin, province of Burgos (Spain). This paper presents the tests carried out in order to establish the optimum sampling methodology, and the geostatistical analyses performed to determine the range with which future field campaigns will be planned.
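
The abstract mentions a geostatistical analysis to determine the range used for planning future campaigns; a minimal sketch of that kind of calculation (an empirical semivariogram and a crude range estimate) is given below. The synthetic chamber positions and flux values, the lag bins and the "95% of sill" range rule are illustrative assumptions, not the paper's data or method.

```python
# Illustrative sketch of estimating the variogram range from accumulation-chamber
# CO2 flux measurements; all data here are synthetic.
import numpy as np

def empirical_semivariogram(coords, values, lags):
    """Classical (Matheron) semivariogram estimator binned by separation distance."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = 0.5 * (values[:, None] - values[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (d > lo) & (d <= hi)
        gamma.append(sq[mask].mean() if mask.any() else np.nan)
    return np.asarray(gamma)

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(60, 2))                            # chamber positions (m)
flux = 2.0 + 0.02 * coords[:, 0] + rng.normal(scale=0.3, size=60)     # CO2 flux (synthetic)
lags = np.linspace(0, 50, 11)
gamma = empirical_semivariogram(coords, flux, lags)

# crude range estimate: first lag where the semivariogram reaches ~95% of its sill
sill = np.nanmax(gamma)
range_est = lags[1:][np.argmax(gamma >= 0.95 * sill)]
print("approximate variogram range (m):", range_est)
```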

Relevance:

70.00%

Publisher:

Abstract:

This paper, based on the outcome of discussions at a NORMAN Network-supported workshop in Lyon (France) in November 2014, aims to provide a common position of passive sampling community experts regarding the concrete actions required to foster the use of passive sampling techniques in support of contaminant risk assessment and management, and for routine monitoring of contaminants in aquatic systems. The brief roadmap presented here focusses on the identification of robust passive sampling methodology, the technology that requires further development or has yet to be developed, our current knowledge of the evaluation of uncertainties when calculating a freely dissolved concentration, and the relationship between passive sampling data and data obtained through biomonitoring. A tiered approach to identifying areas of potential environmental quality standard (EQS) exceedances is also shown. Finally, we propose a list of recommended actions to improve the acceptance of passive sampling by policy-makers. These include drafting guidelines and quality assurance and control procedures, developing demonstration projects where biomonitoring and passive sampling are undertaken alongside each other, organising proficiency testing schemes and interlaboratory comparisons and, finally, establishing passive sampler-based assessment criteria in relation to existing EQS.

Relevance:

60.00%

Publisher:

Abstract:

An automated gas sampling methodology has been used to estimate nitrous oxide (N2O) emissions from a heavy black clay soil in northern Australia, where split applications of urea were applied to furrow-irrigated cotton. Nitrous oxide emissions from the beds were 643 g N/ha over the 188-day measurement period (after planting), whilst N2O emissions from the furrows were significantly higher at 967 g N/ha. The DNDC model was used to develop a full-season simulation of N2O and N2 emissions. Seasonal N2O emissions were equivalent to 0.83% of the applied N, with total gaseous N losses (excluding NH3) estimated to be 16% of the applied N.