926 results for Mate sampling


Relevance:

20.00%

Publisher:

Abstract:

Objectives: This methodological paper reports on the development and validation of a work sampling instrument and data collection processes to conduct a national study of nurse practitioners’ work patterns. ---------- Design: Published work sampling instruments provided the basis for development and validation of a tool for use in a national study of nurse practitioner work activities across diverse contextual and clinical service models. Steps taken in the approach included design of a nurse practitioner-specific data collection tool and development of an innovative web-based program to train and establish inter-rater reliability of a team of data collectors who were geographically dispersed across metropolitan, rural and remote health care settings. ---------- Setting: The study is part of a large funded study into nurse practitioner service. The Australian Nurse Practitioner Study is a national study phased over three years and was designed to provide essential information for Australian health service planners, regulators and consumer groups on the profile, process and outcome of nurse practitioner service. ---------- Results: The outcome of this phase of the study is empirically tested instruments, processes and training materials for use in an international context by investigators interested in conducting a national study of nurse practitioner work practices. ---------- Conclusion: Development and preparation of a new approach to describing nurse practitioner practices using work sampling methods provides the groundwork for international collaboration in evaluation of nurse practitioner service.

Relevance:

20.00%

Publisher:

Abstract:

Aim: This paper is a report of a study of variations in the pattern of nurse practitioner work in a range of service fields and geographical locations, across direct patient care, indirect patient care and service-related activities. ---------- Background: The nurse practitioner role has been implemented internationally as a service reform model to improve the access and timeliness of health care. There is a substantial body of research into the nurse practitioner role and service outcomes, but scant information on the pattern of nurse practitioner work and how this is influenced by different service models. ---------- Methods: We used work sampling methods. Data were collected between July 2008 and January 2009. Observations were recorded from a random sample of 30 nurse practitioners at 10-minute intervals in 2-hour blocks randomly generated to cover two weeks of work time from a sampling frame of six weeks. ---------- Results: A total of 12,189 individual observations were conducted with nurse practitioners across Australia. Thirty individual activities were identified as describing nurse practitioner work, and these were distributed across three categories. Direct care accounted for 36.1% of nurse practitioners’ time, indirect care for 32.2% and service-related activities for 31.9%. ---------- Conclusion: These findings provide useful baseline data for evaluation of nurse practitioner positions and the service effect of these positions. However, the study also raises questions about the best use of nurse practitioner time and the influences of barriers to and facilitators of this model of service innovation.
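The observation design described here (randomly placed 2-hour blocks with observations every 10 minutes, drawn from a six-week frame) can be sketched in a few lines of Python. The function name and the even-hour grid used to keep blocks from overlapping are illustrative assumptions, not the study's actual randomisation procedure:

```python
import random

def sampling_schedule(n_blocks, frame_hours, seed=0):
    """Place n_blocks random, non-overlapping 2-hour observation blocks
    within a sampling frame, and list the 10-minute observation instants
    (minutes from the start of the frame) for each block."""
    rng = random.Random(seed)
    # drawing block starts from an even-hour grid keeps 2-hour blocks disjoint
    starts = sorted(rng.sample(range(0, frame_hours, 2), n_blocks))
    # one observation every 10 minutes across each 2-hour block -> 12 instants
    return [[s * 60 + m for m in range(0, 120, 10)] for s in starts]

# e.g. five blocks drawn from a six-week frame
schedule = sampling_schedule(5, 6 * 7 * 24, seed=1)
```

In a real study the frame would be restricted to rostered work hours rather than all clock hours, as the abstract's "two weeks of work time" implies.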

Relevance:

20.00%

Publisher:

Abstract:

Ocean processes are dynamic, complex, and occur on multiple spatial and temporal scales. To obtain a synoptic view of such processes, ocean scientists collect data over long time periods. Historically, measurements were continually provided by fixed sensors, e.g., moorings, or gathered from ships. Recently, an increase in the utilization of autonomous underwater vehicles has enabled a more dynamic data acquisition approach. However, we still do not utilize the full capabilities of these vehicles. Here we present algorithms that produce persistent monitoring missions for underwater vehicles by balancing path following accuracy and sampling resolution for a given region of interest, which addresses a pressing need among ocean scientists to efficiently and effectively collect high-value data. More specifically, this paper proposes a path planning algorithm and a speed control algorithm for underwater gliders, which together give informative trajectories for the glider to persistently monitor a patch of ocean. We optimize a cost function that blends two competing factors: maximize the information value along the path, while minimizing deviation from the planned path due to ocean currents. Speed is controlled along the planned path by adjusting the pitch angle of the underwater glider, so that higher resolution samples are collected in areas of higher information value. The resulting paths are closed circuits that can be repeatedly traversed to collect long-term ocean data in dynamic environments. The algorithms were tested during sea trials on an underwater glider operating off the coast of southern California, as well as in Monterey Bay, California. The experimental results show significant improvements in data resolution and path reliability compared to previously executed sampling paths used in the respective regions.
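The blended objective the abstract describes, maximising information value along the path while penalising current-induced deviation, can be sketched as a simple weighted sum. The linear weighting `alpha` and the toy candidate circuits below are illustrative assumptions, not the authors' actual cost function:

```python
def path_cost(info_values, deviations, alpha=0.5):
    """Blended objective: reward information gathered along the path,
    penalise expected deviation due to ocean currents. Lower is better."""
    return -alpha * sum(info_values) + (1 - alpha) * sum(deviations)

# hypothetical candidate closed circuits:
# (information value per waypoint, expected deviation per waypoint)
candidates = {
    "A": ([3.0, 2.5, 4.0], [0.2, 0.3, 0.1]),
    "B": ([1.0, 1.5, 1.2], [0.05, 0.1, 0.05]),
}
best = min(candidates, key=lambda k: path_cost(*candidates[k]))
```

The speed-control step would then slow the glider (via pitch angle) at waypoints with high information value, so more samples land where the field is most informative.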

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a method of spatial sampling based on stratification by Local Moran’s I_i calculated using auxiliary information. The sampling technique is compared to other design-based approaches, including simple random sampling, systematic sampling on a regular grid, conditional Latin Hypercube sampling and stratified sampling based on auxiliary information, and is illustrated using two different spatial data sets. Each of the samples for the two data sets is interpolated using regression kriging to form a geostatistical map for its respective area. The proposed technique is shown to be competitive in reproducing specific areas of interest with high accuracy.
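One common formulation of the local Moran statistic that could drive such a stratification is I_i = z_i · Σ_j w_ij z_j over standardised values z with a spatial weights matrix w. The function and the toy chain-adjacency weights below are a minimal sketch of that formula, not the paper's implementation:

```python
import statistics

def local_morans_i(values, weights):
    """Local Moran's I_i for each site: I_i = z_i * sum_j w_ij * z_j,
    where z are standardised values and weights is a row-wise spatial
    weights matrix. Large positive I_i flags a site inside a cluster
    of similar values; negative I_i flags a spatial outlier."""
    mean = statistics.fmean(values)
    sd = statistics.pvariance(values) ** 0.5
    z = [(v - mean) / sd for v in values]
    n = len(values)
    return [z[i] * sum(weights[i][j] * z[j] for j in range(n) if j != i)
            for i in range(n)]
```

Sites could then be binned by their I_i values (cluster cores, outliers, background) to form the strata from which the sample is drawn.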

Relevance:

20.00%

Publisher:

Abstract:

This paper reports the feasibility and methodological considerations of using the Short Message System Experience Sampling (SMS-ES) Method, an experience sampling research method developed to assist researchers to collect repeated measures of consumers’ affective experiences. The method combines SMS with web-based technology in a simple yet effective way. It is described using a practical implementation study that collected consumers’ emotions in response to using mobile phones in everyday situations. The method is further evaluated in terms of the quality of data collected in the study, as well as against the methodological considerations for experience sampling studies. These two evaluations suggest that the SMS-ES Method is a valid and reliable approach for collecting consumers’ affective experiences. Moreover, the method can be applied across a range of for-profit and not-for-profit contexts where researchers want to capture repeated measures of consumers’ affective experiences occurring over a period of time. The benefits of the method are discussed to assist researchers who wish to apply the SMS-ES Method in their own research designs.

Relevance:

20.00%

Publisher:

Abstract:

Purpose of review: This review provides an overview of the importance of characterising and considering insect distribution information for designing stored commodity sampling protocols. Findings: Sampling protocols are influenced by a number of factors including government regulations, management practices, new technology and current perceptions of the status of insect pest damage. The spatial distribution of insects in stored commodities influences the efficiency of sampling protocols; these can vary in response to season, treatment and other factors. It is important to use sampling designs based on robust statistics suitable for the purpose. Future research: The development of sampling protocols based on flexible, robust statistics allows for accuracy across a range of spatial distributions. Additionally, power can be added to sampling protocols through the integration of external information such as treatment history and climate. Bayesian analysis provides a coherent and well-understood means to achieve this.
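As a hedged illustration of folding external information such as treatment history into a sampling protocol, a conjugate beta-binomial update is one simple Bayesian device; the prior values, counts and function name below are hypothetical, not drawn from the review:

```python
def posterior_infestation(prior_a, prior_b, infested, clean):
    """Conjugate beta-binomial update: a Beta(prior_a, prior_b) belief
    about the per-unit infestation rate (e.g. informed by treatment
    history and climate) is combined with counts from a new sample."""
    a, b = prior_a + infested, prior_b + clean
    return a, b, a / (a + b)  # posterior parameters and posterior mean rate
```

A storage facility with a strong recent treatment history would start from a prior concentrated near zero, so fewer samples are needed to reach a given confidence than under a flat prior.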

Relevance:

20.00%

Publisher:

Abstract:

Vernier acuity, a form of visual hyperacuity, is amongst the most precise forms of spatial vision. Under optimal conditions, Vernier thresholds are much finer than the inter-photoreceptor distance. Achievement of such high precision is based substantially on cortical computations, most likely in the primary visual cortex. Using stimuli with added positional noise, we show that Vernier processing is reduced with advancing age across a wide range of noise levels. Using an ideal observer model, we are able to characterize the mechanisms underlying the age-related loss, and show that the reduction in Vernier acuity can be attributed mainly to a reduction in sampling efficiency, with no significant change in the level of internal position noise, or spatial distortion, in the visual system.

Relevance:

20.00%

Publisher:

Abstract:

Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, time intervals for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determining sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to any nonlinear mixed effects model, although our work focuses on an application to population pharmacokinetic models.
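A rough sketch of the general idea, assuming a symmetric-proposal Metropolis sampler whose target density is proportional to a user-supplied design-efficiency function; this is not the authors' algorithm, and the Gaussian efficiency surface in the usage example is purely illustrative:

```python
import math
import random

def sample_window(t_opt, efficiency, n=5000, step=0.5, seed=0):
    """Metropolis sketch: random-walk draws of sampling times with target
    density proportional to a design-efficiency function; a central 90%
    interval of the draws is reported as the sampling window."""
    rng = random.Random(seed)
    t, draws = t_opt, []
    for _ in range(n):
        cand = t + rng.gauss(0, step)            # symmetric proposal
        if rng.random() * efficiency(t) <= efficiency(cand):
            t = cand                             # accept w.p. eff(cand)/eff(t)
        draws.append(t)
    draws.sort()
    return draws[int(0.05 * n)], draws[int(0.95 * n)]

# purely illustrative efficiency surface: Gaussian around an optimal time of 2 h
def eff(t):
    return math.exp(-((t - 2.0) ** 2) / (2 * 0.3 ** 2))

window = sample_window(2.0, eff)
```

Windows produced this way are "time-sensitive" in the abstract's sense: they widen where efficiency falls off slowly around the optimal point and tighten where it falls off sharply.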

Relevance:

20.00%

Publisher:

Abstract:

Effective, statistically robust sampling and surveillance strategies form an integral component of large agricultural industries such as the grains industry. Intensive in-storage sampling is essential for pest detection, Integrated Pest Management (IPM), determining grain quality and satisfying importing nations’ biosecurity concerns, while surveillance over broad geographic regions ensures that biosecurity risks can be excluded, monitored, eradicated or contained within an area. In the grains industry, a number of qualitative and quantitative methodologies for surveillance and in-storage sampling have been considered. Primarily, research has focussed on developing statistical methodologies for in-storage sampling strategies concentrating on detection of pest insects within a grain bulk; however, the need for effective and statistically defensible surveillance strategies has also been recognised. Interestingly, although surveillance and in-storage sampling have typically been considered independently, many techniques and concepts are common to the two fields of research. This review aims to consider the development of statistically based in-storage sampling and surveillance strategies and to identify methods that may be useful for both surveillance and in-storage sampling. We discuss the utility of new quantitative and qualitative approaches, such as Bayesian statistics, fault trees and more traditional probabilistic methods, and show how these methods may be used in both surveillance and in-storage sampling systems.
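One of the traditional probabilistic methods alluded to here is the detection-probability calculation P(detect) = 1 − (1 − p)^n for n independent samples at infestation prevalence p; the helper below simply inverts it for a required confidence (the function name is illustrative):

```python
import math

def samples_for_detection(prevalence, confidence=0.95):
    """Smallest n with P(detect) = 1 - (1 - p)**n >= confidence, i.e. the
    number of independent samples needed to find at least one infested
    unit when a fraction `prevalence` of units is infested."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - prevalence))
```

For example, detecting a 1% infestation with 95% confidence requires 299 independent samples, which illustrates why intensive in-storage sampling is needed for low-prevalence pest detection.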

Relevance:

20.00%

Publisher:

Abstract:

Acoustic sensors provide an effective means of monitoring biodiversity at large spatial and temporal scales. They can continuously and passively record large volumes of data over extended periods; however, these data must be analysed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced users can produce accurate results; however, the time and effort required to process even small volumes of data can make manual analysis prohibitive. Our research examined the use of sampling methods to reduce the cost of analysing large volumes of acoustic sensor data, while retaining high levels of species detection accuracy. Utilising five days of manually analysed acoustic sensor data from four sites, we examined a range of sampling rates and methods, including random, stratified and biologically informed sampling. Our findings indicate that randomly selecting 120 one-minute samples from the three hours immediately following dawn provided the most effective sampling method. On average, this method detected 62% of total species once the 120 one-minute samples were analysed, compared to 34% of total species from traditional point counts. Our results demonstrate that targeted sampling methods can provide an effective means for analysing large volumes of acoustic sensor data efficiently and accurately.
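The winning design, randomly selecting 120 one-minute samples from the three hours after dawn, is straightforward to reproduce; the function name and minute-based indexing below are illustrative assumptions rather than the study's code:

```python
import random

def dawn_sample(dawn_minute, n=120, window_minutes=180, seed=0):
    """Randomly choose n distinct one-minute samples (by minute index
    within the recording day) from the three-hour window immediately
    following dawn."""
    rng = random.Random(seed)
    window = range(dawn_minute, dawn_minute + window_minutes)
    return sorted(rng.sample(window, n))  # sampling without replacement
```

Each selected minute index would then be mapped to the corresponding one-minute audio segment for manual or automated species identification.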