916 results for Sheet-metal work - Simulation methods


Relevance: 100.00%

Abstract:

Aim: This paper reports a study of variations in the pattern of nurse practitioner work across a range of service fields and geographical locations, covering direct patient care, indirect patient care and service-related activities.

Background: The nurse practitioner role has been implemented internationally as a service reform model to improve the access and timeliness of health care. There is a substantial body of research into the nurse practitioner role and service outcomes, but scant information on the pattern of nurse practitioner work and how this is influenced by different service models.

Methods: We used work sampling methods. Data were collected between July 2008 and January 2009. Observations were recorded from a random sample of 30 nurse practitioners at 10-minute intervals in 2-hour blocks, randomly generated to cover two weeks of work time from a sampling frame of six weeks.

Results: A total of 12,189 individual observations were conducted with nurse practitioners across Australia. Thirty individual activities were identified as describing nurse practitioner work, distributed across three categories. Direct care accounted for 36.1% of nurse practitioners' time, indirect care for 32.2% and service-related activities for 31.9%.

Conclusion: These findings provide useful baseline data for evaluating nurse practitioner positions and their service effect. However, the study also raises questions about the best use of nurse practitioner time and about the barriers to and facilitators of this model of service innovation.
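
As a small illustration of the sampling design described above, here is a minimal Python sketch of how such a schedule might be generated: random 2-hour blocks drawn from a six-week frame, each yielding observation points at 10-minute intervals. The 5-day week, the 08:00-17:00 shift window and all names are assumptions made for illustration, not details taken from the study.

    import random
    from datetime import datetime, timedelta

    def sampling_schedule(frame_start, frame_weeks=6, covered_weeks=2, seed=42):
        """Draw random 2-hour blocks from the sampling frame and emit
        observation time points at 10-minute intervals within each block."""
        rng = random.Random(seed)
        n_blocks = covered_weeks * 5 * 8 // 2     # assumed 5-day weeks, 8-h days
        points = []
        for _ in range(n_blocks):
            day = rng.randrange(frame_weeks * 5)  # random working day in frame
            week, weekday = divmod(day, 5)
            hour = rng.randrange(8, 16)           # block fits an 08:00-17:00 shift
            start = frame_start + timedelta(weeks=week, days=weekday, hours=hour)
            points += [start + timedelta(minutes=10 * k) for k in range(12)]
        return sorted(points)

    schedule = sampling_schedule(datetime(2008, 7, 7))
    print(len(schedule), "observation points; first:", schedule[0])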

Relevance: 100.00%

Abstract:

Background: Non-fatal health outcomes from diseases and injuries are a crucial consideration in the promotion and monitoring of individual and population health. The Global Burden of Disease (GBD) studies done in 1990 and 2000 have been the only studies to quantify non-fatal health outcomes across an exhaustive set of disorders at the global and regional level. Neither effort quantified uncertainty in prevalence or years lived with disability (YLDs).

Methods: Of the 291 diseases and injuries in the GBD cause list, 289 cause disability. For 1160 sequelae of the 289 diseases and injuries, we undertook a systematic analysis of prevalence, incidence, remission, duration, and excess mortality. Sources included published studies, case notification, population-based cancer registries, other disease registries, antenatal clinic serosurveillance, hospital discharge data, ambulatory care data, household surveys, other surveys, and cohort studies. For most sequelae, we used a Bayesian meta-regression method, DisMod-MR, designed to address key limitations in descriptive epidemiological data, including missing data, inconsistency, and large methodological variation between data sources. For some disorders, we used natural history models, geospatial models, back-calculation models (models calculating incidence from population mortality rates and case fatality), or registration completeness models (models adjusting for incomplete registration with health-system access and other covariates). Disability weights for 220 unique health states were used to capture the severity of health loss. YLDs by cause at age, sex, country, and year levels were adjusted for comorbidity with simulation methods. We included uncertainty estimates at all stages of the analysis.

Findings: Global prevalence for all ages combined in 2010 across the 1160 sequelae ranged from fewer than one case per 1 million people to 350,000 cases per 1 million people. Prevalence and severity of health loss were weakly correlated (correlation coefficient −0.37). In 2010, there were 777 million YLDs from all causes, up from 583 million in 1990. The main contributors to global YLDs were mental and behavioural disorders, musculoskeletal disorders, and diabetes or endocrine diseases. The leading specific causes of YLDs were much the same in 2010 as they were in 1990: low back pain, major depressive disorder, iron-deficiency anaemia, neck pain, chronic obstructive pulmonary disease, anxiety disorders, migraine, diabetes, and falls. Age-specific prevalence of YLDs increased with age in all regions and has decreased slightly from 1990 to 2010. Regional patterns of the leading causes of YLDs were more similar than those of years of life lost due to premature mortality. Neglected tropical diseases, HIV/AIDS, tuberculosis, malaria, and anaemia were important causes of YLDs in sub-Saharan Africa.

Interpretation: Rates of YLDs per 100,000 people have remained largely constant over time but rise steadily with age. Population growth and ageing have increased YLD numbers and crude rates over the past two decades. Prevalences of the most common causes of YLDs, such as mental and behavioural disorders and musculoskeletal disorders, have not decreased. Health systems will need to address the needs of the rising numbers of individuals with a range of disorders that largely cause disability but not mortality. Quantification of the burden of non-fatal health outcomes will be crucial to understand how well health systems are responding to these challenges. Effective and affordable strategies to deal with this rising burden are an urgent priority for health systems in most parts of the world.

Funding: Bill & Melinda Gates Foundation.
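
The comorbidity adjustment mentioned above ("adjusted for comorbidity with simulation methods") can be illustrated with a toy microsimulation: individuals are assigned conditions independently at their prevalences, and co-occurring disability weights are combined multiplicatively. This is only a sketch of the general idea, not the study's pipeline; the prevalences and disability weights below are invented.

    import random

    conditions = {"low back pain": (0.10, 0.035),   # (prevalence, disability weight)
                  "depression":    (0.05, 0.145),
                  "anaemia":       (0.15, 0.004)}

    def simulate(n=100_000, seed=1):
        rng = random.Random(seed)
        attributed = {c: 0.0 for c in conditions}
        for _ in range(n):
            present = [c for c, (p, _) in conditions.items() if rng.random() < p]
            if not present:
                continue
            combined = 1.0
            for c in present:
                combined *= 1.0 - conditions[c][1]
            combined = 1.0 - combined              # total DW for this individual
            total = sum(conditions[c][1] for c in present)
            for c in present:                      # split the combined DW pro rata
                attributed[c] += combined * conditions[c][1] / total
        return {c: dw / n for c, dw in attributed.items()}  # per-capita YLD rates

    print(simulate())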

Relevance: 100.00%

Abstract:

Standard Monte Carlo (sMC) simulation models have been widely used in AEC industry research to address system uncertainties. Although the benefits of probabilistic simulation analyses over deterministic methods are well documented, the sMC technique is quite sensitive to the probability distributions of the input variables. This sensitivity becomes highly pronounced when the region of interest within the joint probability distribution (a function of the input variables) is small. In such cases, the standard Monte Carlo approach is often impractical from a computational standpoint. In this paper, a comparative analysis of standard Monte Carlo simulation and Markov Chain Monte Carlo with subset simulation (MCMC/ss) is presented. The MCMC/ss technique is a more complex simulation method (relative to sMC), in which a structured sampling algorithm is employed in place of completely randomized sampling; consequently, gains in computational efficiency can be made. The two simulation methods are compared via theoretical case studies.
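
To make the contrast concrete, the following minimal Python sketch compares an sMC estimator with a subset-simulation estimator in the spirit of MCMC/ss (in the style of Au and Beck's algorithm). The limit-state function g, the conditional level of 0.1 and the sample sizes are invented for illustration and are not the paper's case studies.

    import math, random

    DIM, LEVEL_P, N = 2, 0.1, 1000   # dimension, conditional level, samples per level

    def g(x):                        # toy limit state: "failure" when g(x) > 4.0
        return sum(x)

    def standard_mc(threshold=4.0, n=100_000, seed=0):
        rng = random.Random(seed)
        hits = sum(g([rng.gauss(0, 1) for _ in range(DIM)]) > threshold
                   for _ in range(n))
        return hits / n

    def subset_simulation(threshold=4.0, seed=0):
        rng = random.Random(seed)
        xs = [[rng.gauss(0, 1) for _ in range(DIM)] for _ in range(N)]
        p = 1.0
        while True:
            xs.sort(key=g, reverse=True)
            n_keep = int(LEVEL_P * N)
            level = g(xs[n_keep - 1])            # next intermediate threshold
            if level >= threshold:               # final level reached: finish
                return p * sum(g(x) > threshold for x in xs) / N
            p *= LEVEL_P
            new = []
            for s in xs[:n_keep]:                # one Markov chain per seed,
                x = list(s)                      # conditioned on g(x) > level
                for _ in range(N // n_keep):
                    cand = [xi + rng.gauss(0, 1) for xi in x]
                    for i in range(DIM):         # componentwise Metropolis accept
                        ratio = math.exp(-0.5 * (cand[i] ** 2 - x[i] ** 2))
                        if rng.random() >= min(1.0, ratio):
                            cand[i] = x[i]
                    if g(cand) > level:          # reject moves leaving the region
                        x = cand
                    new.append(list(x))
            xs = new

    print("sMC estimate:    ", standard_mc())
    print("MCMC/ss estimate:", subset_simulation())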

Relevance: 100.00%

Abstract:

In this paper, I present a number of leading examples in the empirical literature that use simulation-based estimation methods. For each example, I describe the model, why simulation is needed, and how to simulate the relevant object. There is a section on simulation methods and another on simulation-based estimation methods. The paper concludes by considering the significance of each of the examples discussed and commenting on potential future areas of interest.
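
For flavour, here is a minimal sketch of one well-known simulation-based estimator, the method of simulated moments, applied to a toy lognormal model. The model, the single matched moment and the grid search are assumptions made for illustration, not one of the paper's examples.

    import math, random, statistics

    def simulate(mu, shocks):                 # toy model: y = exp(mu + eps)
        return [math.exp(mu + e) for e in shocks]

    # "Observed" data, generated at a known parameter purely for the demo
    rng = random.Random(1)
    observed = simulate(0.5, [rng.gauss(0, 1) for _ in range(2000)])
    target = statistics.mean(observed)        # the moment to be matched

    def msm_estimate(target, n_sim=20_000, seed=2):
        rng = random.Random(seed)
        # common random numbers: the same shocks are reused for every candidate
        # parameter, which keeps the simulated moment smooth in mu
        shocks = [rng.gauss(0, 1) for _ in range(n_sim)]
        best_mu, best_gap = None, float("inf")
        for mu in (m / 100 for m in range(101)):       # grid search on [0, 1]
            gap = abs(statistics.mean(simulate(mu, shocks)) - target)
            if gap < best_gap:
                best_mu, best_gap = mu, gap
        return best_mu

    print("true mu: 0.5   MSM estimate:", msm_estimate(target))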

Relevance: 100.00%

Abstract:

Aim: Evaluation or assessment of competence is an important step in ensuring the safety and efficacy of health professionals, including dietitians. Most competency-based assessment studies focus on valid and reliable methods of assessment for the preparation of entry-level dietitians; few papers have explored student dietitians' perceptions of these evaluations. This study aimed to explore the perceptions of recent graduates of accredited nutrition and dietetics training programs in Australia, and to establish the relevance of competency-based assessment in adequately preparing them for entry-level work roles.

Methods: A purposive sample of newly graduated dietitians with a range of assessment experiences and varied employment areas was recruited. A qualitative approach, using in-depth interviews with 13 graduates with differing assessment experiences, was undertaken. Graduates were asked to reflect on their competency-based assessment experiences as students. Data were thematically analysed by multiple authors.

Results: Four themes emerged from the data analysis: (i) transparency and consistency are critical elements of work-based competency assessment; (ii) students are willing to take greater responsibility in their assessment process; (iii) work-based competency assessment prepares students for employment; and (iv) the relationship between students and their assessors can affect the student experience and assessment performance.

Conclusions: Understanding this unique student perspective can improve the evaluation of future health professionals and assist in designing valid competency-based assessment approaches.

Relevance: 100.00%

Abstract:

While the indirect and direct costs of occupational musculoskeletal disorders (MSD) place a significant burden on the health system, lower back pain (LBP) is associated with a significant portion of MSD. In Australia, the highest prevalence of MSD is found among health care workers, such as nurses. The digital human model (DHM) Siemens JACK was used to investigate whether hospital bed pushing, a simple task and hazard commonly associated with LBP, can be simulated and ergonomically assessed in a virtual environment. It was found that while JACK implements a range of common physical work assessment methods, the simulation of dynamic bed pushing remains a challenge due to the complex interface between the floor and wheels, which can only be insufficiently modelled.

Relevance: 100.00%

Abstract:

In this paper, the issue of finding uncertainty intervals for queries in a Bayesian Network is reconsidered. The investigation focuses on Bayesian Nets with discrete nodes and finite populations. An earlier asymptotic approach is compared with a simulation-based approach, together with two further alternatives: one based on a single sample of the Bayesian Net for a particular finite population size, and another which uses expected population sizes together with exact probabilities. We conclude that a query of a Bayesian Net should be expressed as a probability embedded in an uncertainty interval. Based on an investigation of two Bayesian Net structures, the preferred method is the simulation method. However, both the single-sample and expected-sample-size methods may be useful, and they are simpler to compute. Any method at all is more useful than none when assessing a Bayesian Net under development, or when drawing conclusions from an 'expert' system.
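
A minimal sketch of the simulation method on a toy two-node net (A -> B, with assumed conditional probabilities; not one of the structures investigated in the paper): simulate many finite populations of a given size and report empirical quantiles of the queried probability.

    import random

    P_A = 0.3                                  # assumed CPTs for the toy net
    P_B_GIVEN_A = {True: 0.8, False: 0.1}

    def query_in_population(n, rng):
        """Simulate a finite population of size n; return sample P(A | B)."""
        n_b = n_ab = 0
        for _ in range(n):
            a = rng.random() < P_A
            b = rng.random() < P_B_GIVEN_A[a]
            n_b += b
            n_ab += a and b
        return n_ab / n_b if n_b else float("nan")

    def uncertainty_interval(n=500, reps=2000, seed=3):
        rng = random.Random(seed)
        qs = sorted(query_in_population(n, rng) for _ in range(reps))
        return qs[int(0.025 * reps)], qs[int(0.975 * reps)]   # central 95%

    low, high = uncertainty_interval()
    print(f"exact P(A|B) = 0.774; simulated 95% interval: ({low:.3f}, {high:.3f})")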

Relevance: 100.00%

Abstract:

Friction can strongly influence the quality of the finished product in certain manufacturing processes. Sheet metal forming is a particular case, where the friction between the hard die and the relatively soft work-piece can be extremely important. Under such conditions, the topography of the harder surface can influence the resistance to traction at the interface. This paper discusses the correlation between certain features of the surface topography and the coefficient of friction, based on experiments involving the sliding of a few soft metal pins against a harder material. A brief description of the experimental procedure and the analysis is presented. A hybrid parameter which encapsulates both the amplitude features and the relative packing of peaks is shown to correlate well with the coefficient of friction.
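
The abstract does not define the hybrid parameter, so the sketch below is purely illustrative: it combines an amplitude feature (RMS roughness Rq) with the relative packing of peaks (mean peak spacing Sm) into a hypothetical ratio Rq/Sm, evaluated on a synthetic profile.

    import math

    def rq(profile):                      # RMS amplitude about the mean line
        m = sum(profile) / len(profile)
        return math.sqrt(sum((z - m) ** 2 for z in profile) / len(profile))

    def mean_peak_spacing(profile, dx=1.0):
        peaks = [i for i in range(1, len(profile) - 1)
                 if profile[i - 1] < profile[i] > profile[i + 1]]
        gaps = [b - a for a, b in zip(peaks, peaks[1:])]
        return dx * sum(gaps) / len(gaps) if gaps else float("inf")

    def hybrid(profile, dx=1.0):          # higher, densely packed peaks -> larger
        return rq(profile) / mean_peak_spacing(profile, dx)

    profile = [math.sin(0.4 * i) + 0.2 * math.sin(2.3 * i) for i in range(400)]
    print(f"Rq={rq(profile):.3f}, Sm={mean_peak_spacing(profile):.2f}, "
          f"hybrid={hybrid(profile):.4f}")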

Relevance: 100.00%

Abstract:

Background: Coronary tortuosity (CT) is a common coronary angiographic finding. Whether CT leads to an apparent reduction in coronary pressure distal to the tortuous segment of the coronary artery is still unknown. The purpose of this study is to determine the impact of CT on coronary pressure distribution by numerical simulation.

Methods: 21 idealized models were created to investigate the influence of coronary tortuosity angle (CTA) and coronary tortuosity number (CTN) on coronary pressure distribution. A 2D incompressible Newtonian flow was assumed and the computational simulation was performed using the finite volume method. CTAs of 30°, 60°, 90° and 120° and CTNs of 0, 1, 2, 3, 4 and 5 were examined under both steady and pulsatile conditions, and the changes of outlet pressure and inlet velocity during the cardiac cycle were taken into account.

Results: Coronary pressure distribution was affected by both CTA and CTN. We found that the pressure drop between the start and the end of the CT segment decreased with CTA, as did the length of the CT segment. An increase in CTN resulted in an increase in the pressure drop.

Conclusions: Compared to no CT, CT can result in a greater decrease in coronary blood pressure, depending on the severity of tortuosity, and severe CT may cause myocardial ischemia.
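
The study itself relies on 2D finite-volume CFD. As a much cruder back-of-envelope illustration of one mechanism at play (a tortuous path is longer than its straight chord, and viscous pressure drop grows with path length), one can evaluate the Hagen-Poiseuille relation dP = 8*mu*L*Q/(pi*r^4). All values below, including the assumed 20% path lengthening per bend, are invented.

    import math

    mu = 3.5e-3        # blood viscosity, Pa*s (assumed)
    r = 1.5e-3         # vessel radius, m (assumed)
    q = 1.0e-6         # volumetric flow, m^3/s (assumed)
    chord = 0.05       # straight-line length of the segment, m (assumed)

    def poiseuille_dp(length):
        return 8 * mu * length * q / (math.pi * r ** 4)

    for ctn in range(6):                      # coronary tortuosity number 0..5
        # crude assumption: each bend lengthens the path by 20% of the chord
        path = chord * (1 + 0.2 * ctn)
        print(f"CTN={ctn}: path={path * 100:.1f} cm, dP={poiseuille_dp(path):.1f} Pa")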

Relevance: 100.00%

Abstract:

Predatory insects and spiders are key elements of integrated pest management (IPM) programmes in agricultural crops such as cotton. Management decisions in IPM programmes should be based on a reliable and efficient method for counting both predators and pests. Knowledge of the temporal constraints that influence sampling is required because arthropod abundance estimates are likely to vary over a growing season and within a day. Few studies have adequately quantified this effect for the beat sheet, a potentially important sampling method. We compared the commonly used methods of suction and visual sampling with the beat sheet, with reference to an absolute cage clamp method, for determining the abundance of various arthropod taxa over five weeks. Significantly more entomophagous arthropods were recorded using the beat sheet and cage clamp methods than using suction or visual sampling, and these differences became more pronounced as the plants grew. In a second trial, relative estimates of entomophagous and phytophagous arthropod abundance were made using beat sheet samples collected over a day. Beat sheet estimates of abundance varied significantly over a day for only eight of the 43 taxa examined. Beat sheet sampling is recommended for further studies of arthropod abundance in cotton, but researchers and pest management advisors should bear in mind time-of-season and time-of-day effects.

Relevance: 100.00%

Abstract:

In this note, we briefly survey some recent approaches to the approximation of the Bayes factor, as used in Bayesian hypothesis testing and in Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective.
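
As a pocket illustration of the first of these approaches, the sketch below estimates a marginal likelihood m(y) (the quantity whose ratio across two models is the Bayes factor) by importance sampling, on a toy conjugate model where the exact answer is available for comparison. The model and the proposal are assumptions made for illustration.

    import math, random

    def norm_pdf(x, mean, var):
        return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

    y = 1.3                                   # a single observation
    # toy model: y ~ N(theta, 1) with prior theta ~ N(0, 1),
    # so the exact marginal likelihood is m(y) = N(y; 0, 2)

    def evidence_by_importance_sampling(n=100_000, seed=4):
        rng = random.Random(seed)
        m_q, v_q = y / 2, 0.5                 # proposal: exact posterior N(y/2, 1/2)
        total = 0.0
        for _ in range(n):
            th = rng.gauss(m_q, math.sqrt(v_q))
            # importance weight = likelihood * prior / proposal
            total += norm_pdf(y, th, 1) * norm_pdf(th, 0, 1) / norm_pdf(th, m_q, v_q)
        return total / n

    print("importance-sampling estimate:", evidence_by_importance_sampling())
    print("exact marginal likelihood:   ", norm_pdf(y, 0, 2))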

Relevance: 100.00%

Abstract:

We investigate the extent to which individuals’ global motivation (self-determined and non-self-determined types) influences adjustment (anxiety, positive reappraisal) and engagement (intrinsic motivation, task performance) in reaction to changes to the level of work control available during a work simulation. Participants (N = 156) completed 2 trials of an inbox activity under conditions of low or high work control—with the ordering of these levels varied to create an increase, decrease, or no change in work control. In support of the hypotheses, results revealed that for more self-determined individuals, high work control led to the increased use of positive reappraisal. Follow-up moderated mediation analyses revealed that the increases in positive reappraisal observed for self-determined individuals in the conditions in which work control was high by Trial 2 consequently increased their intrinsic motivation toward the task. For more non-self-determined individuals, high work control (as well as changes in work control) led to elevated anxiety. Follow-up moderated mediation analyses revealed that the increases in anxiety observed for non-self-determined individuals in the high-to-high work control condition consequently reduced their task performance. It is concluded that adjustment to a demanding work task depends on a fit between individuals’ global motivation and the work control available, which has consequences for engagement with demanding work.

Relevance: 100.00%

Abstract:

In this work, separation methods were developed for the analysis of the anthropogenic transuranium elements plutonium, americium, curium and neptunium in environmental samples contaminated by global nuclear weapons testing and by the Chernobyl accident. The analytical methods used in this study are based on extraction chromatography. Highly variable atmospheric plutonium isotope concentrations and activity ratios were found both at Kurchatov (Kazakhstan), near the former Semipalatinsk test site, and at Sodankylä (Finland). The origin of the plutonium is almost impossible to identify at Kurchatov, since hundreds of nuclear tests were performed at the Semipalatinsk test site. In Sodankylä, plutonium in the surface air originated from nuclear weapons testing, conducted mostly by the USSR and the USA before the sampling year 1963. The variation in americium, curium and neptunium concentrations was similarly large in peat samples collected in southern and central Finland in 1986, immediately after the Chernobyl accident. The main source of transuranium contamination in the peats was global nuclear test fallout, although there are wide regional differences in the fraction of Chernobyl-derived activity (of the total activity) for americium, curium and neptunium.

Relevance: 100.00%

Abstract:

This work develops methods to account for shoot structure in models of coniferous canopy radiative transfer. Shoot structure, as it varies along the light gradient inside the canopy, affects the efficiency of light interception per unit needle area, foliage biomass, or foliage nitrogen. The clumping of needles in the shoot volume also causes a notable amount of multiple scattering of light within coniferous shoots. The effect of shoot structure on light interception is treated in the context of canopy-level photosynthesis and resource use models, and the phenomenon of within-shoot multiple scattering in the context of physical canopy reflectance models for remote sensing purposes.

Light interception: A method for estimating the amount of PAR (photosynthetically active radiation) intercepted by a conifer shoot is presented. The method combines modelling of the directional distribution of radiation above the canopy, fish-eye photographs taken at shoot locations to measure canopy gap fraction, and geometrical measurements of shoot orientation and structure. Data on light availability, shoot and needle structure and nitrogen content were collected from canopies of Pacific silver fir (Abies amabilis (Dougl.) Forbes) and Norway spruce (Picea abies (L.) Karst.). Shoot structure acclimated to the light gradient inside the canopy so that more shaded shoots had better light interception efficiency. The light interception efficiency of shoots varied about two-fold per needle area, about four-fold per needle dry mass, and about five-fold per nitrogen content. Comparison of fertilized and control stands of Norway spruce indicated that light interception efficiency is not greatly affected by fertilization.

Light scattering: The structure of coniferous shoots gives rise to multiple scattering of light between the needles of a shoot. Using geometric models of shoots, multiple scattering was studied by photon tracing simulations. Based on the simulation results, the dependence of the scattering coefficient of a shoot on the scattering coefficient of its needles is shown to follow a simple one-parameter model. The single parameter, termed the recollision probability, describes the degree of clumping of the needles in the shoot, is wavelength independent, and can be connected to previously used clumping indices. By using the recollision probability to correct for within-shoot multiple scattering, canopy radiative transfer models which have used leaves as basic elements can use shoots as basic elements, and can thus be applied to coniferous forests. Preliminary testing of this approach seems to explain, at least partially, why coniferous forests appear darker than broadleaved forests in satellite data.
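
The one-parameter model referred to above is commonly written, in terms of the recollision probability p, as w_shoot = (1 - p) * w_needle / (1 - p * w_needle); the short sketch below simply evaluates this form for illustrative values.

    def shoot_scattering(w_needle, p):
        """One-parameter recollision model: the closed form of the geometric
        series of successive within-shoot scattering events."""
        return (1 - p) * w_needle / (1 - p * w_needle)

    for p in (0.0, 0.3, 0.5):      # p = 0: no clumping, shoot behaves like a leaf
        print(f"p={p}: w_needle=0.90 -> w_shoot={shoot_scattering(0.9, p):.3f}")

Higher p (stronger clumping) lowers the effective shoot scattering coefficient, which is consistent with the closing observation that coniferous forests appear darker in satellite data.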

Relevance: 100.00%

Abstract:

Nucleation is the first step in a phase transition, in which small nuclei of the new phase start appearing in the metastable old phase, such as the appearance of small liquid clusters in a supersaturated vapor. Nucleation is important in various industrial and natural processes, including atmospheric new particle formation: between 20% and 80% of the atmospheric particle concentration is due to nucleation. These atmospheric aerosol particles have a significant effect on both climate and human health. Different simulation methods are often applied when studying phenomena that are difficult or even impossible to measure, or when trying to distinguish between the merits of various theoretical approaches. Such simulation methods include, among others, molecular dynamics and Monte Carlo simulations. In this work, molecular dynamics simulations of the homogeneous nucleation of Lennard-Jones argon have been performed; homogeneous means that the nucleation does not occur on a pre-existing surface. The simulations include runs where the starting configuration is a supersaturated vapor and the nucleation event is observed during the simulation (direct simulations), as well as simulations of a cluster in equilibrium with a surrounding vapor (indirect simulations). The latter type is a necessity when conditions prevent the occurrence of a nucleation event within a reasonable timeframe in the direct simulations. The effect of various temperature control schemes on the nucleation rate (the rate of appearance of clusters that are equally likely to grow to macroscopic size and to evaporate) was studied and found to be relatively small. The method used to extract the nucleation rate was also found to be of minor importance. The cluster sizes from direct and indirect simulations were used in conjunction with the nucleation theorem to calculate formation free energies for the clusters in the indirect simulations. The results agreed with density functional theory, but were higher than values from Monte Carlo simulations. The formation energies were also used to calculate the surface tension of the clusters. The sizes of the clusters in the direct and indirect simulations were compared, showing that the direct-simulation clusters have more atoms between the liquid-like core of the cluster and the surrounding vapor. Finally, the performance of various nucleation theories in predicting the simulated nucleation rates was investigated, and the results, among other things, highlighted once again the inadequacy of the classical nucleation theory that is commonly employed in nucleation studies.
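
As a sketch of how a nucleation rate can be extracted from a direct simulation, one common threshold-type approach (in the spirit of the Yasuoka-Matsumoto method) counts clusters above a size threshold in each frame and divides the slope of that count over time by the system volume. The Stillinger-style distance criterion, all cutoffs and the random stand-in frames below are assumptions for illustration; a real analysis would use MD trajectories.

    import math, random

    R_CUT = 1.5          # Stillinger bond distance (LJ units), assumed
    THRESHOLD = 5        # clusters larger than this count as nucleated, assumed

    def clusters(positions):
        """Cluster sizes via a distance criterion and union-find."""
        n, parent = len(positions), list(range(len(positions)))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]
                i = parent[i]
            return i
        for i in range(n):                   # union all bonded pairs
            for j in range(i + 1, n):
                if math.dist(positions[i], positions[j]) < R_CUT:
                    parent[find(i)] = find(j)
        sizes = {}
        for i in range(n):
            sizes[find(i)] = sizes.get(find(i), 0) + 1
        return list(sizes.values())

    def nucleation_rate(frames, dt, volume):
        counts = [sum(s > THRESHOLD for s in clusters(f)) for f in frames]
        t = [i * dt for i in range(len(counts))]
        tm, cm = sum(t) / len(t), sum(counts) / len(counts)
        # least-squares slope of cluster count vs. time, divided by volume
        slope = (sum((ti - tm) * (ci - cm) for ti, ci in zip(t, counts))
                 / sum((ti - tm) ** 2 for ti in t))
        return slope / volume

    rng = random.Random(5)                   # random stand-in for MD frames
    frames = [[(rng.uniform(0, 10), rng.uniform(0, 10), rng.uniform(0, 10))
               for _ in range(200)] for _ in range(20)]
    print("rate (LJ units):", nucleation_rate(frames, dt=0.1, volume=1000.0))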