651 results for "cost estimating testing"


Relevance: 20.00%

Abstract:

BACKGROUND The Traffic Psychology workgroup (WG2) is concerned with the social, behavioral, and perceptual aspects associated with use and non-use of bicycle helmets, in their various forms and under various cycling conditions. OBJECTIVES The objectives of WG2 are to (1) share current knowledge among the people already working in the field, (2) suggest new ideas for research on and evaluation of the design of bicycle helmets, and (3) discuss options for funding such research within the individual frameworks of the participants. Areas for research include: 3.1. The patterns of helmet use among different users: children, adults, and sports enthusiasts. 3.2. The use of helmets in different environments: rural roads, urban streets, and bike trails. 3.3. The concerns bicyclists have about their safety and the perceived impact of using helmets on comfort and convenience. 3.4. The benefit of helmets for enhancing visibility, and how variations in helmet design and colour affect visibility in daytime, nighttime, and dusk conditions. 3.5. The role of helmets in the acceptance of city-wide pickup-and-drop-off bicycle schemes. 3.6. The impact of helmets on the visual search behaviour of bicyclists.

Relevance: 20.00%

Abstract:

Objective National guidelines for the management of intermediate-risk patients with suspected acute coronary syndrome, in whom AMI has been excluded, advocate provocative testing to further risk-stratify these patients into low risk (negative testing) or high risk (positive testing suggestive of unstable angina). Adults under 40 years of age have a low pretest probability of acute coronary syndrome. We evaluated the utility of exercise stress testing in young adults with chest pain suspected of acute coronary syndrome who have National Heart Foundation intermediate-risk features. Methods A retrospective analysis of exercise stress tests performed on patients under 40 years was conducted. Patients were enrolled on a chest pain pathway and had negative serial ECGs and cardiac biomarkers before exercise stress testing to rule out acute coronary syndrome. Chart review was completed on patients with positive stress tests. Results A total of 3987 patients with suspected intermediate-risk acute coronary syndrome underwent exercise stress testing. One thousand and twenty-seven (25.8%) were aged under 40 years (mean age 33.3 ± 4.8 years). Four of these 1027 patients had a positive exercise stress test (0.4% incidence of positive exercise stress testing). Of those, three patients had subsequent non-invasive functional testing that yielded a negative result; one patient declined further investigation. Assuming the latter was a true positive exercise stress test, the incidence of true positive exercise stress testing would have been 0.097% (95% confidence interval: 0.079–0.115%) (one of 1027 patients). Conclusions Routine exercise stress testing has limited value in the risk stratification of adults under 40 years with suspected intermediate risk of acute coronary syndrome.

Relevance: 20.00%

Abstract:

RFID is an important technology for building a ubiquitous society, but an RFID system uses open radio-frequency signals to transfer information, which poses serious threats to privacy and security. In general, the computing and storage resources in an RFID tag are very limited, and this makes its security and privacy problems difficult to solve, especially for low-cost RFID tags. To ensure the security and privacy of low-cost RFID systems, we propose a lightweight authentication protocol based on a hash function. The protocol ensures forward security and prevents information leakage, location tracing, eavesdropping, replay attacks and spoofing. It completes strong authentication of the tag by the reader through two rounds of authentication, and it transfers only part of the encrypted tag identifier in each session, so it is difficult for an adversary to intercept the whole identifier of a tag. The protocol is simple, requires few computing and storage resources, and is well suited to low-cost RFID systems.
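The abstract does not give the protocol's message flow, but its two core ideas (hash-based challenge–response and revealing only a slice of the hashed identifier per session) can be sketched as below. The 4-byte slice width, the session-indexed offset, and the `Tag`/`Reader` classes are illustrative assumptions, not the authors' actual design:

```python
import hashlib
import os

def h(data: bytes) -> bytes:
    """Stand-in for the protocol's lightweight hash function."""
    return hashlib.sha256(data).digest()

class Tag:
    def __init__(self, identifier: bytes):
        self.identifier = identifier

    def respond(self, nonce: bytes, session: int) -> bytes:
        # Reveal only a 4-byte slice of the hashed identifier, with the slice
        # position varying by session, so no single exchange exposes the
        # whole (hashed) identifier.
        digest = h(self.identifier + nonce)
        start = (session * 4) % len(digest)
        return digest[start:start + 4]

class Reader:
    def __init__(self, known_identifiers):
        self.known = known_identifiers   # back-end database of tag identifiers

    def authenticate(self, tag: "Tag", session: int) -> bool:
        nonce = os.urandom(8)            # fresh challenge defeats replay
        reply = tag.respond(nonce, session)
        for ident in self.known:         # recompute the expected slice per tag
            digest = h(ident + nonce)
            start = (session * 4) % len(digest)
            if digest[start:start + 4] == reply:
                return True
        return False

tag = Tag(b"tag-0001")
reader = Reader([b"tag-0001", b"tag-0002"])
print(all(reader.authenticate(tag, s) for s in (0, 1)))  # → True (two rounds)
```

A genuine low-cost implementation would use a cheaper hash than SHA-256 and a server-side index to avoid the linear search; the point here is only that each session discloses a different small fragment of the digest.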

Relevance: 20.00%

Abstract:

The measurement of illicit drug metabolites in raw wastewater is increasingly being adopted as an approach to objectively monitor population-level drug use, and is an effective complement to traditional epidemiological methods. As such, it has been widely applied in western countries. In this study, we applied this approach to assess drug use patterns over nine days during April 2011 in Hong Kong. Raw wastewater samples were collected from the largest wastewater treatment plant, serving a community of approximately 3.5 million people, and analysed for excreted drug residues including cocaine, ketamine, methamphetamine, 3,4-methylenedioxymethamphetamine (MDMA) and key metabolites using liquid chromatography coupled with tandem mass spectrometry. The overall drug use pattern determined by wastewater analysis was consistent with that seen amongst people coming into contact with substance-use services: among our target drugs, ketamine (estimated consumption: 1400–1600 mg/day/1000 people) was the predominant drug, followed by methamphetamine (180–200 mg/day/1000 people), cocaine (160–180 mg/day/1000 people) and MDMA (not detected). The levels of these drugs were relatively steady throughout the monitoring period. Analysing samples at higher temporal resolution provided data on diurnal variations in drug residue loads. Unexpectedly elevated ratios of cocaine to benzoylecgonine were identified in three samples during the evening and night, providing evidence for potential dumping of cocaine. This study provides the first application of wastewater analysis to quantitatively evaluate daily drug use in an Asian metropolitan community. Our data reinforce the benefit of wastewater monitoring to health and law enforcement authorities for strategic planning and evaluation of drug intervention strategies.
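The per-capita figures quoted above come from the standard wastewater back-calculation: measured concentration times daily flow gives a load, which is corrected for metabolite excretion and normalized to population. A minimal sketch with purely hypothetical input values (not the study's measurements):

```python
def consumption_mg_per_day_per_1000(conc_ng_per_L, flow_L_per_day, population,
                                    excretion_fraction,
                                    mw_parent_over_metabolite=1.0):
    """Standard wastewater back-calculation; all inputs here are illustrative."""
    load_mg_per_day = conc_ng_per_L * flow_L_per_day / 1e6    # ng/L * L/day -> mg/day
    # Scale up to the amount of parent drug consumed, then normalize.
    parent_used = load_mg_per_day * mw_parent_over_metabolite / excretion_fraction
    return parent_used / (population / 1000.0)

# Hypothetical numbers (concentration, flow, excretion) chosen for illustration:
est = consumption_mg_per_day_per_1000(
    conc_ng_per_L=2500, flow_L_per_day=1.4e9, population=3.5e6,
    excretion_fraction=0.7)
print(round(est, 1))  # → 1428.6 mg/day/1000 people
```

For metabolites such as benzoylecgonine the molar-mass ratio of parent to metabolite would also be supplied via `mw_parent_over_metabolite`.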

Relevance: 20.00%

Abstract:

Objective To develop a child victimization survey with a diverse group of child protection experts and examine the performance of the instrument through a set of international pilot studies. Methods The initial draft of the instrument was developed with input from scientists and practitioners representing 40 countries. Volunteers from the larger group of scientists participating in the Delphi review of the ICAST P and R reviewed the ICAST C by email in two rounds, resulting in a final instrument. The ICAST C was then translated and back-translated into six languages and field tested in four countries using a convenience sample of 571 children 12–17 years of age, selected from schools and classrooms to which the investigators had easy access. Results The final ICAST C Home has 38 items and the ICAST C Institution has 44 items. These items serve as screeners; positive endorsements are followed by queries about frequency and perpetrator. Half of the respondents were boys (49%). Endorsement of the various forms of victimization ranged from 0 to 51%. Many children reported violence exposure (51%), physical victimization (55%), psychological victimization (66%), sexual victimization (18%), and neglect (37%) in their homes in the last year. High rates of physical victimization (57%), psychological victimization (59%), and sexual victimization (22%) were also reported in schools in the last year. Internal consistency was moderate to high (alpha between .685 and .855) and missing data low (less than 1.5% for all but one item). Conclusions In pilot testing, the ICAST C identified high rates of child victimization in all domains. Rates of missing data were low, and internal consistency was moderate to high. Pilot testing demonstrated the feasibility of using child self-report as one strategy to assess child victimization. Practice implications The ICAST C is a multi-national, multi-lingual, consensus-based survey instrument. It is available in six languages for international research estimating child victimization. Assessing the prevalence of child victimization is critical to understanding the scope of the problem, setting national and local priorities, and garnering support for program and policy development aimed at child protection.
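The internal-consistency figures quoted (alpha between .685 and .855) are Cronbach's alpha values, a statistic that is simple to reproduce. A minimal sketch on synthetic item scores; the data, item count, and factor loading below are made up, not ICAST C responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total score
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Synthetic scores: 200 respondents, 4 items driven by one latent factor.
rng = np.random.default_rng(0)
latent = rng.normal(size=200)
data = np.column_stack([latent + rng.normal(scale=0.8, size=200)
                        for _ in range(4)])
alpha = cronbach_alpha(data)
print(round(alpha, 2))  # roughly 0.85 for these simulation settings
```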

Relevance: 20.00%

Abstract:

Background: Alterations in energy expenditure during activity following head injury have not been investigated, due primarily to the difficulty of measurement. Objective: The aim of this study was to compare the energy expenditure during activity and the body composition of children following acquired brain injury (ABI) with data from a group of normal controls. Design: Energy expenditure was measured using the Cosmed K4b2 in a group of 15 children with ABI and a group of 67 normal children at rest and while walking and running. The mean number of steps taken per 3 min run was also recorded, and body composition was measured. Results: The energy expended during walking was not significantly different between the two groups. A significant difference was found between the two groups in the energy expended during running, and also in the number of steps taken, as children with ABI took significantly fewer steps than the normal controls during a 3 min run. Conclusions: Children with ABI expend more energy per activity than healthy controls when controlled for velocity or distance. However, they expend less energy than normal controls when walking and running at a freely chosen, comfortable pace. © 2003 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

Objective. To assess the cost-effectiveness of bone density screening programmes for osteoporosis. Study design. Using published and locally available data on fracture rates and treatment costs, the overall cost per fracture prevented, cost per quality-adjusted life year (QALY) saved and cost per year of life gained were estimated for different bone density screening and osteoporosis treatment programmes. Main outcome measures. Cost per fracture prevented, cost per QALY saved, and cost per year of life gained. Results. In women over the age of 50 years, the costs per fracture prevented of treating all women with hormone replacement therapy, or of treating only if osteoporosis is demonstrated on bone density screening, were £32,594 and £23,867 respectively. For alendronate therapy in the same groups, the costs were £171,067 and £14,067 respectively. Once the background rate of treatment with alendronate reaches 18%, bone density screening becomes cost-saving. Cost estimates per QALY saved ranged from £1,514 to £39,076 for osteoporosis treatment with alendronate following bone density screening. Conclusions. For relatively expensive medications such as alendronate, treatment programmes with prior bone density screening are far more cost-effective than those without, and in some circumstances become cost-saving. Costs per QALY saved and per year of life gained for osteoporosis treatment with prior bone density screening compare favourably with treatment of hypertension and hypercholesterolemia.
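The cost-per-fracture-prevented outcome rests on simple arithmetic: total programme cost divided by the expected number of fractures averted. A deliberately simplified sketch with hypothetical inputs; the study's actual model also nets out the treatment costs of averted fractures, so this is the shape of the calculation, not its figures:

```python
def cost_per_fracture_prevented(n_women, screening_cost, treat_fraction,
                                treatment_cost, baseline_fracture_risk,
                                relative_risk_reduction):
    """Illustrative cost-effectiveness arithmetic; all inputs are hypothetical."""
    # Everyone is screened; only those with demonstrated osteoporosis are treated.
    total_cost = (n_women * screening_cost
                  + n_women * treat_fraction * treatment_cost)
    fractures_prevented = (n_women * treat_fraction
                           * baseline_fracture_risk * relative_risk_reduction)
    return total_cost / fractures_prevented

# Hypothetical inputs, not the study's data:
print(round(cost_per_fracture_prevented(
    n_women=10_000, screening_cost=30, treat_fraction=0.2,
    treatment_cost=1_500, baseline_fracture_risk=0.15,
    relative_risk_reduction=0.5), 0))  # → 22000.0 (cost per fracture prevented)
```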

Relevance: 20.00%

Abstract:

This article describes a maximum likelihood method for estimating the parameters of the standard square-root stochastic volatility model and a variant of the model that includes jumps in equity prices. The model is fitted to data on the S&P 500 Index and the prices of vanilla options written on the index, for the period 1990 to 2011. The method is able to estimate both the parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options), including the volatility and jump risk premia. The estimation is implemented using a particle filter whose efficacy is demonstrated under simulation. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using graphics processing units (GPUs). The empirical results indicate that the parameters of the models are reliably estimated and consistent with values reported in previous work. In particular, both the volatility risk premium and the jump risk premium are found to be significant.
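The abstract does not reproduce the estimator, but the filtering step it relies on (a bootstrap particle filter applied to the Euler-discretized square-root volatility model) can be sketched as follows. This covers only the index-return filtering, not the joint use of option prices or the GPU parallelism, and the parameter values and simulation set-up are illustrative assumptions, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler-discretized square-root (Heston-type) stochastic volatility model.
kappa, theta, sigma_v, mu, dt = 3.0, 0.04, 0.4, 0.05, 1 / 252

def simulate(T):
    """Simulate daily returns and the latent variance path."""
    v, rets, vs = theta, [], []
    for _ in range(T):
        vs.append(v)
        rets.append((mu - 0.5 * v) * dt + np.sqrt(v * dt) * rng.normal())
        v = max(v + kappa * (theta - v) * dt
                + sigma_v * np.sqrt(v * dt) * rng.normal(), 1e-8)
    return np.array(rets), np.array(vs)

def particle_filter(returns, n_particles=2000):
    """Bootstrap filter: weight variance particles by the return density,
    resample, then propagate through the variance dynamics."""
    v = np.full(n_particles, theta)
    filtered = []
    for r in returns:
        w = (np.exp(-0.5 * (r - (mu - 0.5 * v) * dt) ** 2 / (v * dt))
             / np.sqrt(v * dt))
        w /= w.sum()
        filtered.append(np.sum(w * v))              # E[v_t | r_1..r_t]
        v = v[rng.choice(n_particles, n_particles, p=w)]
        v = np.maximum(v + kappa * (theta - v) * dt
                       + sigma_v * np.sqrt(v * dt)
                       * rng.normal(size=n_particles), 1e-8)
    return np.array(filtered)

rets, true_v = simulate(300)
filtered = particle_filter(rets)
print(filtered.shape, round(float(filtered.mean()), 4))
```

In the paper's maximum-likelihood setting, the per-step weight sums also provide the likelihood contribution used to estimate the model parameters.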

Relevance: 20.00%

Abstract:

The purpose of this study was to examine changes in energy cost during continuous jogging in high-heeled shoes. Thirteen healthy female volunteers participated, wearing shoes with heel heights of 1, 4.5 and 7 cm respectively. Each subject jogged on a treadmill wearing the K4b2 portable gas analysis system. The results showed that ventilation, relative oxygen consumption and energy expenditure increased with heel height, and these values were significantly higher when the heel height reached 7 cm. The present study suggests that jogging in high-heeled shoes directly increases energy consumption, potentially causing neuromuscular fatigue.

Relevance: 20.00%

Abstract:

Background Risky single occasion drinking (RSOD; 4 or more drinks in <6 h) more than doubles the risk of injury in young people (15–25 years). The potential role of smartphone apps in reducing RSOD in young people is yet to be explored. Objective To describe the initial prototype testing of 'Ray's Night Out', a new iPhone app targeting RSOD in young people. Method Quantitative and qualitative methods were used to evaluate the quality, perceived utility, and acceptability of the app among nine young people (19–23 years). Results Participants reported that Ray's Night Out had good to excellent levels of functionality and visual appeal, acceptable to good levels of entertainment, interest and information, and acceptable levels of customization and interactivity. Young people thought the app had high levels of youth appeal and would prompt users to think about their alcohol use limits, but was unlikely to motivate a change in alcohol use in its current form. Qualitative data provided several suggestions for improving the app. Conclusion Following revision, Ray's Night Out could provide an effective intervention for RSOD in non-help-seeking young people. A randomized controlled trial is currently underway to test the final prototype of the app.

Relevance: 20.00%

Abstract:

We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice.
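The Gaussian pseudolikelihood criterion scores each candidate working correlation structure by the Gaussian log-likelihood of the cluster residuals under that structure. A toy sketch on simulated exchangeable data; in practice the correlation parameter would itself be estimated rather than plugged in, and the residuals would come from a fitted GEE mean model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated residuals for 100 clusters of size 4 with true exchangeable
# correlation rho = 0.5 (the mean model is assumed already fitted).
n_clusters, m, rho = 100, 4, 0.5
R_true = rho * np.ones((m, m)) + (1 - rho) * np.eye(m)
resid = rng.normal(size=(n_clusters, m)) @ np.linalg.cholesky(R_true).T

def gaussian_pseudologlik(resid, R):
    """Gaussian pseudo-log-likelihood (up to a constant) of cluster
    residuals under working correlation matrix R."""
    sign, logdet = np.linalg.slogdet(R)
    quad = np.einsum('ij,jk,ik->', resid, np.linalg.inv(R), resid)
    return -0.5 * (resid.shape[0] * logdet + quad)

candidates = {
    "independence": np.eye(m),
    "exchangeable": rho * np.ones((m, m)) + (1 - rho) * np.eye(m),
}
scores = {name: gaussian_pseudologlik(resid, R)
          for name, R in candidates.items()}
print(max(scores, key=scores.get))  # → exchangeable (matches the truth)
```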

Relevance: 20.00%

Abstract:

Objective To discuss generalized estimating equations as an extension of generalized linear models, by commenting on the paper by Ziegler and Vens, "Generalized Estimating Equations: Notes on the Choice of the Working Correlation Matrix". Methods An international group of experts was invited to comment on this paper. Results Several perspectives were taken by the discussants. Econometricians established parallels to the generalized method of moments (GMM). Statisticians discussed model assumptions and the issue of missing data. Applied statisticians commented on practical aspects of data analysis. Conclusions In general, careful modeling of correlation is encouraged when considering estimation efficiency and other implications, and a comparison of the choice of instruments in GMM with the choice of working correlation in generalized estimating equations (GEE) would be worthwhile. Some theoretical drawbacks of GEE need to be further addressed and require careful analysis of the data; this particularly applies to the situation when data are missing at random.

Relevance: 20.00%

Abstract:

Sampling strategies are developed based on the idea of ranked set sampling (RSS) to increase efficiency and thereby reduce the cost of sampling in fishery research. RSS incorporates information on concomitant variables that are correlated with the variable of interest into the selection of samples. For example, estimating a monitoring survey abundance index would be more efficient if the sampling sites were selected based on information from previous surveys or catch rates of the fishery. We use two practical fishery examples to demonstrate the approach: site selection for a fishery-independent monitoring survey in the Australian northern prawn fishery (NPF), and fish age prediction by simple linear regression modelling for a short-lived tropical clupeoid. The relative efficiencies of the new designs were derived analytically and compared with traditional simple random sampling (SRS). Optimal sampling schemes were assessed under different optimality criteria. For the NPF monitoring survey, the efficiency, in terms of the variance or mean squared error of the estimated mean abundance index, ranged from 114 to 199% relative to SRS. In the case of a fish ageing study of Tenualosa ilisha in Bangladesh, the efficiency of age prediction from fish body weight reached 140%.
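The RSS mechanics (rank cheap concomitant measurements, then measure the expensive variable only on one rank per set) can be illustrated with a small simulation. The correlation of 0.9 and the design sizes below are arbitrary choices for illustration, not the paper's survey settings:

```python
import numpy as np

rng = np.random.default_rng(3)

def draw_pair(size):
    """Concomitant X (cheap, e.g. a previous survey's catch rate) and the
    variable of interest Y (costly to measure), correlated at 0.9."""
    x = rng.normal(size=size)
    y = 0.9 * x + np.sqrt(1 - 0.9 ** 2) * rng.normal(size=size)
    return x, y

def rss_mean(k, cycles):
    """Ranked set sample mean: in set i, rank k units on X and measure Y
    only on the unit holding rank i; repeat over several cycles."""
    ys = []
    for _ in range(cycles):
        for i in range(k):
            x, y = draw_pair(k)
            ys.append(y[np.argsort(x)[i]])
    return np.mean(ys)

def srs_mean(n):
    return draw_pair(n)[1].mean()

k, cycles, reps = 4, 5, 2000
n = k * cycles                       # same number of Y measurements per design
var_rss = np.var([rss_mean(k, cycles) for _ in range(reps)])
var_srs = np.var([srs_mean(n) for _ in range(reps)])
print(round(var_srs / var_rss, 2))   # relative efficiency; expect well above 1
```

With perfect ranking the relative efficiency of RSS over SRS approaches (k + 1)/2; imperfect ranking through a correlated concomitant, as here, yields something in between.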

Relevance: 20.00%

Abstract:

There are numerous load estimation methods available, some of which are captured in various online tools. However, most estimators are subject to large statistical biases, and their associated uncertainties are often not reported. This makes interpretation difficult and makes the estimation of trends or the determination of optimal sampling regimes impossible to assess. In this paper, we first propose two indices for measuring the extent of sampling bias, and then provide steps for obtaining reliable load estimates by minimizing the biases and making use of possible predictive variables. The load estimation procedure can be summarized in four steps:
- (i) output the flow rates at regular time intervals (e.g. 10 minutes) using a time series model that captures all the peak flows;
- (ii) output the predicted flow rates as in (i) at the concentration sampling times, if the corresponding flow rates were not collected;
- (iii) establish a predictive model for the concentration data that incorporates all possible predictor variables, and output the predicted concentrations at the regular time intervals as in (i); and
- (iv) obtain the sum of the products of predicted flow and predicted concentration over the regular time intervals as an estimate of the load.
The key step in this approach is the development of an appropriate predictive model for concentration. This is achieved using a generalized regression (rating-curve) approach with additional predictors that capture unique features in the flow data, namely the concept of the first flush, the location of the event on the hydrograph (e.g. rise or fall), and cumulative discounted flow. The latter may be thought of as a measure of constituent exhaustion occurring during flood events. The model can also accommodate autocorrelation in model errors resulting from intensive sampling during floods. Incorporating this additional information can significantly improve the predictability of concentration, and ultimately the precision with which the pollutant load is estimated. We also provide a measure of the standard error of the load estimate that incorporates model, spatial and/or temporal errors. The method can also incorporate measurement error incurred through the sampling of flow. We illustrate this approach using concentrations of total suspended sediment (TSS) and nitrogen oxides (NOx) and gauged flow data from the Burdekin River, a catchment delivering to the Great Barrier Reef. The sampling biases for NOx concentrations ranged from 2- to 10-fold, indicating severe bias. As expected, the traditional averaging and extrapolation methods produce much higher estimates than those obtained when sampling bias is taken into account.
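The final summation step of the procedure is a flow-weighted total of predicted concentrations over the regular time grid. A toy sketch with a synthetic one-peak hydrograph and a made-up power-law rating curve; the paper's actual concentration model additionally includes first-flush, hydrograph-position and exhaustion predictors:

```python
import numpy as np

# Flow at regular 10-minute intervals over one day
# (a synthetic single-peak hydrograph; units m^3/s).
t = np.arange(24 * 6) / 6.0                          # hours
flow = 5 + 40 * np.exp(-0.5 * ((t - 8) / 2) ** 2)

# A rating-curve style concentration model; the power-law coefficients
# here are made up for illustration.
def predicted_concentration(q):
    return 20.0 * (q / 5.0) ** 0.7                   # mg/L

conc = predicted_concentration(flow)

# Load = sum over intervals of flow x concentration x interval length.
dt_s = 600.0                                         # 10 minutes in seconds
load_kg = float(np.sum(flow * conc * dt_s) / 1e3)    # m^3/s * g/m^3 * s -> g -> kg
print(round(load_kg, 1))
```

Because concentration rises with flow in the rating curve, most of the daily load is delivered during the flood peak, which is why capturing peak flows in step (i) matters.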

Relevance: 20.00%

Abstract:

Selecting an appropriate working correlation structure is pertinent to clustered data analysis using generalized estimating equations (GEE), because an inappropriate choice leads to inefficient parameter estimation. We investigate the well-known QIC criterion for selecting a working correlation structure, and find that the performance of the QIC is degraded by a term that is theoretically independent of the correlation structures but has to be estimated with error. This leads us to propose a correlation information criterion (CIC) that substantially improves on the QIC. Extensive simulation studies indicate that the CIC offers a marked improvement in selecting the correct correlation structure. We also illustrate our findings using a data set from the Madras Longitudinal Schizophrenia Study.
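The CIC retains only the trace penalty of the QIC: the trace of the inverse model-based covariance (computed under the independence working structure) times the robust sandwich covariance under the candidate structure, which is near the parameter dimension when the working structure is right. A toy numeric illustration; the covariance matrices below are made up, not fitted quantities:

```python
import numpy as np

def cic(model_based_cov_indep, robust_cov):
    """CIC = trace(Omega_I @ V_r): Omega_I is the inverse model-based
    covariance under independence; V_r is the robust (sandwich) covariance
    under the candidate working structure. Smaller is better."""
    omega = np.linalg.inv(model_based_cov_indep)
    return float(np.trace(omega @ robust_cov))

# Hypothetical 2-parameter covariance matrices for illustration:
A = np.array([[0.04, 0.01], [0.01, 0.05]])           # model-based, independence
V_good = np.array([[0.041, 0.011], [0.011, 0.049]])  # structure close to truth
V_bad = np.array([[0.08, 0.00], [0.00, 0.10]])       # poorly chosen structure
print(cic(A, V_good) < cic(A, V_bad))  # → True; good structure scores near p = 2
```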