974 results for Computing cost
Abstract:
BACKGROUND The workgroup of Traffic Psychology is concerned with the social, behavioral, and perceptual aspects that are associated with use and non-use of bicycle helmets, in their various forms and under various cycling conditions. OBJECTIVES The objectives of WG2 are to (1) share current knowledge among the people already working in the field, (2) suggest new ideas for research on and evaluation of the design of bicycle helmets, and (3) discuss options for funding of such research within the individual frameworks of the participants. Areas for research include 3.1. The patterns of use of helmets among different users: children, adults, and sports enthusiasts. 3.2. The use of helmets in different environments: rural roads, urban streets, and bike trails. 3.3. Concerns bicyclists have relative to their safety and convenience and the perceived impact of using helmets on comfort and convenience. 3.4. The benefit of helmets for enhancing visibility, and how variations in helmet design and colors affect daytime, nighttime, and dusktime visibility. 3.5. The role of helmets in the acceptance of city-wide pickup-and-drop-off bicycles. 3.6. The impact of helmets on visual search behaviour of bicyclists.
Abstract:
Increasingly large-scale applications are generating an unprecedented amount of data. However, the widening gap between computation and I/O capacity on High End Computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to the output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs far more computing-resource contention with simulations, and such contention severely degrades simulation performance on HEC platforms. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the location of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system is developed for analytics placement. The FlexAnalytics system enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as for large-scale data transfer. Two use cases – scientific data compression and remote visualization – have been applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transmission bandwidth and improves application end-to-end transfer performance.
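The data-compression use case in this abstract reduces to a cost comparison: compress before transfer only when the compression time is recouped by the smaller payload on the wire. A minimal sketch of such a placement decision, where the function name and all parameters (network bandwidth, compression ratio and throughput) are hypothetical assumptions, not values or APIs from the paper:

```python
def best_transfer_plan(data_bytes, net_bw, comp_ratio, comp_bw):
    """Choose between sending raw data and compress-then-send.

    data_bytes : payload size in bytes
    net_bw     : network bandwidth, bytes/s
    comp_ratio : compressed size / raw size (0 < ratio <= 1)
    comp_bw    : compression throughput, bytes/s
    Returns (plan, estimated_seconds).
    """
    t_raw = data_bytes / net_bw
    t_comp = data_bytes / comp_bw + (data_bytes * comp_ratio) / net_bw
    return ("compress", t_comp) if t_comp < t_raw else ("raw", t_raw)
```

Under this toy model a slow wide-area link favors compressing at the source, while a fast local link favors shipping raw data, which is the kind of situational flexibility the framework argues for.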
Abstract:
Adopting a multi-theoretical approach, I examine external auditors’ perceptions of the reasons why organizations do or do not adopt cloud computing. I interview forensic accountants and IT experts about the adoption, acceptance, institutional motives, and risks of cloud computing. Although the medium to large accounting firms where the external auditors worked almost exclusively used private clouds, both private and public cloud services were gaining a foothold among many of their clients. Despite the advantages of cloud computing, data confidentiality and the involvement of foreign jurisdictions remain a concern, particularly if the data are moved outside Australia. Additionally, some organizations seem to understand neither the technology itself nor their own requirements, which may lead to poorly negotiated contracts and service agreements. To minimize the risks associated with cloud computing, many organizations turn to hybrid solutions or private clouds that include national or dedicated data centers. To the best of my knowledge, this is the first empirical study that reports on cloud computing adoption from the perspectives of external auditors.
Abstract:
Background: Alterations in energy expenditure during activity after head injury have not been investigated, due primarily to the difficulty of measurement. Objective: The aim of this study was to compare energy expenditure during activity, and body composition, of children following acquired brain injury (ABI) with data from a group of normal controls. Design: Energy expenditure was measured using the Cosmed K4b2 in a group of 15 children with ABI and a group of 67 normal children during rest and when walking and running. The mean number of steps taken per 3 min run was also recorded, and body composition was measured. Results: The energy expended during walking was not significantly different between the two groups. A significant difference was found between the two groups in the energy expended during running, and also in the number of steps taken, as children with ABI took significantly fewer steps than the normal controls during a 3 min run. Conclusions: Children with ABI exert more energy per activity than healthy controls when controlled for velocity or distance. However, they expend less energy than normal controls when walking and running at their own freely chosen, comfortable pace. © 2003 Elsevier Ltd. All rights reserved.
Abstract:
Objective. To assess the cost-effectiveness of bone density screening programmes for osteoporosis. Study design. Using published and locally available data on fracture rates and treatment costs, the overall cost per fracture prevented, cost per quality-adjusted life year (QALY) saved and cost per year of life gained were estimated for different bone density screening and osteoporosis treatment programmes. Main outcome measures. Cost per fracture prevented, cost per QALY saved, and cost per year of life gained. Results. In women over the age of 50 years, the costs per fracture prevented of treating all women with hormone replacement therapy, or of treating only if osteoporosis is demonstrated on bone density screening, were £32,594 or £23,867 respectively. For alendronate therapy for the same groups, the costs were £171,067 and £14,067 respectively. Once the background rate of treatment with alendronate reaches 18%, bone density screening becomes cost-saving. Cost estimates per QALY saved ranged from £1,514 to £39,076 for osteoporosis treatment with alendronate following bone density screening. Conclusions. For relatively expensive medications such as alendronate, treatment programmes with prior bone density screening are far more cost-effective than those without, and in some circumstances become cost-saving. Costs per QALY saved and per year of life gained for osteoporosis treatment with prior bone density screening compare favourably with treatment of hypertension and hypercholesterolemia.
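The screen-then-treat advantage reported above can be illustrated with a deliberately simplified cost model, under the assumption that the drug only benefits women who actually have osteoporosis: screening adds a per-woman scan cost but restricts expensive drug therapy to the at-risk group. All figures and parameter names below are hypothetical, not the paper's data:

```python
def cost_per_fracture_prevented(n, scan_cost, drug_cost, prevalence,
                                fracture_risk, risk_reduction, screen_first):
    """Cost per fracture prevented under a toy model where treatment only
    prevents fractures among women who truly have osteoporosis.

    n              : women in the programme
    scan_cost      : cost of one bone density scan
    drug_cost      : cost of a course of drug therapy
    prevalence     : fraction of women with osteoporosis
    fracture_risk  : fracture probability if osteoporotic and untreated
    risk_reduction : relative risk reduction from treatment
    screen_first   : scan everyone, treat only screen-positives
    """
    at_risk = n * prevalence
    prevented = at_risk * fracture_risk * risk_reduction
    if screen_first:
        cost = n * scan_cost + at_risk * drug_cost
    else:
        cost = n * drug_cost  # treat everyone without screening
    return cost / prevented
```

With a cheap scan and an expensive drug, the screen-first ratio comes out well below the treat-all ratio, mirroring the abstract's conclusion for alendronate.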
Abstract:
In this paper, we present a decentralized dynamic load scheduling/balancing algorithm called ELISA (Estimated Load Information Scheduling Algorithm) for general purpose distributed computing systems. ELISA uses estimated state information based upon periodic exchange of exact state information between neighbouring nodes to perform load scheduling. The primary objective of the algorithm is to cut down on the communication and load transfer overheads by minimizing the frequency of status exchange and by restricting the load transfer and status exchange within the buddy set of a processor. It is shown that the resulting algorithm performs almost as well as a perfect information algorithm and is superior to other load balancing schemes based on the random sharing and Ni-Hwang algorithms. A sensitivity analysis to study the effect of various design parameters on the effectiveness of load balancing is also carried out. Finally, the algorithm's performance is tested on large dimensional hypercubes in the presence of time-varying load arrival process and is shown to perform well in comparison to other algorithms. This makes ELISA a viable and implementable load balancing algorithm for use in general purpose distributed computing systems.
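The two ingredients of the scheme described above are (1) extrapolating a buddy node's load between the periodic exact status exchanges, and (2) transferring work only when the estimated imbalance inside the buddy set exceeds a threshold. A toy sketch of both, not the published ELISA algorithm; the linear arrival/service extrapolation and the threshold rule are assumptions made for illustration:

```python
def estimate_load(last_exact, arrival_rate, service_rate, elapsed):
    """Extrapolate a buddy's queue length from its last exact report,
    assuming (hypothetically) constant arrival and service rates."""
    return max(0.0, last_exact + (arrival_rate - service_rate) * elapsed)


def schedule_transfer(own_load, buddy_estimates, threshold=1.0):
    """Decide whether to migrate one unit of load, using only estimated
    state within the buddy set (no global information).

    buddy_estimates maps buddy id -> estimated load.
    Returns the buddy to ship work to, or None to keep it local."""
    if not buddy_estimates:
        return None
    target, est = min(buddy_estimates.items(), key=lambda kv: kv[1])
    # transfer only when the imbalance clearly exceeds the threshold,
    # so noisy estimates do not cause load thrashing
    return target if own_load - est > threshold else None
```

Restricting both estimation and transfer to the buddy set is what keeps the communication overhead low, at the price of acting on slightly stale information.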
Abstract:
Estimating the economic burden of injuries is important for setting priorities, allocating scarce health resources and planning cost-effective prevention activities. As a metric of burden, costs account for multiple injury consequences—death, severity, disability, body region, nature of injury—in a single unit of measurement. In a 1989 landmark report to the US Congress, Rice et al1 estimated the lifetime costs of injuries in the USA in 1985. By 2000, the epidemiology and burden of injuries had changed enough that the US Congress mandated an update, resulting in a book on the incidence and economic burden of injury in the USA.2 To make these findings more accessible to the larger realm of scientists and practitioners and to provide a template for conducting the same economic burden analyses in other countries and settings, a summary3 was published in Injury Prevention. Corso et al reported that, between 1985 and 2000, injury rates declined roughly 15%. The estimated lifetime cost of these injuries declined 20%, totalling US$406 billion, including US$80 billion in medical costs and US$326 billion in lost productivity. While incidence reflects problem size, the relative burden of injury is better expressed using costs.
Abstract:
The purpose of this study was to examine changes in energy cost during continuous jogging in high-heeled shoes. Thirteen healthy female volunteers joined the study, wearing shoes with heel heights of 1, 4.5 and 7 cm, respectively. Each subject jogged on a treadmill while wearing the K4b2 portable gas analysis system. The results showed that ventilation, relative oxygen consumption and energy expenditure increased with heel height, and these values were significantly larger when the heel height reached 7 cm. The present study suggests that jogging in high-heeled shoes directly increases energy consumption, potentially causing neuromuscular fatigue.
Abstract:
Simultaneous consideration of both performance and reliability issues is important in the choice of computer architectures for real-time aerospace applications. One of the requirements for such a fault-tolerant computer system is the characteristic of graceful degradation. A shared and replicated resources computing system represents such an architecture. In this paper, a combinatorial model is used for the evaluation of the instruction execution rate of a degradable, replicated resources computing system such as a modular multiprocessor system. Next, a method is presented to evaluate the computation reliability of such a system utilizing a reliability graph model and the instruction execution rate. Finally, this computation reliability measure, which simultaneously describes both performance and reliability, is applied as a constraint in an architecture optimization model for such computing systems.
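The combinatorial evaluation this abstract describes can be illustrated in its simplest form: with n identical modules that survive independently with probability p, the expected instruction execution rate is the binomial-weighted average of the rate achievable in each degraded configuration. This is a generic performability sketch under stated assumptions, not the paper's specific model; the per-configuration rate function is supplied by the caller:

```python
from math import comb

def expected_execution_rate(n, p, rate_of):
    """Expected instruction execution rate of an n-module gracefully
    degrading system, modules surviving independently with probability p.

    rate_of(k) gives the execution rate when k modules are working;
    the sum runs over all 2^n module states grouped combinatorially."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * rate_of(k)
               for k in range(n + 1))
```

For a linear rate function (each working module contributes equally), this reduces to n * p times the per-module rate, which is a quick sanity check on the model.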
Abstract:
This paper is aimed at reviewing the notion of Byzantine-resilient distributed computing systems, the relevant protocols and their possible applications as reported in the literature. The three agreement problems, namely, the consensus problem, the interactive consistency problem, and the generals problem have been discussed. Various agreement protocols for the Byzantine generals problem have been summarized in terms of their performance and level of fault-tolerance. The three classes of Byzantine agreement protocols discussed are the deterministic, randomized, and approximate agreement protocols. Finally, application of the Byzantine agreement protocols to clock synchronization is highlighted.
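Among the deterministic protocols surveyed above, the classic oral-messages algorithm OM(1) for four generals is small enough to sketch directly: the commander sends a value to three lieutenants, each lieutenant relays what it received to the others, and each loyal lieutenant takes the majority of the three values it has seen. The binary-value encoding and the "traitor inverts relayed bits" behaviour below are illustrative assumptions:

```python
from collections import Counter

def om1_loyal_decisions(sent, liar=None):
    """One round of OM(1): a commander plus three lieutenants (0..2),
    tolerating a single traitor.

    sent : the three bit values the commander sends to lieutenants 0..2
           (identical if the commander is loyal)
    liar : index of a traitorous lieutenant who inverts relayed bits,
           or None if all lieutenants are loyal.
    Returns {lieutenant: decided value} for each loyal lieutenant."""
    def relay(j, value):
        # a traitorous lieutenant lies when relaying the commander's value
        return 1 - value if j == liar else value

    decisions = {}
    for i in range(3):
        if i == liar:
            continue  # a traitor's "decision" is irrelevant
        votes = [sent[i]] + [relay(j, sent[j]) for j in range(3) if j != i]
        decisions[i] = Counter(votes).most_common(1)[0][0]
    return decisions
```

With a loyal commander and one traitorous lieutenant, the loyal lieutenants still decide the commander's value; with a traitorous commander sending conflicting values, they still decide identically, which is exactly the agreement condition the protocols in the survey are built around.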
Abstract:
Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971) considered the optimal set size for ranked set sampling (RSS) with fixed operational costs. This framework can be very useful in practice for determining whether RSS is beneficial and for obtaining the optimal set size that minimizes the variance of the population estimator for a fixed total cost. In this article, we propose a scheme of general RSS in which more than one observation can be taken from each ranked set. This is shown to be more cost-effective in some cases when the cost of ranking is not so small. Using the example in Nahhas, Wolfe, and Chen (2002, Biometrics 58, 964-971), we demonstrate that taking two or more observations from one set, even with the optimal set size from the RSS design, can be more beneficial.
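The baseline fact behind these cost comparisons is that, for the same number of measured units, a balanced ranked set sample estimates the mean with lower variance than a simple random sample. The Monte Carlo sketch below demonstrates only that variance gain under perfect ranking and a uniform population (both assumptions for illustration); it deliberately ignores ranking cost, which is precisely what the cost-based analyses above then bring back in:

```python
import random
import statistics

def srs_mean(draw, n):
    """Mean of a simple random sample of n measured units."""
    return statistics.fmean(draw() for _ in range(n))

def rss_mean(draw, k):
    """Balanced ranked set sample of size k: in cycle i, draw a set of
    k units, rank them (perfect ranking: by their true values), and
    measure only the i-th order statistic."""
    obs = []
    for i in range(k):
        ranked = sorted(draw() for _ in range(k))
        obs.append(ranked[i])
    return statistics.fmean(obs)

# Compare sampling variances at equal numbers of *measured* units (k each)
random.seed(0)
draw = random.random  # assumed Uniform(0, 1) population
k, reps = 3, 4000
var_srs = statistics.pvariance([srs_mean(draw, k) for _ in range(reps)])
var_rss = statistics.pvariance([rss_mean(draw, k) for _ in range(reps)])
```

Note that each RSS cycle silently draws and ranks k units per measurement; once ranking is not nearly free, that hidden effort is what makes schemes taking several observations per ranked set attractive.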
Abstract:
Recent decreases in costs, and improvements in performance, of silicon array detectors open a range of potential applications of relevance to plant physiologists, associated with spectral analysis in the visible and short-wave near infra-red (far-red) spectrum. The performance characteristics of three commercially available ‘miniature’ spectrometers based on silicon array detectors operating in the 650–1050-nm spectral region (MMS1 from Zeiss, S2000 from Ocean Optics, and FICS from Oriel, operated with a Larry detector) were compared with respect to the application of non-invasive prediction of sugar content of fruit using near infra-red spectroscopy (NIRS). The FICS–Larry gave the best wavelength resolution; however, the narrow slit and small pixel size of the charge-coupled device detector resulted in a very low sensitivity, and this instrumentation was not considered further. Wavelength resolution was poor with the MMS1 relative to the S2000 (e.g. full width at half maximum of the 912 nm Hg peak, 13 and 2 nm for the MMS1 and S2000, respectively), but the large pixel height of the array used in the MMS1 gave it sensitivity comparable to the S2000. The ratio of signal to signal standard error of spectra was greater by an order of magnitude with the MMS1, relative to the S2000, at both near-saturation and low light levels. Calibrations were developed using reflectance spectra of filter paper soaked in a range of concentrations (0–20% w/v) of sucrose, using a modified partial least squares procedure. Calibrations developed with the MMS1 were superior to those developed using the S2000 (e.g. coefficient of correlation of 0.90 and 0.62, and standard error of cross-validation of 1.9 and 5.4%, respectively), indicating the importance of a high signal-to-noise ratio over wavelength resolution for calibration accuracy.
The design of a bench top assembly using the MMS1 for the non-invasive assessment of mesocarp sugar content of (intact) melon fruit is reported in terms of light source and angle between detector and light source, and optimisation of math treatment (derivative condition and smoothing function).
Abstract:
Demagnetization to zero remanent value or to a predetermined value is of interest to magnet manufacturers and material users. Conventional methods of demagnetization using a varying alternating demagnetizing field, under a damped oscillatory or conveyor system, result in either high cost for demagnetization or large power dissipation. A simple technique using thyristors is presented for demagnetizing the material. Power consumption is mainly in the first two half-cycles of applied voltage. Hence power dissipation is very much reduced. An optimum value calculation for a thyristor triggering angle for demagnetizing high coercive materials is also presented.
Abstract:
The Reeb graph tracks topology changes in level sets of a scalar function and finds applications in scientific visualization and geometric modeling. We describe an algorithm that constructs the Reeb graph of a Morse function defined on a 3-manifold. Our algorithm maintains connected components of the two-dimensional level sets as a dynamic graph and constructs the Reeb graph in O(n log n + n log g (log log g)^3) time, where n is the number of triangles in the tetrahedral mesh representing the 3-manifold and g is the maximum genus over all level sets of the function. We extend this algorithm to construct Reeb graphs of d-manifolds in O(n log n (log log n)^3) time, where n is the number of triangles in the simplicial complex that represents the d-manifold. Our result is a significant improvement over the previously known O(n^2) algorithm. Finally, we present experimental results of our implementation and demonstrate that our algorithm for 3-manifolds performs efficiently in practice.
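The core primitive behind such constructions, tracking how connected components of (sub)level sets appear and merge as the function value sweeps upward, can be sketched with union-find on a graph. This is a much simpler merge-tree-style sketch on an arbitrary vertex-weighted graph, not the paper's dynamic-graph algorithm or its complexity bounds; the example graph and values are made up:

```python
def merge_events(values, edges):
    """Sweep vertices in increasing scalar order, maintaining connected
    components of the sublevel set with union-find, and report
    (vertex, number_of_components_merged) events at the joins.

    values : dict vertex -> scalar function value
    edges  : iterable of (u, v) pairs
    """
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    adj = {v: [] for v in values}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)

    events = []
    for v in sorted(values, key=values.get):
        parent[v] = v  # v's component is born at its own value
        # distinct components among already-swept neighbours
        roots = {find(u) for u in adj[v] if u in parent and u != v}
        if len(roots) > 1:
            events.append((v, len(roots)))  # saddle: components merge at v
        for r in roots:
            parent[r] = v
    return events
```

Minima produce births (no event recorded here) and saddles produce merges; a full Reeb graph additionally has to handle splits and loops in the level sets, which is where the genus term in the paper's bound comes from.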
Abstract:
There’s a polyester mullet skirt gracing a derrière near you. It’s short at the front, long at the back, and it’s also known as the hi-lo skirt. Like fads that preceded it, the mullet skirt has a short fashion life, and although it will remain potentially wearable for years, it’s likely to soon be heading to the charity shop or to landfill...