Abstract:
Power has become a key constraint in current nanoscale integrated circuit design due to the increasing demands of mobile computing and a low-carbon economy. As an emerging technology, inexact circuit design offers a promising approach to significantly reduce both dynamic and static power dissipation in error-tolerant applications. Although fixed-point arithmetic circuits have been studied in terms of inexact computing, floating-point arithmetic circuits have not been fully considered, even though they require more power. In this paper, the first inexact floating-point adder is designed and applied to high dynamic range (HDR) image processing. Inexact floating-point adders are proposed by approximating the design of the exponent subtractor and the mantissa adder. Related logic operations, including the normalization and rounding modules, are also considered in terms of inexact computing. Two HDR images are processed using the proposed inexact floating-point adders to show the validity of the inexact design. HDR-VDP is used as a metric to measure the subjective results of the image addition. Significant improvements are achieved in terms of area, delay and power consumption. Comparison results show that the proposed inexact floating-point adders reduce power consumption and the power-delay product by 29.98% and 39.60%, respectively.
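The general idea of approximating the mantissa adder can be sketched in software by discarding low-order mantissa bits before an exact addition. This is only an illustrative model of inexact addition, not the hardware design proposed in the paper; the truncation width `k` is an arbitrary choice here.

```python
import struct

def truncate_mantissa(x: float, k: int) -> float:
    """Zero the k lowest bits of the IEEE-754 double mantissa (illustrative only)."""
    bits = struct.unpack('<Q', struct.pack('<d', x))[0]
    bits &= ~((1 << k) - 1)          # clear the k low-order mantissa bits
    return struct.unpack('<d', struct.pack('<Q', bits))[0]

def inexact_add(a: float, b: float, k: int = 20) -> float:
    """Model of an inexact adder: truncate operand mantissas, then add exactly."""
    return truncate_mantissa(a, k) + truncate_mantissa(b, k)

exact = 1.2345678 + 2.3456789
approx = inexact_add(1.2345678, 2.3456789)
rel_err = abs(approx - exact) / exact   # small for error-tolerant workloads
```

Dropping 20 of the 52 mantissa bits leaves a relative error around 2^-32 per operand, which is typically invisible in image-processing applications while a hardware adder of that width would be substantially smaller and cheaper to drive.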
Abstract:
BACKGROUND: Antibiotic dosing in neonates varies between countries and centres, suggesting suboptimal exposures for some neonates. We aimed to describe variations and factors influencing the variability in the dosing of frequently used antibiotics in European NICUs to help define strategies for improvement.
METHODS: A sub-analysis of the European Study of Neonatal Exposure to Excipients point prevalence study was undertaken. Demographic data for neonates receiving any antibiotic on the study day, within one of three two-week periods from January to June 2012, were recorded, together with the dose, dosing interval and route of administration of each prescription. The British National Formulary for Children (BNFC) and Neofax were used as reference sources. Risk factors for deviations exceeding ±25% of the relevant BNFC dosage recommendation were identified by multivariate logistic regression analysis.
RESULTS: In 89 NICUs from 21 countries, 586 antibiotic prescriptions for 342 infants were reported. The twelve most frequently used antibiotics - gentamicin, penicillin G, ampicillin, vancomycin, amikacin, cefotaxime, ceftazidime, meropenem, amoxicillin, metronidazole, teicoplanin and flucloxacillin - covered 92% of systemic prescriptions. Glycopeptide class, gestational age <32 weeks, 5th-minute Apgar score <5 and geographical region were associated with deviation from the BNFC dosage recommendation. While the doses of penicillins exceeded recommendations, antibiotics with safety concerns either followed the recommendations (gentamicin) or were dosed below them (vancomycin).
CONCLUSIONS: The current lack of compliance with existing dosing recommendations for neonates needs to be addressed through well-designed clinical trials of a limited number of antibiotics, to define their pharmacokinetics/pharmacodynamics, efficacy and safety in this population, and through efficient dissemination of the results.
Abstract:
The Magellanic Clouds are uniquely placed to study the stellar contribution to dust emission. Individual stars can be resolved in these systems even in the mid-infrared, and they are close enough to allow detection of infrared excess caused by dust. We have searched the Spitzer Space Telescope data archive for all Infrared Spectrograph (IRS) staring-mode observations of the Small Magellanic Cloud (SMC) and found that 209 Infrared Array Camera (IRAC) point sources within the footprint of the Surveying the Agents of Galaxy Evolution in the Small Magellanic Cloud (SAGE-SMC) Spitzer Legacy programme were targeted, within a total of 311 staring-mode observations. We classify these point sources using a decision tree method of object classification, based on infrared spectral features, continuum and spectral energy distribution shape, bolometric luminosity, cluster membership and variability information. We find 58 asymptotic giant branch (AGB) stars, 51 young stellar objects, 4 post-AGB objects, 22 red supergiants, 27 stars (of which 23 are dusty OB stars), 24 planetary nebulae (PNe), 10 Wolf-Rayet stars, 3 H II regions, 3 R Coronae Borealis stars, 1 blue supergiant and 6 other objects, including 2 foreground AGB stars. We use these classifications to evaluate the success of photometric classification methods reported in the literature.
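A decision-tree classification of this kind can be sketched as a cascade of feature tests. The feature names and thresholds below are purely hypothetical illustrations of the structure; the paper's actual scheme combines spectral features, SED shape, luminosity, cluster membership and variability as described above.

```python
def classify_source(src: dict) -> str:
    """Toy decision-tree classifier for infrared point sources.

    Keys and thresholds are hypothetical; each branch mimics the kind of
    test (spectral feature, SED shape, luminosity, variability) used in
    decision-tree object classification.
    """
    if src.get("emission_lines"):            # nebular emission lines first
        return "planetary nebula" if src.get("compact") else "H II region"
    if src.get("dust_excess"):               # dusty evolved stars
        if src.get("variable") and src.get("luminosity", 0) < 5e4:
            return "AGB star"
        return "red supergiant"
    if src.get("rising_sed"):                # cold envelope, embedded object
        return "young stellar object"
    return "star"                            # featureless stellar continuum

example = {"dust_excess": True, "variable": True, "luminosity": 8e3}
label = classify_source(example)
```

The advantage of encoding the scheme as an explicit tree is that every classification is traceable to a specific sequence of tests, which makes it easy to audit against photometric methods.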
Abstract:
Purpose
To evaluate the impact of the position of an asymmetric multifocal near segment on visual quality.
Setting
Cathedral Eye Clinic, Belfast, United Kingdom.
Design
Retrospective comparative case series.
Methods
Data from consecutive patients who had bilateral implantation of the Lentis Mplus LS-312 multifocal intraocular lens were divided into 2 groups. One group received inferonasal near-segment placement and the other, superotemporal near-segment placement. A +3.00 diopter (D) reading addition (add) was used in all eyes. The main outcome measures included uncorrected distance visual acuity (UDVA), uncorrected near visual acuity (UNVA), contrast sensitivity, and quality of vision. Follow-up was 3 months.
Results
Patients ranged in age from 43 to 76 years. The inferonasal group comprised 80 eyes (40 patients) and the superotemporal group, 76 eyes (38 patients). The mean 3-month spherical equivalent was −0.11 ± 0.49 D (SD) in the inferonasal group and −0.18 ± 0.46 D in the superotemporal group. The mean postoperative UDVA was 0.14 ± 0.10 logMAR and 0.18 ± 0.15 logMAR, respectively. The mean monocular UNVA was 0.21 ± 0.14 logRAD and 0.24 ± 0.13 logRAD, respectively. No significant differences were observed in the higher-order aberrations, total Strehl ratio (point-spread function), or modulation transfer function between the groups. Dysphotopic symptoms measured with a validated quality-of-vision questionnaire were not significantly different between groups.
Conclusion
Positioning of the near add did not significantly affect objective or subjective visual function parameters.
Abstract:
BACKGROUND: The neonatal and pediatric antimicrobial point prevalence survey (PPS) of the Antibiotic Resistance and Prescribing in European Children project (http://www.arpecproject.eu/) aims to standardize a method for surveillance of antimicrobial use in children and neonates admitted to the hospital within Europe. This article describes the audit criteria used and reports overall country-specific proportions of antimicrobial use. An analytical review of methodologies for measuring antimicrobial use is also presented.
METHODS: A 1-day PPS on antimicrobial use in hospitalized children was organized in September 2011, using a previously validated and standardized method. The survey included all inpatient pediatric and neonatal beds and identified all children receiving an antimicrobial treatment on the day of survey. Mandatory data were age, gender, (birth) weight, underlying diagnosis, antimicrobial agent, dose and indication for treatment. Data were entered through a web-based system for data-entry and reporting, based on the WebPPS program developed for the European Surveillance of Antimicrobial Consumption project.
RESULTS: There were 2760 pediatric and 1154 neonatal inpatients reported in 50 European hospitals (n = 14 countries), and 1565 pediatric and 589 neonatal inpatients in 23 non-European hospitals (n = 9 countries). Overall, pediatric and neonatal antibiotic use was significantly higher in non-European hospitals (43.8%; 95% confidence interval [CI]: 41.3-46.3% and 39.4%; 95% CI: 35.5-43.4%, respectively) than in European hospitals (35.4%; 95% CI: 33.6-37.2% and 21.8%; 95% CI: 19.4-24.2%). Proportions of antibiotic use were highest in hematology/oncology wards (61.3%; 95% CI: 56.2-66.4%) and pediatric intensive care units (55.8%; 95% CI: 50.3-61.3%).
CONCLUSIONS: An Antibiotic Resistance and Prescribing in European Children standardized web-based method for a 1-day PPS was successfully developed and conducted in 73 hospitals worldwide. It offers a simple, feasible and sustainable way of data collection that can be used globally.
Abstract:
Major food adulteration and contamination events occur with alarming regularity and are known to be episodic; the question is not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can make them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This makes the task of deciding which analytical methods are most suitable for collecting and analysing (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable - ideally handheld and/or remote sensor devices - that can be taken to, or positioned on/at-line at, points of vulnerability along complex food supply networks, and should require a minimum amount of background training to acquire information-rich data rapidly (ergo point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a future perspective on how this range of detection methods in the growing sensor portfolio, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is a problem of systems, and therefore requires systems-level solutions and thinking.
Abstract:
Objectives
Barefoot running refers to running without footwear. Minimalist running uses shoes designed to mimic being barefoot. Although these forms of running have become increasingly popular, we still know little about how recreational runners perceive them.
Design
In-depth interviews with eight recreational runners were used to gather information about their running experiences with a focus on barefoot and minimalist running.
Methods
Interviews were analysed using a latent level thematic analysis to identify and interpret themes within the data.
Results
Although participants considered barefoot running to be ‘natural’, they also considered it to be extreme. Minimalist running did not produce such aversive reactions. ‘Support’ reassured against concerns and was seen as central in protecting vulnerable body parts and reducing impact forces, but lacked a common or clear definition. A preference for practical over academic knowledge was found. Anecdotal information was generally trusted, as were running stores with gait assessment, but not health professionals.
Conclusion
People often hold inconsistent ideas about barefoot and minimalist running, frequently formed by potentially biased sources, which may lead them to make poor decisions. Providing high-quality information is important to enable better decisions about these forms of running.
Statement of contribution
What is already known on this subject?
There is no known work on the psychology behind barefoot and minimalist running. We believe our study is the first qualitative study to have investigated views of this increasingly popular form of running.
What does this study add?
The results suggest that although barefoot running is considered ‘natural’, it is also considered ‘extreme’. Minimalist running, however, did not receive such aversive reactions.
‘Support’ was a common concern among runners. Although ‘support’ reassured against concerns and was seen as central in protecting vulnerable body parts and reducing impact forces, it lacked a common or clear definition.
A preference for practical over academic knowledge was found. Anecdotal information was generally trusted, as were running stores with gait assessment, but not health professionals.