Abstract:
Sleep disturbance after mild traumatic brain injury (mTBI) is commonly reported as debilitating and persistent. However, the nature of this disturbance is poorly understood. This study sought to characterize sleep after mTBI compared with a control group. A cross-sectional matched case-control design was used. Thirty-three persons with recent mTBI (1–6 months prior) and 33 age-, sex-, and ethnicity-matched controls completed established questionnaires of sleep quality, quantity, timing, and sleep-related daytime impairment. The mTBI participants were also compared with an independent sample of close-matched controls (CMCs; n=33) to allow partial internal replication. Compared with controls, persons with mTBI reported significantly greater sleep disturbance, more severe insomnia symptoms, a longer duration of wake after sleep onset, and greater sleep-related impairment (all medium to large effects, Cohen's d>0.5). No differences were found in sleep quantity, timing, sleep onset latency, sleep efficiency, or daytime sleepiness. All findings except a measure of sleep timing (i.e., sleep midpoint) were replicated for CMCs. These results indicate a difference in the magnitude and nature of perceived sleep disturbance after mTBI compared with controls, where persons with mTBI report poorer sleep quality and greater sleep-related impairment. Sleep quantity and timing did not differ between the groups. These preliminary findings should guide the provision of clearer advice to patients about the aspects of their sleep that may change after mTBI and could inform treatment selection.
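A minimal sketch of the effect-size calculation behind the reported "medium to large effects" (Cohen's d > 0.5); the questionnaire scores below are simulated placeholders, not the study's data:

```python
import numpy as np

def cohens_d(group_a, group_b):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    a, b = np.asarray(group_a, float), np.asarray(group_b, float)
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                        / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Hypothetical insomnia-symptom scores for mTBI vs matched controls (n = 33 each).
rng = np.random.default_rng(0)
mtbi = rng.normal(14.0, 5.0, 33)       # placeholder scores, not study data
controls = rng.normal(10.0, 5.0, 33)
print(f"Cohen's d = {cohens_d(mtbi, controls):.2f}")  # > 0.5 reads as medium or larger
```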
Abstract:
Diabetic macular edema (DME) is one of the most common causes of visual loss among patients with diabetes mellitus. Early detection and subsequent treatment may improve visual acuity. DME is mainly graded into non-clinically significant macular edema (NCSME) and clinically significant macular edema according to the location of hard exudates in the macular region. DME can be identified by manual examination of fundus images, but this is laborious and resource intensive. Hence, in this work, automated grading of DME is proposed using higher-order spectra (HOS) of Radon transform projections of the fundus images. We used third-order cumulants and bispectrum magnitude as features and compared their performance; both can capture subtle changes in the fundus image. Spectral regression discriminant analysis (SRDA) reduces the feature dimension, and the minimum redundancy maximum relevance method is used to rank the significant SRDA components. Ranked features are fed to various supervised classifiers, viz. naive Bayes, AdaBoost and support vector machine, to discriminate the no DME, NCSME and clinically significant macular edema classes. The performance of our system is evaluated using the publicly available MESSIDOR dataset (300 images) and also verified with a local dataset (300 images). Our results show that HOS cumulants and bispectrum magnitude obtained average accuracies of 95.56% and 94.39% for the MESSIDOR dataset and 95.93% and 93.33% for the local dataset, respectively.
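As a rough illustration of the feature-extraction stage described above, the sketch below computes Radon-transform projections of a (random stand-in) fundus image with scikit-image, plus a bispectrum-magnitude summary per projection; the SRDA, mRMR and classifier stages are omitted, and all parameter choices (angles, FFT length) are assumptions:

```python
import numpy as np
from skimage.transform import radon  # scikit-image

def bispectrum_magnitude(x, nfft=128):
    """|B(f1, f2)| = |X(f1) X(f2) conj(X(f1 + f2))| for one 1-D projection."""
    X = np.fft.fft(x - x.mean(), nfft)
    f = np.arange(nfft // 2)
    B = X[f][:, None] * X[f][None, :] * np.conj(X[(f[:, None] + f[None, :]) % nfft])
    return np.abs(B)

# Random stand-in for a preprocessed grayscale fundus image.
image = np.random.rand(256, 256)
angles = np.linspace(0.0, 180.0, 18, endpoint=False)  # assumed angle sampling
sinogram = radon(image, theta=angles, circle=False)   # one column per projection

# One summary feature per projection angle (mean bispectrum magnitude);
# the paper instead feeds such features through SRDA, mRMR and classifiers.
features = [bispectrum_magnitude(sinogram[:, k]).mean()
            for k in range(sinogram.shape[1])]
print(len(features), "features")
```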
Abstract:
This paper presents visual detection and classification of light vehicles and personnel on a mine site. We capitalise on the rapid advances of ConvNet-based object recognition but highlight that a naive black-box approach results in a significant number of false positives. In particular, the lack of domain-specific training data and the unique landscape of a mine site cause a high rate of errors. We exploit the abundance of background-only images to train a k-means classifier to complement the ConvNet. Furthermore, localisation of objects of interest and a reduction in computation are enabled through region proposals. Our system is tested on over 10 km of real mine site data, and we were able to detect both light vehicles and personnel. We show that the introduction of our background model can reduce the false positive rate by an order of magnitude.
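A toy sketch of the background-model idea, assuming each region proposal is summarised by a fixed-length descriptor (random stand-ins here): k-means is fitted on background-only descriptors, and a ConvNet detection is suppressed when its descriptor falls close to a background centroid. The feature pipeline, cluster count and threshold are all assumptions, not the paper's values:

```python
import numpy as np
from sklearn.cluster import KMeans

# Random stand-ins for fixed-length descriptors of background-only regions;
# in practice these would come from the same feature pipeline as the detector.
rng = np.random.default_rng(1)
background_feats = rng.normal(0.0, 1.0, (5000, 64))
background_model = KMeans(n_clusters=50, n_init=10, random_state=0).fit(background_feats)

def looks_like_background(feat, threshold=6.0):
    """True if a detection's descriptor sits near a background centroid;
    the threshold is an assumed tuning parameter."""
    d = np.linalg.norm(background_model.cluster_centers_ - feat, axis=1).min()
    return d < threshold

candidate = rng.normal(0.0, 1.0, 64)  # descriptor of one ConvNet detection
if looks_like_background(candidate):
    print("suppress detection (likely false positive)")
```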
Abstract:
This article describes research conducted for the Japanese government in the wake of the magnitude 9.0 earthquake and tsunami that struck eastern Japan on March 11, 2011. In this study, material stock analysis (MSA) is used to examine the losses of building and infrastructure materials after this disaster. Estimates of the magnitude of material stock that has lost its social function as a result of a disaster can indicate the quantities required for reconstruction, help garner a better understanding of the volumes of waste flows generated by that disaster, and also help in the course of policy deliberations in the recovery of disaster-stricken areas. Calculations of the lost building and road materials in the five prefectures most affected were undertaken. Analysis in this study is based on the use of geographical information systems (GIS) databases and statistics; it aims to (1) describe in spatial terms what construction materials were lost, (2) estimate the amount of infrastructure material needed to rehabilitate disaster areas, and (3) indicate the amount of lost material stock that should be taken into consideration during government policy deliberations. Our analysis concludes that the material stock losses of buildings and road infrastructure are 31.8 and 2.1 million tonnes, respectively. This research approach and the use of spatial MSA can be useful for urban planners and may also convey more appropriate information about disposal based on the work of municipalities in disaster-afflicted areas.
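A minimal sketch of the material stock accounting step implied above: lost stock is the damaged structure quantity multiplied by a material-intensity coefficient, summed over structure types. All coefficients and areas are illustrative placeholders, not the study's GIS-derived values:

```python
# Lost stock = damaged quantity x material-intensity coefficient, per structure type.
# All values below are illustrative placeholders.
material_intensity_t_per_m2 = {"wooden_house": 0.6, "rc_building": 1.8}   # assumed
damaged_floor_area_m2 = {"wooden_house": 2.0e6, "rc_building": 1.5e6}     # assumed

lost_stock_t = sum(damaged_floor_area_m2[k] * material_intensity_t_per_m2[k]
                   for k in damaged_floor_area_m2)
print(f"lost building material stock: {lost_stock_t / 1e6:.1f} million tonnes")
```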
Abstract:
BACKGROUND Quantification of the disease burden caused by different risks informs prevention by providing an account of health loss different to that provided by a disease-by-disease analysis. No complete revision of global disease burden caused by risk factors has been done since a comparative risk assessment in 2000, and no previous analysis has assessed changes in burden attributable to risk factors over time. METHODS We estimated deaths and disability-adjusted life years (DALYs; sum of years lived with disability [YLD] and years of life lost [YLL]) attributable to the independent effects of 67 risk factors and clusters of risk factors for 21 regions in 1990 and 2010. We estimated exposure distributions for each year, region, sex, and age group, and relative risks per unit of exposure by systematically reviewing and synthesising published and unpublished data. We used these estimates, together with estimates of cause-specific deaths and DALYs from the Global Burden of Disease Study 2010, to calculate the burden attributable to each risk factor exposure compared with the theoretical-minimum-risk exposure. We incorporated uncertainty in disease burden, relative risks, and exposures into our estimates of attributable burden. FINDINGS In 2010, the three leading risk factors for global disease burden were high blood pressure (7·0% [95% uncertainty interval 6·2-7·7] of global DALYs), tobacco smoking including second-hand smoke (6·3% [5·5-7·0]), and alcohol use (5·5% [5·0-5·9]). In 1990, the leading risks were childhood underweight (7·9% [6·8-9·4]), household air pollution from solid fuels (HAP; 7·0% [5·6-8·3]), and tobacco smoking including second-hand smoke (6·1% [5·4-6·8]). Dietary risk factors and physical inactivity collectively accounted for 10·0% (95% UI 9·2-10·8) of global DALYs in 2010, with the most prominent dietary risks being diets low in fruits and those high in sodium. Several risks that primarily affect childhood communicable diseases, including unimproved water and sanitation and childhood micronutrient deficiencies, fell in rank between 1990 and 2010, with unimproved water and sanitation accounting for 0·9% (0·4-1·6) of global DALYs in 2010. However, in most of sub-Saharan Africa childhood underweight, HAP, and non-exclusive and discontinued breastfeeding were the leading risks in 2010, while HAP was the leading risk in south Asia. The leading risk factor in Eastern Europe, most of Latin America, and southern sub-Saharan Africa in 2010 was alcohol use; in most of Asia, North Africa and Middle East, and central Europe it was high blood pressure. Despite declines, tobacco smoking including second-hand smoke remained the leading risk in high-income north America and western Europe. High body-mass index has increased globally and it is the leading risk in Australasia and southern Latin America, and also ranks high in other high-income regions, North Africa and Middle East, and Oceania. INTERPRETATION Worldwide, the contribution of different risk factors to disease burden has changed substantially, with a shift away from risks for communicable diseases in children towards those for non-communicable diseases in adults. These changes are related to the ageing population, decreased mortality among children younger than 5 years, changes in cause-of-death composition, and changes in risk factor exposures. New evidence has led to changes in the magnitude of key risks including unimproved water and sanitation, vitamin A and zinc deficiencies, and ambient particulate matter pollution. 
The extent to which the epidemiological shift has occurred and what the leading risks currently are varies greatly across regions. In much of sub-Saharan Africa, the leading risks are still those associated with poverty and those that affect children.
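A compact sketch of the attributable-burden arithmetic underlying such estimates, using the standard population attributable fraction (PAF) for a categorical exposure relative to a minimum-risk category; the prevalences, relative risks and DALY total below are hypothetical:

```python
import numpy as np

def paf(prevalence, relative_risk):
    """Population attributable fraction for a categorical exposure:
    PAF = sum(p_i * (RR_i - 1)) / (1 + sum(p_i * (RR_i - 1))),
    with risk measured against the theoretical-minimum-risk category."""
    excess = np.sum(np.asarray(prevalence) * (np.asarray(relative_risk) - 1.0))
    return excess / (1.0 + excess)

# Hypothetical two-level exposure for a single risk factor.
p = [0.30, 0.10]      # prevalence of each exposure level
rr = [1.5, 2.5]       # relative risk per level vs the minimum-risk level
total_dalys = 2.5e9   # illustrative DALY total
print(f"attributable DALYs: {paf(p, rr) * total_dalys:.3e}")
```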
Abstract:
Although tactical voting attracts a great deal of attention, it is very hard to measure, as it requires knowledge of both individuals' voting choices and their unobserved preferences. In this article, we present a simple empirical strategy to nonparametrically identify tactical voting patterns directly from balloting results. This approach allows us to study the magnitude and direction of strategic voting and to verify which information voters and parties take into account to determine marginal constituencies. We show that tactical voting played a significant role in the 2010 election, mainly for Liberal Democrat voters supporting Labour. Moreover, our results suggest that voters form their expectations based on a national swing in vote shares rather than on newspaper guides published in the main media outlets or on previous election outcomes. We also present evidence suggesting that campaign spending is not driving tactical voting.
Abstract:
The proliferation of the web presents an unsolved problem of automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters. It does this on a single mid-range machine using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not been previously demonstrated. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing cluster quality where categorical labeling is unavailable or infeasible.
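For intuition, the sketch below clusters a toy corpus with a recursive k-means tree over TF-IDF vectors. This mimics the tree-of-clusters idea that yields very fine-grained leaves, but it is a simplified stand-in, not the EM-tree algorithm itself:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

def tree_cluster(X, indices, branch=4, min_size=2, depth=0, max_depth=3):
    """Recursively split documents with k-means; leaves are fine-grained clusters.
    A simplified stand-in for a tree-structured clusterer, not the EM-tree itself."""
    if len(indices) <= min_size or depth == max_depth:
        return [indices]
    k = min(branch, len(indices))
    labels = KMeans(n_clusters=k, n_init=4, random_state=0).fit_predict(X[indices])
    leaves = []
    for c in np.unique(labels):
        leaves += tree_cluster(X, indices[labels == c], branch, min_size,
                               depth + 1, max_depth)
    return leaves

docs = ["web page about sports", "financial news page", "sports results today",
        "markets and finance", "cooking recipes", "recipe for soup"]  # toy corpus
X = TfidfVectorizer().fit_transform(docs).toarray()
print(len(tree_cluster(X, np.arange(len(docs)))), "leaf clusters")
```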
Abstract:
The main aim of the present study was to estimate size-segregated doses from e-cigarette aerosols as a function of the airway generation number in lung lobes. After a 2-second puff, 7.7×10¹⁰ particles (DTot) with a surface area of 3.6×10³ mm² (STot), and 3.3×10¹⁰ particles with a surface area of 4.2×10³ mm², were deposited in the respiratory system for the electronic and conventional cigarettes, respectively. Alveolar and tracheobronchial deposited doses were compared with those received by non-smoking individuals in Western countries, showing a similar order of magnitude. Total regional doses (DR) in the head and in the lobar tracheobronchial and alveolar regions ranged from 2.7×10⁹ to 1.3×10¹⁰ particles and from 1.1×10⁹ to 5.3×10¹⁰ particles for the electronic and conventional cigarettes, respectively. DR in the right-upper lung lobe was about twice that found in the left-upper lobe, and 20% greater in the right-lower lobe than in the left-lower lobe.
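A minimal sketch of how a size-segregated deposited dose is assembled: inhaled particle number per size bin multiplied by a regional deposition fraction, summed over bins. The bin counts and deposition fractions are illustrative, not the study's measurements:

```python
import numpy as np

# Deposited dose = inhaled particle number per size bin x regional deposition fraction.
# Bin counts and deposition fractions are illustrative, not the study's values.
inhaled_per_bin = np.array([4e10, 3e10, 1e10])   # particles per puff, by size bin
alveolar_df = np.array([0.15, 0.30, 0.10])       # assumed alveolar deposition fractions
alveolar_dose = np.sum(inhaled_per_bin * alveolar_df)
print(f"alveolar deposited dose: {alveolar_dose:.2e} particles")
```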
Abstract:
We explore the impact of delisting on the performance of the momentum trading strategy in Australia. We employ a new dataset of hand-collected delisting returns for all Australian stocks and provide the first study outside the U.S. to jointly examine the effects of delisting and missing returns on the magnitude of momentum profits. In the sample of all stocks, we find that the profitability of momentum strategies depends crucially on the returns of delisted stocks, especially on bankrupt firms. In the sample of large stocks, however, the momentum effect remains strong after controlling for the effect of delisted stocks, in contrast to the U.S. evidence, in which delisting returns can explain 40% of momentum profits. As these large stocks are less exposed to liquidity risks, the momentum effect in Australia is even more puzzling than in the U.S.
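A toy sketch of the key data-handling step, splicing a hand-collected delisting return into a stock's return series before forming winner-minus-loser portfolios; the returns, horizon and portfolio rule are all simplified assumptions:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
# Twelve months of returns for four toy stocks.
returns = pd.DataFrame(rng.normal(0.01, 0.05, (12, 4)), columns=list("ABCD"))
returns.loc[9, "D"] = -0.55      # assumed hand-collected delisting return (bankruptcy)
returns.loc[10:, "D"] = np.nan   # the stock no longer trades after delisting

formation = (1 + returns.iloc[:6]).prod() - 1   # 6-month formation-period returns
winner, loser = formation.idxmax(), formation.idxmin()
holding = returns.iloc[6:]                      # NaNs after delisting are skipped
momentum_profit = holding[winner].mean() - holding[loser].mean()
print(f"winner-minus-loser monthly profit: {momentum_profit:.3f}")
```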
Abstract:
The efficient computation of matrix function vector products has become an important area of research in recent times, driven in particular by two important applications: the numerical solution of fractional partial differential equations and the integration of large systems of ordinary differential equations. In this work we consider a problem that combines these two applications, in the form of a numerical solution algorithm for fractional reaction-diffusion equations that, after spatial discretisation, is advanced in time using the exponential Euler method. We focus on the efficient implementation of the algorithm on Graphics Processing Units (GPUs), as we wish to make use of the increased computational power available with this hardware. We compute the matrix function vector products using the contour integration method in [N. Hale, N. Higham, and L. Trefethen. Computing A^α, log(A), and related matrix functions by contour integrals. SIAM J. Numer. Anal., 46(5):2505–2523, 2008]. Multiple levels of preconditioning are applied to reduce the GPU memory footprint and to further accelerate convergence. We also derive an error bound for the convergence of the contour integral method that allows us to pre-determine the appropriate number of quadrature points. Results are presented that demonstrate the effectiveness of the method for large two-dimensional problems, showing a speedup of more than an order of magnitude compared to a CPU-only implementation.
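A CPU-only sketch of the contour-integral idea on a generic circular contour (the cited paper uses a more sophisticated conformal-map quadrature, and the work above adds GPU implementation and preconditioning): f(A)b is approximated by the trapezoidal rule applied to the Cauchy integral, at the cost of one shifted linear solve per quadrature node:

```python
import numpy as np

def matfunc_times_vector(f, A, b, center, radius, n_quad=128):
    """Approximate f(A) @ b from the Cauchy integral
    f(A) b = (1 / (2 pi i)) * contour integral of f(z) (zI - A)^{-1} b dz
    over a circle enclosing the spectrum of A, via the trapezoidal rule.
    Each node costs one shifted linear solve (the kernel accelerated on GPU)."""
    n = A.shape[0]
    acc = np.zeros(n, dtype=complex)
    for k in range(n_quad):
        theta = 2.0 * np.pi * k / n_quad
        z = center + radius * np.exp(1j * theta)
        acc += f(z) * np.exp(1j * theta) * np.linalg.solve(z * np.eye(n) - A, b)
    return (radius / n_quad) * acc.real  # imaginary parts cancel for real A, b

# Check against a direct eigendecomposition for exp(A) @ b on a small SPD matrix.
rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.normal(size=(6, 6)))
w = rng.uniform(1.0, 5.0, 6)                  # spectrum inside [1, 5]
A = (Q * w) @ Q.T
b = rng.normal(size=6)
approx = matfunc_times_vector(np.exp, A, b, center=3.0, radius=3.0)
exact = (Q * np.exp(w)) @ (Q.T @ b)
print(np.allclose(approx, exact))
```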
Abstract:
This thesis presents the design process and prototyping of a lightweight, modular robotic vehicle for the sustainable intensification of broadacre agriculture. This is achieved by the joint operation of multiple autonomous vehicles to improve energy consumption, reduce labour, and increase efficiency in the application of inputs for the management of crops. The Small Robotic Farm Vehicle (SRFV) is a lightweight and energy-efficient robotic vehicle with a configurable, modular design. It is capable of undertaking a range of agricultural tasks, including fertilising and weed management through mechanical intervention and precision spraying, whilst being more than an order of magnitude lower in weight than existing broadacre agricultural equipment.
Abstract:
Study design Retrospective validation study. Objectives To propose a method to evaluate, from a clinical standpoint, the ability of a finite-element model (FEM) of the trunk to simulate orthotic correction of spinal deformity and to apply it to validate a previously described FEM. Summary of background data Several FEMs of the scoliotic spine have been described in the literature. These models can prove useful in understanding the mechanisms of scoliosis progression and in optimizing its treatment, but their validation has often been lacking or incomplete. Methods Three-dimensional (3D) geometries of 10 patients before and during conservative treatment were reconstructed from biplanar radiographs. The effect of bracing was simulated by modeling displacements induced by the brace pads. Simulated clinical indices (Cobb angle, T1–T12 and T4–T12 kyphosis, L1–L5 lordosis, apical vertebral rotation, torsion, rib hump) and vertebral orientations and positions were compared to those measured in the patients' 3D geometries. Results Errors in clinical indices were of the same order of magnitude as the uncertainties due to 3D reconstruction; for instance, Cobb angle was simulated with a root mean square error of 5.7°, and rib hump error was 5.6°. Vertebral orientation was simulated with a root mean square error of 4.8° and vertebral position with an error of 2.5 mm. Conclusions The methodology proposed here allowed in-depth evaluation of subject-specific simulations, confirming that FEMs of the trunk have the potential to accurately simulate brace action. These promising results provide a basis for ongoing 3D model development, toward the design of more efficient orthoses.
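The validation metric quoted above is a plain root-mean-square error between simulated and measured clinical indices; a minimal sketch with made-up Cobb angles:

```python
import numpy as np

# RMSE between simulated and measured clinical indices; Cobb angles are made up.
measured_cobb = np.array([32.0, 25.0, 41.0, 28.0, 35.0])    # deg, 3D reconstruction
simulated_cobb = np.array([27.0, 31.0, 36.0, 33.0, 30.0])   # deg, FEM simulation
rmse = np.sqrt(np.mean((simulated_cobb - measured_cobb) ** 2))
print(f"Cobb angle RMSE: {rmse:.1f} deg")   # the study reports 5.7 deg on real data
```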
Abstract:
In estuaries and natural water channels, the estimation of velocity and dispersion coefficients is critical to the knowledge of scalar transport and mixing. This estimate is rarely available experimentally at sub-tidal time scales in shallow water channels, where high-frequency sampling is required to capture its spatio-temporal variation. This study estimates Lagrangian integral scales and autocorrelation curves, which are key parameters for obtaining velocity fluctuations and dispersion coefficients, and their spatio-temporal variability from deployments of Lagrangian drifters sampled at 10 Hz over a 4-hour period. The power spectral densities of the velocities between 0.0001 and 0.8 Hz were well fitted by the −5/3 slope predicted by Kolmogorov's similarity hypothesis within the inertial subrange, and were similar to the Eulerian power spectra previously observed within the estuary. The results showed that large velocity fluctuations determine the magnitude of the integral time scale, TL. Overlapping of short segments improved the stability of the estimate of TL by taking advantage of the redundant data included in the autocorrelation function. The integral time scales were about 20 s and varied by up to a factor of 8. These results are essential inputs for spatial binning of velocities, Lagrangian stochastic modelling and single-particle analysis of the tidal estuary.
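A minimal sketch of the integral time scale estimate: integrate the normalised velocity autocorrelation up to its first zero crossing (one common convention; the study additionally stabilises the estimate with overlapping short segments). The 10 Hz velocity series below is a synthetic AR(1) stand-in with a built-in 20 s memory:

```python
import numpy as np

def integral_time_scale(u, dt, max_lag):
    """T_L: integrate the normalised velocity autocorrelation up to its
    first zero crossing (one common convention)."""
    u = u - u.mean()
    acf = np.array([np.mean(u[:len(u) - k] * u[k:]) for k in range(max_lag)])
    acf /= acf[0]
    first_zero = np.argmax(acf < 0) if np.any(acf < 0) else max_lag
    return acf[:first_zero].sum() * dt

# Synthetic 10 Hz drifter velocity: an AR(1) process with a 20 s relaxation time.
dt, n = 0.1, 40000
rng = np.random.default_rng(4)
u = np.zeros(n)
alpha = np.exp(-dt / 20.0)
for i in range(1, n):
    u[i] = alpha * u[i - 1] + rng.normal(0.0, 0.02)
print(f"T_L ~ {integral_time_scale(u, dt, max_lag=2000):.1f} s")  # close to 20 s
```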
Abstract:
This paper asks: to what scale and at what speed does society need to reduce its ecological footprint and improve resource productivity to prevent further overshoot and return within the limits of the earth's ecological life-support systems? How fast do these changes need to be achieved? The paper shows that a large range of studies now find that engineering sustainable solutions need to deliver roughly an order of magnitude improvement in resource productivity (sometimes called a Factor of 10, or a 90% reduction) by 2050 to achieve real and lasting ecological sustainability. This marks a significant challenge for engineers, indeed for all designers and architects, where best practice in engineering sustainable solutions will need to achieve large resource productivity targets. The paper brings together examples of best practice in achieving these large targets from around the world. It also highlights key resources and texts for engineers who wish to learn how to do it. But engineers need to be realistic and patient. Significant barriers exist to achieving Factor 4-10, such as the fact that infrastructure and technology rollover and replacement are often slow. This slow rollover of the built environment and technology is the context within which most engineers work, making the goal of achieving Factor 10 all the more challenging. However, the paper demonstrates that by using best practice in engineering sustainable solutions and by addressing the necessary market, information and institutional failures, it is possible to achieve Factor 10 over the next 50 years. This paper draws on recent publications by The Natural Edge Project (TNEP) and partners, including Hargroves, K. and Smith, M. (eds) (2005) The Natural Advantage of Nations: Business Opportunities, Innovation and Governance for the 21st Century, and the TNEP Engineering Sustainable Solutions Program - Critical Literacies for Engineers Portfolio. Both projects have the significant support of Engineers Australia, its College of Environmental Engineers and the Society of Sustainability and Environmental Engineering.
Abstract:
Background The diagnosis of frailty is based on physical impairments, and clinicians have indicated that early detection is one of the most effective methods for reducing the severity of physical frailty. An alternative to the classical diagnosis could be the instrumentation of classical functional tests, such as the Romberg test or the Timed Get Up and Go test. The aim of this study was (I) to measure and describe the magnitude of accelerometry values in the Romberg test in two groups of frail and non-frail elderly people through instrumentation with the iPhone 4®, (II) to analyse the performances and differences between the study groups, and (III) to analyse the performances and differences within the study groups to characterise accelerometer responses to increasingly difficult challenges to balance. Methods This is a cross-sectional study of 18 subjects over 70 years old: 9 frail subjects and 9 non-frail subjects. The non-parametric Mann–Whitney U test was used for between-group comparisons of mean values derived from the different tasks. The Wilcoxon signed-rank test was used to analyse differences between variants of the test within both independent study groups. Results With eyes closed and feet parallel, the largest differences between groups were found in the maximum peak acceleration in the lateral axis (p < 0.01), the minimum peak acceleration in the lateral axis (p < 0.01) and the minimum peak acceleration of the resultant vector (p < 0.01). With eyes open and feet parallel, the greatest differences between the groups were in the maximum peak acceleration in the lateral axis (p < 0.01), the minimum peak acceleration in the lateral axis (p < 0.01) and the minimum peak acceleration of the resultant vector (p < 0.001). With eyes closed and feet in tandem, the greatest difference between the groups was in the minimum peak acceleration in the lateral axis (p < 0.01). Conclusions The accelerometer fitted in the iPhone 4® can be used to study and analyse the kinematics of the Romberg test in frail and non-frail elderly people. In addition, the accelerometry values differed significantly between the frail and non-frail groups, and values from the accelerometer increased as the test was made more complicated.
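A small sketch of the feature extraction implied by the results section: maximum and minimum peak accelerations per axis and for the resultant vector over one trial; the triaxial signal is synthetic stand-in data, not iPhone 4® recordings:

```python
import numpy as np

# Synthetic triaxial accelerometer trial (lateral, AP, vertical axes), in g.
fs, seconds = 100, 30
rng = np.random.default_rng(5)
acc = rng.normal(0.0, 0.05, (fs * seconds, 3))

lateral = acc[:, 0]
resultant = np.linalg.norm(acc, axis=1)   # magnitude of the resultant vector
features = {
    "max_peak_lateral": lateral.max(),
    "min_peak_lateral": lateral.min(),
    "min_peak_resultant": resultant.min(),
}
print(features)   # such features were compared between groups (Mann-Whitney U)
```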