Abstract:
Summary: This systematic review demonstrates that vitamin D supplementation does not have a significant effect on muscle strength in vitamin D replete adults. However, a limited number of studies demonstrate an increase in proximal muscle strength in adults with vitamin D deficiency.
Introduction: The purpose of this study is to systematically review the evidence on the effect of vitamin D supplementation on muscle strength in adults.
Methods: A comprehensive systematic database search was performed. Inclusion criteria included randomised controlled trials (RCTs) involving adult human participants. All forms and doses of vitamin D supplementation, with or without calcium supplementation, were included and compared with placebo or standard care. Outcome measures included evaluation of strength. Outcomes were compared by calculating standardised mean differences (SMD) and 95% confidence intervals (CI).
Results: Of 52 identified studies, 17 RCTs involving 5,072 participants met the inclusion criteria. Meta-analysis showed no significant effect of vitamin D supplementation on grip strength (SMD −0.02, 95% CI −0.15, 0.11) or proximal lower limb strength (SMD 0.1, 95% CI −0.01, 0.22) in adults with 25(OH)D levels >25 nmol/L. Pooled data from two studies in vitamin D deficient participants (25(OH)D <25 nmol/L) demonstrated a large effect of vitamin D supplementation on hip muscle strength (SMD 3.52, 95% CI 2.18, 4.85).
Conclusion: Based on the studies included in this systematic review, vitamin D supplementation does not have a significant effect on muscle strength in adults with baseline 25(OH)D >25 nmol/L. However, a limited number of studies demonstrate an increase in proximal muscle strength in adults with vitamin D deficiency.
Keywords: Muscle – Muscle fibre – Strength – Vitamin D
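For reference, the standardised mean difference pooled above is conventionally computed as follows (the common Cohen's d form; the abstract does not state which variant of the estimator the review used):

```latex
\mathrm{SMD} = \frac{\bar{x}_{\mathrm{suppl}} - \bar{x}_{\mathrm{placebo}}}{s_{\mathrm{pooled}}},
\qquad
s_{\mathrm{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^{2} + (n_2 - 1)\,s_2^{2}}{n_1 + n_2 - 2}}
```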
Abstract:
Early detection surveillance programs aim to find invasions of exotic plant pests and diseases before they are too widespread to eradicate. However, the value of these programs can be difficult to justify when no positive detections are made. To demonstrate the value of the pest-absence information provided by these programs, we use a hierarchical Bayesian framework to model estimates of incursion extent with and without surveillance. A model for the latent invasion process provides the baseline against which surveillance data are assessed. Ecological knowledge and pest management criteria are introduced into the model using informative priors for invasion parameters. Observation models assimilate information from spatio-temporal presence/absence data to accommodate imperfect detection and generate posterior estimates of pest extent. When applied to an early detection program operating in Queensland, Australia, the framework demonstrates that this typical surveillance regime provides a modest reduction in the estimated probability that a surveyed district is infested. More importantly, the model suggests that early detection surveillance programs can provide a dramatic reduction in the putative area of incursion and therefore offer a substantial benefit to incursion management. By mapping spatial estimates of the point probability of infestation, the model identifies where future surveillance resources can be most effectively deployed.
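A minimal sketch of the core mechanism (not the authors' hierarchical spatio-temporal model): Bayes' rule shows how repeated negative surveys with imperfect detection reduce the posterior probability that a district is infested. The prior and sensitivity values below are illustrative assumptions.

```python
def posterior_infested(prior: float, sensitivity: float, n_negative: int) -> float:
    """Posterior P(infested) after n independent negative surveys.

    Assumes no false positives, so
    P(all negative | infested) = (1 - sensitivity) ** n and
    P(all negative | not infested) = 1.
    """
    p_neg_if_infested = (1.0 - sensitivity) ** n_negative
    evidence = prior * p_neg_if_infested + (1.0 - prior)
    return prior * p_neg_if_infested / evidence

# Illustrative values: 5% prior probability of infestation,
# 30% chance each survey detects an infestation that is present.
for n in (0, 1, 5, 10):
    print(f"{n:2d} negative surveys -> P(infested) = {posterior_infested(0.05, 0.30, n):.4f}")
```

Each negative survey shifts probability mass away from infestation; the modest per-survey reduction mirrors the behaviour the abstract reports for a single surveyed district.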
Abstract:
In this work, a novel hybrid approach is presented that uses a combination of both time domain and frequency domain solution strategies to predict the power distribution within a lossy medium loaded within a waveguide. The problem of determining the electromagnetic fields evolving within the waveguide and the lossy medium is decoupled into two components: one for computing the fields in the waveguide including a coarse representation of the medium (the exterior problem), and one for a detailed resolution of the lossy medium (the interior problem). A previously documented cell-centred Maxwell's equations numerical solver can be used to resolve the exterior problem accurately in the time domain. Thereafter, the discrete Fourier transform can be applied to the computed field data around the interface of the medium to estimate the frequency domain boundary condition information that is needed for closure of the interior problem. Since only the electric fields are required to compute the power distribution generated within the lossy medium, the interior problem can be resolved efficiently using the Helmholtz equation. A consistent cell-centred finite-volume method is then used to discretise this equation on a fine mesh, and the resulting large, sparse, complex matrix system is solved for the required electric field using the GMRES iterative Krylov subspace solver. It will be shown that the hybrid solution methodology works well when a single frequency is considered in the evaluation of the Helmholtz equation in a single mode waveguide. A restriction of the scheme is that the material needs to be sufficiently lossy, so that any penetrating waves in the material are absorbed.
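A toy one-dimensional analogue of this pipeline (illustrative only; the paper's scheme is a three-dimensional cell-centred finite-volume discretisation, and all parameter values below are assumptions): extract the operating-frequency component of a time-domain interface signal with the DFT, then solve a 1-D Helmholtz problem for the interior field with GMRES.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

# Step 1: DFT of time-domain samples at the interface (stands in for the
# exterior Maxwell solve) to get the complex boundary amplitude at f0.
fs, f0 = 20e9, 2.45e9                 # sample rate and operating frequency [Hz]
t = np.arange(2000) / fs              # 2000 samples -> f0 falls exactly on bin 245
e_iface = np.cos(2 * np.pi * f0 * t)  # synthetic interface field
spectrum = np.fft.rfft(e_iface)
k_bin = np.argmin(np.abs(np.fft.rfftfreq(t.size, 1 / fs) - f0))
E_bc = 2 * spectrum[k_bin] / t.size   # complex amplitude of the f0 component

# Step 2: interior problem E'' + k^2 E = 0 on a fine 1-D grid, with the
# DFT-derived Dirichlet value on the left boundary and zero on the right.
n, L = 400, 0.05                      # grid points and slab thickness [m]
h = L / (n + 1)
eps_r = 50.0 - 15.0j                  # illustrative lossy permittivity
k2 = (2 * np.pi * f0 / 3e8) ** 2 * eps_r
A = diags([1 / h**2, -2 / h**2 + k2, 1 / h**2], [-1, 0, 1],
          shape=(n, n), format="csc")
b = np.zeros(n, dtype=complex)
b[0] = -E_bc / h**2                   # fold the Dirichlet value into the RHS

E, info = gmres(A, b, restart=n)      # info == 0 signals convergence
# Dissipated power density ~ (1/2) * omega * eps0 * eps_r'' * |E|^2
power = 0.5 * (2 * np.pi * f0) * 8.854e-12 * abs(eps_r.imag) * np.abs(E) ** 2
```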
Abstract:
An initialisation process is a key component in modern stream cipher design. A well-designed initialisation process should ensure that each key-IV pair generates a different key stream. In this paper, we analyse two ciphers, A5/1 and Mixer, for which this does not happen due to state convergence. We show how the state convergence problem occurs and estimate the effective key-space in each case.
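A toy illustration of state convergence (the update function below is a stand-in, not the actual A5/1 or Mixer initialisation): modelling one initialisation round as a fixed random function on a 16-bit state shows how a non-injective update merges distinct key-IV states and shrinks the effective state space round by round.

```python
import random

SIZE = 1 << 16                        # toy 16-bit state space
rng = random.Random(0)
update = [rng.randrange(SIZE) for _ in range(SIZE)]  # fixed, non-injective round

states = set(range(SIZE))             # every possible initial key-IV state
for rnd in range(1, 9):
    states = {update[s] for s in states}
    print(f"after round {rnd}: {len(states):5d} distinct states "
          f"({len(states) / SIZE:.1%} of the nominal space)")
# States that converge can never again be distinguished by the keystream
# they generate, so the effective key space is smaller than the nominal one.
```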
Abstract:
In this paper I discuss a recent exchange of articles between Hugh McLachlan and John Coggon on the relationship between omissions, causation and moral responsibility. My aim is to contribute to their debate by isolating a presupposition I believe they both share, and by questioning that presupposition. The presupposition is that, at any given moment, there are countless things that I am omitting to do. This leads them both to give a distorted account of the relationship between causation and moral or (as the case may be) legal responsibility, and, in the case of Coggon, to claim that the law’s conception of causation is a fiction based on policy. Once it is seen that this presupposition is faulty, we can attain a more accurate view of the logical relationship between causation and moral responsibility in the case of omissions. This is important because it will enable us, in turn, to understand why the law continues to regard omissions as different, both logically and morally, from acts, and why the law seeks to track that logical and moral difference in the legal distinction it draws between withholding life-sustaining measures and euthanasia.
Abstract:
We present a mass-conservative vertex-centred finite volume method for efficiently solving the mixed form of Richards' equation in heterogeneous porous media. The spatial discretisation is particularly well suited to heterogeneous media because it produces consistent flux approximations at quadrature points where material properties are continuous. Combined with the method of lines, the spatial discretisation gives a set of differential algebraic equations amenable to solution using higher-order implicit solvers. We investigate the solution of the mixed form using a Jacobian-free inexact Newton solver, which, compared with the pressure-head form, requires solving for one extra variable at each node in the mesh. By exploiting the structure of the Jacobian for the mixed form, the size of the preconditioner is reduced to that for the pressure-head form, and there is minimal computational overhead in solving the mixed form. The proposed formulation is tested on two challenging test problems. The solutions from the new formulation conserve mass at least one order of magnitude more accurately than a pressure-head formulation, and the higher-order temporal integration significantly improves both the mass balance and the computational efficiency of the solution.
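A minimal sketch of the Jacobian-free Newton-Krylov idea referred to above (illustrative, without the paper's preconditioning or DAE time integration): Krylov solvers such as GMRES need only Jacobian-vector products, which a forward difference of the residual supplies without ever forming the Jacobian.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk_solve(residual, u0, tol=1e-10, max_newton=50):
    """Solve residual(u) = 0 with Newton iterations and matrix-free GMRES.

    The Jacobian-vector product is approximated by
        J(u) v ~ (residual(u + eps v) - residual(u)) / eps,
    so only residual evaluations are ever needed.
    """
    u = u0.astype(float).copy()
    for _ in range(max_newton):
        r = residual(u)
        if np.linalg.norm(r) < tol:
            break
        def jv(v, u=u, r=r):  # defaults freeze the current Newton iterate
            eps = 1e-7 * (1.0 + np.linalg.norm(u)) / max(np.linalg.norm(v), 1e-30)
            return (residual(u + eps * v) - r) / eps
        J = LinearOperator((u.size, u.size), matvec=jv)
        du, _ = gmres(J, -r)          # inexact inner solve is fine for Newton
        u += du
    return u

# Tiny usage example on a 2x2 nonlinear system (solution is u = (1, 2)):
f = lambda u: np.array([u[0] ** 2 + u[1] - 3.0, u[0] + u[1] ** 2 - 5.0])
print(jfnk_solve(f, np.array([1.0, 1.0])))
```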
Abstract:
Expert elicitation is the process of determining what expert knowledge is relevant to support a quantitative analysis and then eliciting this information in a form that supports analysis or decision-making. The credibility of the overall analysis, therefore, relies on the credibility of the elicited knowledge. This, in turn, is determined by the rigor of the design and execution of the elicitation methodology, as well as by its clear communication to ensure transparency and repeatability. It is difficult to establish rigor when the elicitation methods are not documented, as often occurs in ecological research. In this chapter, we describe software that can be combined with a well-structured elicitation process to improve the rigor of expert elicitation and the documentation of the results.
Abstract:
In this editorial letter, we provide the readers of Information Systems and e-Business Management with an introduction to Business Process Management and the challenges of empirical research in this field. We then briefly describe selected examples of current research efforts in this field and how the papers accepted for this special issue contribute to extending our body of knowledge.
Abstract:
This panel discusses the impact of Green IT on information systems and how information systems can meet environmental challenges and ensure sustainability. We wish to highlight the role of green business processes, and specifically the contributions that the management of these processes can make in leveraging the transformative power of IS in order to create an environmentally sustainable society. The management of business processes has typically been thought of in terms of business improvement along the dimensions of time, cost, quality, and flexibility – the so-called ‘devil’s quadrangle’. Contemporary organizations, however, are increasingly becoming aware of the need to create more sustainable, IT-enabled business processes that are also successful in terms of their economic, ecological, and social impact. Exemplary ecological key performance indicators that increasingly find their way onto the agenda of managers include carbon emissions, data center energy use, and renewable energy consumption (SAP 2010). The key challenge, therefore, is to extend the devil’s quadrangle to a devil’s pentagon, with sustainability as an important fifth dimension in process change.
Abstract:
Signal Processing (SP) is a subject of central importance in engineering and the applied sciences. Signals are information-bearing functions, and SP deals with the analysis and processing of signals (by dedicated systems) to extract or modify information. Signal processing is necessary because signals normally contain information that is not readily usable or understandable, or which might be disturbed by unwanted sources such as noise. Although many signals are non-electrical, it is common to convert them into electrical signals for processing. Most natural signals (such as acoustic and biomedical signals) are continuous functions of time, and these signals are referred to as analog signals. Prior to the onset of digital computers, Analog Signal Processing (ASP) and analog systems were the only tools for dealing with analog signals. Although ASP and analog systems are still widely used, Digital Signal Processing (DSP) and digital systems are attracting more attention, due in large part to the significant advantages of digital systems over their analog counterparts. These advantages include superiority in performance, speed, reliability, efficiency of storage, size and cost. In addition, DSP can solve problems that cannot be solved using ASP, such as the spectral analysis of multicomponent signals, adaptive filtering, and operations at very low frequencies. Following the developments in engineering that occurred in the 1980s and 1990s, DSP became one of the world's fastest growing industries. Since that time DSP has not only impacted traditional areas of electrical engineering, but has had far-reaching effects on other domains that deal with information, such as economics, meteorology, seismology, bioengineering, oceanology, communications, astronomy, radar engineering, control engineering and various other applications. This book is based on the Lecture Notes of Associate Professor Zahir M. Hussain at RMIT University (Melbourne, 2001-2009), the research of Dr. Amin Z. Sadik (at QUT & RMIT, 2005-2008), and the notes of Professor Peter O'Shea at Queensland University of Technology. Part I of the book addresses the representation of analog and digital signals and systems in the time domain and in the frequency domain. The core topics covered are convolution, transforms (Fourier, Laplace, Z, discrete-time Fourier, and discrete Fourier), filters, and random signal analysis. There is also a treatment of some important applications of DSP, including signal detection in noise, radar range estimation, banking and financial applications, and audio effects production. The design and implementation of digital systems (such as integrators, differentiators, resonators and oscillators) are also considered, along with the design of conventional digital filters. Part I is suitable for an elementary course in DSP. Part II (which is suitable for an advanced signal processing course) considers selected signal processing systems and techniques. Core topics covered are the Hilbert transformer, binary signal transmission, phase-locked loops, sigma-delta modulation, noise shaping, quantization, adaptive filters, and non-stationary signal analysis. Part III presents some selected advanced DSP topics. We hope that this book will contribute to the advancement of engineering education and that it will serve as a general reference book on digital signal processing.
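As a small taste of the Part I topics (an illustrative snippet, not an excerpt from the book): convolving a noisy tone with a moving-average filter and identifying it with the discrete Fourier transform.

```python
import numpy as np

fs = 1000                                   # sample rate [Hz]
t = np.arange(0, 1, 1 / fs)
clean = np.sin(2 * np.pi * 50 * t)          # 50 Hz tone
noisy = clean + 0.5 * np.random.default_rng(0).standard_normal(t.size)

kernel = np.ones(5) / 5                     # moving-average low-pass filter
smoothed = np.convolve(noisy, kernel, mode="same")   # convolution

spectrum = np.abs(np.fft.rfft(smoothed))    # discrete Fourier transform
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
print(f"dominant component: {freqs[spectrum.argmax()]:.0f} Hz")  # expect ~50 Hz
```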
Abstract:
Operation in urban environments creates unique challenges for research in autonomous ground vehicles. Due to the presence of tall trees and buildings in close proximity to traversable areas, GPS outage is likely to be frequent and physical hazards pose real threats to autonomous systems. In this paper, we describe a novel autonomous platform developed by the Sydney-Berkeley Driving Team for entry into the 2007 DARPA Urban Challenge competition. We report empirical results analyzing the performance of the vehicle while navigating a 560-meter test loop multiple times in an actual urban setting with severe GPS outage. We show that our system is robust against failure of global position estimates and can reliably traverse standard two-lane road networks using vision for localization. Finally, we discuss ongoing efforts in fusing vision data with other sensing modalities.
Abstract:
Human facial expression is a complex process characterized by dynamic, subtle and regional emotional features. State-of-the-art approaches to facial expression recognition (FER) have not fully utilized these features to improve recognition performance. This paper proposes an approach to overcome this limitation using patch-based ‘salient’ Gabor features. A set of 3D patches is extracted to represent the subtle and regional features, and then fed into patch-matching operations to capture the dynamic features. Experimental results show a significant performance improvement of the proposed approach due to the use of the dynamic features. A performance comparison with previous work also confirms that the proposed approach achieves the highest correct recognition rate (CRR) reported to date on the JAFFE database and a top-level performance on the Cohn-Kanade (CK) database.
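A minimal sketch of patch-based Gabor feature extraction (a generic reconstruction, not the authors' pipeline; the filter parameters, patch size and stand-in image are illustrative assumptions):

```python
import numpy as np
from scipy.signal import fftconvolve

def gabor_kernel(ksize=15, sigma=3.0, theta=0.0, wavelength=8.0):
    """Real part of a 2-D Gabor filter (illustrative parameter values)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)   # carrier along orientation theta
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

def patch_gabor_features(image, patch=16, orientations=4):
    """Mean Gabor response magnitude per patch and orientation."""
    feats = []
    for k in range(orientations):
        resp = np.abs(fftconvolve(image, gabor_kernel(theta=k * np.pi / orientations),
                                  mode="same"))
        for i in range(0, image.shape[0] - patch + 1, patch):
            for j in range(0, image.shape[1] - patch + 1, patch):
                feats.append(resp[i:i + patch, j:j + patch].mean())
    return np.asarray(feats)

face = np.random.default_rng(0).random((64, 64))  # stand-in for a face image
print(patch_gabor_features(face).shape)           # 4 orientations x 16 patches = (64,)
```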