588 results for CHANDRASEKHAR MASS MODELS
Abstract:
In the present study, we examined the associations of early nutrition with adult lean body mass (LBM) and muscle strength in a birth cohort that was established to assess the long-term impact of a nutrition program. Participants (n = 1,446, 32% female) were born near Hyderabad, India, in 29 villages from 1987 to 1990, during which time only the intervention villages (n = 15) had a government program that offered balanced protein-calorie supplementation to pregnant women and children. Participants’ LBM and appendicular skeletal muscle mass were measured using dual energy x-ray absorptiometry; grip strength and information on lifestyle indicators, including diet and physical activity level, were also obtained. Ages (mean = 20.3 years) and body mass indexes (weight (kg)/height (m)²; mean = 19.5) of participants in the 2 groups were similar. Current dietary energy intake was higher in the intervention group. Unadjusted LBM and grip strength were similar in the 2 groups. After adjustment for potential confounders, the intervention group had lower LBM (β = −0.75; P = 0.03), appendicular skeletal muscle mass, and grip strength than did controls, but these differences were small in magnitude (<0.1 standard deviation). Multivariable regression analyses showed that current socioeconomic position, energy intake, and physical activity level had a positive association with adult LBM and muscle strength. This study could not detect a “programming” effect of early nutrition supplementation on adult LBM and muscle strength.
Abstract:
Many newspapers and magazines have added “social media features” to their web-based information services in order to allow users to participate in the production of content. This study examines the specific impact of firms’ investment in social media features on their online business models. We conduct a comparative case study of four Scandinavian print media firms that have added social media features to their online services. We show how social media features lead to online business model innovation, particularly linked to the firms’ value propositions. The paper discusses the repercussions of this transformation on firms’ relationships with consumers and with traditional content contributors. The modified value proposition also requires firms to acquire new competences in order to reap the full benefit of their social media investments. We show that the firms have been unable to do so, since they have not allowed the social media features to affect their online revenue models.
Abstract:
A typology of music distribution models is proposed, consisting of the ownership model, the access model, and the context model. These models are not substitutes for each other and may co-exist, serving different market niches. The paper argues that the economic value created from recorded music is increasingly based on context rather than on ownership. During this process, access-based services temporarily generate economic value, but such services are destined to eventually become commoditised.
Abstract:
The ability of poly(acrylic acid) (PAA) with different end groups and molar masses prepared by Atom Transfer Radical Polymerization (ATRP) to inhibit the formation of calcium carbonate scale at low and elevated temperatures was investigated. Inhibition of CaCO3 deposition was affected by the hydrophobicity of the end groups of PAA, with the greatest inhibition seen for PAA with hydrophobic end groups of moderate size (6–10 carbons). The morphologies of CaCO3 crystals were significantly distorted in the presence of these PAAs. The smallest morphological change was seen in the presence of PAA with long hydrophobic end groups (16 carbons), and the relative inhibition observed for all species was in the same order at 30 °C and 100 °C. As well as distorting morphologies, the scale inhibitors appeared to stabilize the less thermodynamically favorable polymorph, vaterite, to a degree proportional to their ability to inhibit precipitation.
Abstract:
Whole-image descriptors have recently been shown to be remarkably robust to perceptual change, especially compared to local features. However, whole-image-based localization systems typically rely on heuristic methods for determining appropriate matching thresholds in a particular environment. These environment-specific tuning requirements and the lack of a meaningful interpretation of the arbitrary thresholds limit the general applicability of such systems. In this paper we present a Bayesian model of probability for whole-image descriptors that can be seamlessly integrated into localization systems designed for probabilistic visual input. We demonstrate this method using CAT-Graph, an appearance-based visual localization system originally designed for FAB-MAP-style probabilistic input. We show that using whole-image descriptors as visual input extends CAT-Graph’s functionality to environments that experience a greater amount of perceptual change. We also present a method of estimating whole-image probability models in an online manner, removing the need for a prior training phase. We show that this online, automated training method can perform comparably to pre-trained, manually tuned local descriptor methods.
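As a rough illustration of what a probabilistic treatment of whole-image descriptors might look like, the sketch below converts a descriptor difference score into a match probability via Bayes' rule. The Gaussian score distributions, prior, and all parameter values are illustrative assumptions, not the model described in the abstract:

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def match_probability(score, prior_match=0.01,
                      mu_match=0.2, sigma_match=0.1,
                      mu_nonmatch=0.8, sigma_nonmatch=0.2):
    """P(match | score) for a whole-image descriptor difference score.

    Assumes (illustratively) Gaussian score distributions for matching
    and non-matching image pairs, combined with Bayes' rule.
    """
    l_match = gaussian_pdf(score, mu_match, sigma_match) * prior_match
    l_non = gaussian_pdf(score, mu_nonmatch, sigma_nonmatch) * (1 - prior_match)
    return l_match / (l_match + l_non)

# A low difference score should yield a higher match probability than a high one.
p_low = match_probability(0.15)
p_high = match_probability(0.9)
```

Unlike a hard threshold, the output is a calibrated probability that a probabilistic back end such as CAT-Graph can consume directly.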
Abstract:
The acceptance of broadband ultrasound attenuation for the assessment of osteoporosis suffers from a limited understanding of ultrasound wave propagation through cancellous bone. It has recently been proposed that ultrasound wave propagation can be described by a concept of parallel sonic rays. This concept approximates the detected transmission signal as the superposition of all sonic rays that travel directly from the transmitting to the receiving transducer. The transit time of each ray is determined by the proportions of bone and marrow through which it propagates. An ultrasound transit time spectrum describes the proportion of sonic rays having a particular transit time, effectively describing the lateral inhomogeneity of transit times over the surface of the receiving ultrasound transducer. The aim of this study was to provide a proof of concept that a transit time spectrum may be derived from digital deconvolution of input and output ultrasound signals. We applied the active-set method deconvolution algorithm to determine the ultrasound transit time spectra in the three orthogonal directions of four cancellous bone replica samples and compared the experimental data with predictions from computer simulation. The agreement between experimental and predicted ultrasound transit time spectra, derived from Bland–Altman analysis, ranged from 92% to 99%, thereby supporting the concept of parallel sonic rays for ultrasound propagation in cancellous bone. In addition to further validating the parallel sonic ray concept, this technique offers the opportunity for quantitative characterisation of the material and structural properties of cancellous bone, not previously available using ultrasound.
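The deconvolution step can be illustrated with a toy example. The sketch below recovers a non-negative transit time spectrum from synthetic input and output signals, using a simple projected-gradient non-negative least squares loop in place of the active-set algorithm the study applied; the pulse shape, spectrum, and sizes are made up:

```python
import numpy as np

def nonneg_deconvolve(x, y, n_taps, iters=2000, lr=0.1):
    """Recover h >= 0 such that y ~= conv(x, h), via projected gradient descent."""
    A = np.zeros((len(y), n_taps))           # Toeplitz convolution matrix: y = A @ h
    for j in range(n_taps):
        A[j:j + len(x), j] = x
    h = np.zeros(n_taps)
    for _ in range(iters):
        grad = A.T @ (A @ h - y)             # gradient of 0.5 * ||A h - y||^2
        h = np.maximum(h - lr * grad, 0.0)   # project onto the non-negative orthant
    return h

# Synthetic transit-time spectrum with two peaks, convolved with a short pulse.
x = np.array([0.0, 1.0, 0.5, 0.25])          # input ultrasound pulse (illustrative)
h_true = np.zeros(10)
h_true[2], h_true[6] = 1.0, 0.5              # two transit-time components
y = np.convolve(x, h_true)                   # simulated received signal
h_est = nonneg_deconvolve(x, y, n_taps=10)
```

With noise-free data the solver recovers both peaks; real signals would need regularisation, which is where the active-set formulation earns its keep.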
Abstract:
In this paper, an approach is presented for identifying a reduced model of the coherent areas in power systems, using phasor measurement units to represent the inter-area oscillations of the system. Generators that remain coherent over a wide range of operating conditions form the areas, and the reduced model is obtained by representing each area with an equivalent machine. The reduced nonlinear model is then identified from the data obtained from the measurement units. Simulations on three test systems show the high accuracy of the identification process.
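The aggregation step can be sketched with the standard coherency-based rule: the equivalent machine's inertia is the sum of the member inertias, and its rotor angle is the inertia-weighted mean. The inertia constants and angles below are illustrative per-unit values, not data from the paper's test systems:

```python
def aggregate_area(inertias, angles):
    """Aggregate one coherent area's generators into an equivalent machine.

    Equivalent inertia = sum of member inertias;
    equivalent rotor angle = inertia-weighted mean of member angles.
    """
    h_eq = sum(inertias)
    delta_eq = sum(h * d for h, d in zip(inertias, angles)) / h_eq
    return h_eq, delta_eq

# Three coherent generators (illustrative inertia constants and angles in rad).
h_eq, delta_eq = aggregate_area([3.0, 5.0, 2.0], [0.10, 0.12, 0.11])
```

Repeating this per area yields the small equivalent system whose parameters can then be fitted to the phasor measurement data.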
Abstract:
Exact solutions of partial differential equation models describing the transport and decay of single and coupled multispecies problems can provide insight into the fate and transport of solutes in saturated aquifers. Most previous analytical solutions are based on integral transform techniques, which restrict the initial condition: its choice determines whether or not the inverse transform can be calculated exactly. In this work we describe and implement a technique that produces exact solutions for single and multispecies reactive transport problems with more general, smooth initial conditions. We achieve this by inverting the Laplace transform with a different method that produces a power series solution. To demonstrate the utility of this technique, we apply it to two example problems with initial conditions that cannot be solved exactly using traditional transform techniques.
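To give a flavour of a power series solution in time, the sketch below solves the single-species advection-decay problem c_t = -v c_x - k c for a polynomial initial condition by repeatedly applying the spatial operator (term_{n+1} = L term_n / (n+1) with L = -(v d/dx + k)), and checks the truncated series against the known exact solution c(x, t) = f(x - v t) exp(-k t). This is an illustrative analogue of the series idea, not the paper's method:

```python
import numpy as np

def series_solution(coeffs, v, k, x, t, n_terms=30):
    """Truncated power series in t for c_t = -v c_x - k c, c(x,0) = poly(coeffs).

    coeffs: polynomial coefficients of the initial condition, highest degree first.
    """
    c = np.zeros_like(np.atleast_1d(x), dtype=float)
    term = np.array(coeffs, dtype=float)         # holds L^n f / n!
    for n in range(n_terms):
        c += np.polyval(term, x) * t ** n
        deriv = np.polyder(term)                 # d/dx of the current polynomial
        # apply L = -(v d/dx + k) and divide by (n + 1) for the factorial
        term = -(v * np.pad(deriv, (1, 0)) + k * term) / (n + 1)
    return c

# Check against the exact solution c(x, t) = f(x - v t) * exp(-k t).
v, k, t = 1.0, 0.5, 0.8
x = np.linspace(-2.0, 2.0, 5)
f = [1.0, 0.0, 0.0]                              # f(x) = x^2 (smooth, illustrative)
approx = series_solution(f, v, k, x, t)
exact = (x - v * t) ** 2 * np.exp(-k * t)
```

The factorial in the recurrence makes the series converge rapidly, which is what makes truncation practical.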
Abstract:
Many model-based investigation techniques, such as sensitivity analysis, optimization, and statistical inference, require a large number of model evaluations to be performed at different input and/or parameter values. This limits the application of these techniques to models that can be implemented in computationally efficient computer codes. Emulators, by providing efficient interpolation between outputs of deterministic simulation models, can considerably extend the field of applicability of such computationally demanding techniques. So far, the dominant technique for developing emulators has been to use priors in the form of Gaussian stochastic processes (GASP) that are conditioned on a design data set of inputs and corresponding model outputs. In the context of dynamic models, this approach has two essential disadvantages: (i) these emulators do not consider our knowledge of the structure of the model, and (ii) they run into numerical difficulties if there are a large number of closely spaced input points, as is often the case in the time dimension of dynamic models. To address both of these problems, a new concept of developing emulators for dynamic models is proposed. This concept is based on a prior that combines a simplified linear state space model of the temporal evolution of the dynamic model with Gaussian stochastic processes for the innovation terms as functions of model parameters and/or inputs. These innovation terms are intended to correct the error of the linear model at each output step. Conditioning this prior on the design data set is done by Kalman smoothing. This leads to an efficient emulator that, because it incorporates our knowledge of the dominant mechanisms built into the simulation model, can be expected to outperform purely statistical emulators, at least when the design data set is small. The feasibility and potential difficulties of the proposed approach are demonstrated by application to a simple hydrological model.
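The conditioning step can be sketched in its simplest scalar form: a linear state space prior x_{t+1} = a x_t + w_t is conditioned on noisy design outputs y_t = x_t + e_t with a Kalman filter followed by a Rauch-Tung-Striebel smoother. The GP-modelled innovation terms of the full emulator are omitted, and all parameters and data below are synthetic:

```python
import numpy as np

def kalman_rts(y, a=0.9, q=0.1, r=0.5, x0=0.0, p0=1.0):
    """Kalman filter + RTS smoother for x_{t+1} = a x_t + N(0,q), y_t = x_t + N(0,r)."""
    n = len(y)
    xf = np.zeros(n); pf = np.zeros(n)       # filtered means / variances
    xp = np.zeros(n); pp = np.zeros(n)       # one-step predictions
    x_prev, p_prev = x0, p0
    for t in range(n):                       # forward (filtering) pass
        xp[t] = a * x_prev
        pp[t] = a * a * p_prev + q
        gain = pp[t] / (pp[t] + r)
        xf[t] = xp[t] + gain * (y[t] - xp[t])
        pf[t] = (1 - gain) * pp[t]
        x_prev, p_prev = xf[t], pf[t]
    xs = xf.copy(); ps = pf.copy()           # backward (RTS smoothing) pass
    for t in range(n - 2, -1, -1):
        c = pf[t] * a / pp[t + 1]
        xs[t] = xf[t] + c * (xs[t + 1] - xp[t + 1])
        ps[t] = pf[t] + c * c * (ps[t + 1] - pp[t + 1])
    return xs, ps

# Synthetic design data generated from the same AR(1) state space model.
rng = np.random.default_rng(1)
n = 200
truth = np.zeros(n)
for t in range(1, n):
    truth[t] = 0.9 * truth[t - 1] + rng.normal(0.0, 0.1 ** 0.5)
y = truth + rng.normal(0.0, 0.5 ** 0.5, size=n)
xs, ps = kalman_rts(y)
```

The smoother runs in linear time in the number of output points, which is why closely spaced time points pose no numerical difficulty here, unlike for a plain GASP emulator.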
Abstract:
A predictive model of terrorist activity is developed by examining the daily number of terrorist attacks in Indonesia from 1994 through 2007. The dynamic model employs a shot noise process to explain the self-exciting nature of the terrorist activities. This estimates the probability of future attacks as a function of the times since the past attacks. In addition, the excess of nonattack days coupled with the presence of multiple coordinated attacks on the same day compelled the use of hurdle models to jointly model the probability of an attack day and corresponding number of attacks. A power law distribution with a shot noise driven parameter best modeled the number of attacks on an attack day. Interpretation of the model parameters is discussed and predictive performance of the models is evaluated.
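The self-exciting idea can be sketched as a shot-noise conditional intensity, lambda(t) = mu + sum over past attack times t_i of alpha * exp(-beta * (t - t_i)), so each attack temporarily raises the predicted rate of future attacks. The parameter values and attack times below are illustrative, not estimates from the Indonesian data:

```python
import math

def intensity(t, attack_times, mu=0.05, alpha=0.3, beta=0.5):
    """Shot-noise conditional intensity: baseline plus decaying excitation."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in attack_times if ti < t)

attacks = [10.0, 11.0, 30.0]                    # past attack days (made up)
just_after_cluster = intensity(12.0, attacks)   # shortly after two attacks
long_quiet = intensity(25.0, attacks)           # excitation has mostly decayed
# Illustrative hurdle-style first stage: probability the day has any attack,
# here derived from the intensity as for a Poisson count.
p_attack_day = 1.0 - math.exp(-just_after_cluster)
```

In the paper's hurdle formulation, a second stage would then model the number of attacks on an attack day, with a power law distribution driven by the shot noise.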
Abstract:
An important aspect of decision support systems is applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problems is the modelling of incidence, such as fires or disease outbreaks. Models of incidence known as point processes or Cox processes are particularly challenging because they are ‘doubly stochastic’, i.e., obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render interpretable, complex spatio-temporal forecasts. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a large discretized grid and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
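The doubly stochastic structure on a discretized grid can be sketched generatively: a log-Gaussian Cox process first draws a Gaussian random field g over the grid cells, then generates counts as Poisson with mean cell_area * exp(g). The 1-D grid, squared-exponential kernel, and parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 25)                # 1-D grid of cell centres
ell = 0.1                                    # kernel length scale (assumed)
# Squared-exponential covariance between grid cells.
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
# First layer of randomness: latent log-intensity field from the Gaussian prior
# (small jitter added for numerical positive-definiteness).
g = rng.multivariate_normal(np.full(25, 1.0), K + 1e-8 * np.eye(25))
# Second layer: counts per cell are Poisson with mean cell_area * exp(g).
cell_area = x[1] - x[0]
counts = rng.poisson(cell_area * np.exp(g))
```

Inference reverses this: given observed counts, the posterior over g on the grid is what gets rendered as a predictive distribution.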
Abstract:
Recent expansion of research in the field of lipidomics has been driven by the development of new mass spectrometric tools and protocols for the identification and quantification of molecular lipids in complex matrices. Although there are similarities between lipidomics and allied mass spectrometry-based fields (e.g., proteomics), lipids present some unique advantages and challenges for mass spectrometric analysis. The application of electrospray ionization to crude lipid extracts without prior fractionation (the so-called shotgun approach) is one such example, as it has perhaps been more successfully applied in lipidomics than in any other discipline. Conversely, the diverse molecular structures of lipids mean that collision-induced dissociation alone may be limited in providing unique descriptions of complex lipid structures, and the development of additional, complementary tools for ion activation and analysis is required to overcome these challenges. In this article, we discuss the state of the art in lipid mass spectrometry and highlight several areas in which current approaches are deficient and further innovation is required.
Abstract:
The deposition of biological material (biofouling) onto polymeric contact lenses is thought to be a major contributor to lens discomfort and hence discontinuation of wear. We describe a method to characterize lipid deposits directly from worn contact lenses utilizing liquid extraction surface analysis coupled to tandem mass spectrometry (LESA-MS/MS). This technique effected facile and reproducible extraction of lipids from the contact lens surfaces and identified lipid molecular species representing all major classes present in human tear film. Our data show that LESA-MS/MS is a rapid and comprehensive technique for the characterization of lipid-related biofouling on polymer surfaces.
Abstract:
Mass spectrometry is now an indispensable tool for lipid analysis and is arguably the driving force in the renaissance of lipid research. In its various forms, mass spectrometry is uniquely capable of resolving the extensive compositional and structural diversity of lipids in biological systems. Furthermore, it provides the ability to accurately quantify molecular-level changes in lipid populations associated with changes in metabolism and environment; bringing lipid science to the "omics" age. The recent explosion of mass spectrometry-based surface analysis techniques is fuelling further expansion of the lipidomics field. This is evidenced by the numerous papers published on the subject of mass spectrometric imaging of lipids in recent years. While imaging mass spectrometry provides new and exciting possibilities, it is but one of the many opportunities direct surface analysis offers the lipid researcher. In this review we describe the current state-of-the-art in the direct surface analysis of lipids with a focus on tissue sections, intact cells and thin-layer chromatography substrates. The suitability of these different approaches towards analysis of the major lipid classes along with their current and potential applications in the field of lipid analysis are evaluated.