920 results for diffusive viscoelastic model, global weak solution, error estimate
Abstract:
Land-use regression (LUR) is a technique that can improve the accuracy of air pollution exposure assessment in epidemiological studies. Most LUR models are developed for single cities, which places limitations on their applicability to other locations. We sought to develop a model to predict nitrogen dioxide (NO2) concentrations with national coverage of Australia by using satellite observations of tropospheric NO2 columns combined with other predictor variables. We used a generalised estimating equation (GEE) model to predict annual and monthly average ambient NO2 concentrations measured by a national monitoring network from 2006 through 2011. The best annual model explained 81% of spatial variation in NO2 (absolute RMS error=1.4 ppb), while the best monthly model explained 76% (absolute RMS error=1.9 ppb). We applied our models to predict NO2 concentrations at the ~350,000 census mesh blocks across the country (a mesh block is the smallest spatial unit in the Australian census). National population-weighted average concentrations ranged from 7.3 ppb (2006) to 6.3 ppb (2011). We found that a simple approach using tropospheric NO2 column data yielded models with slightly better predictive ability than those produced using a more involved approach that required simulation of surface-to-column ratios. The models were capable of capturing within-urban variability in NO2 and can be used to estimate ambient NO2 concentrations at monthly and annual time scales across Australia from 2006 to 2011. We are making our model predictions freely available for research.
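As an illustration of the kind of model described above, the sketch below fits a generalised estimating equation with statsmodels, regressing monitored NO2 on a satellite column term plus land-use predictors and then predicting at mesh-block locations. The file names, column names and predictor set are hypothetical placeholders, not the study's actual variables.

```python
# Minimal sketch (not the authors' code): fit a GEE to predict annual NO2 at
# monitor sites from satellite tropospheric NO2 columns plus land-use
# predictors. File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("no2_monitors.csv")  # one row per monitor-year

# Sites are clustered within regions, so a GEE with an exchangeable working
# correlation accounts for within-cluster dependence of the residuals.
model = smf.gee(
    "no2_ppb ~ sat_no2_column + road_density + impervious_surface + year",
    groups="region_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
)
result = model.fit()
print(result.summary())

# Predict at census mesh blocks (hypothetical file that must contain the same
# predictor columns) to map national exposure.
mesh = pd.read_csv("mesh_blocks.csv")
mesh["no2_pred_ppb"] = result.predict(mesh)
```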
Abstract:
Bayesian networks (BNs) are graphical probabilistic models used for reasoning under uncertainty. These models are becoming increasingly popular in a range of fields including ecology, computational biology, medical diagnosis, and forensics. In most of these cases, the BNs are quantified using information from experts or from user opinions. An interest therefore lies in the way in which multiple opinions can be represented and used in a BN. This paper proposes the use of a measurement error model to combine opinions for use in the quantification of a BN. The multiple opinions are treated as a realisation of measurement error, and the model uses the posterior probabilities ascribed to each node in the BN, which are computed from the prior information given by each expert. The proposed model addresses the issues associated with current methods of combining opinions, such as the absence of a coherent probability model, the failure to maintain the conditional independence structure of the BN, and the provision of only a point estimate for the consensus. The proposed model was applied to an existing Bayesian network and performed well when compared with existing methods of combining opinions.
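The sketch below illustrates one simple way such a measurement error formulation could look: each expert's elicited probability is treated as a noisy logit-scale measurement of the true node probability and combined with a vague prior by a conjugate normal update, which yields an interval rather than a point estimate. This is an assumed simplification for illustration, not the paper's exact model.

```python
# Assumed simplification (not the paper's exact model): combine several expert
# opinions about one BN node probability via a logit-scale measurement error
# model with a conjugate normal-normal update.
import numpy as np

def logit(p):
    return np.log(p / (1.0 - p))

def inv_logit(x):
    return 1.0 / (1.0 + np.exp(-x))

def combine_opinions(expert_probs, meas_sd=0.5, prior_mean=0.0, prior_sd=2.0):
    """Posterior for a node probability given several expert opinions.

    expert_probs : elicited probabilities, one per expert
    meas_sd      : assumed measurement-error SD on the logit scale
    prior_*      : vague Gaussian prior on the logit of the probability
    """
    y = logit(np.asarray(expert_probs, dtype=float))
    prior_prec = 1.0 / prior_sd**2
    meas_prec = 1.0 / meas_sd**2
    post_prec = prior_prec + len(y) * meas_prec
    post_mean = (prior_prec * prior_mean + meas_prec * y.sum()) / post_prec
    post_sd = np.sqrt(1.0 / post_prec)
    # Return a point estimate plus a 95% interval, not a point estimate alone.
    return (inv_logit(post_mean),
            inv_logit(post_mean - 1.96 * post_sd),
            inv_logit(post_mean + 1.96 * post_sd))

# Three experts give P(node = true) for the same conditional probability entry.
print(combine_opinions([0.70, 0.80, 0.65]))
```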
Abstract:
Background Although the detrimental impact of major depressive disorder (MDD) at the individual level has been described, its global epidemiology remains unclear given limitations in the data. Here we present the modelled epidemiological profile of MDD, dealing with heterogeneity in the data, enforcing internal consistency between epidemiological parameters and making estimates for world regions with no empirical data. These estimates were used to quantify the burden of MDD for the Global Burden of Disease Study 2010 (GBD 2010). Method Analyses drew on data from our existing literature review of the epidemiology of MDD. DisMod-MR, the latest version of the generic disease modelling system redesigned as a Bayesian meta-regression tool, derived prevalence by age, year and sex for 21 regions. Prior epidemiological knowledge and study- and country-level covariates were used to adjust sub-optimal raw data. Results There were over 298 million cases of MDD globally at any point in time in 2010, with the highest proportion of cases occurring between 25 and 34 years. Global point prevalence was very similar across time (4.4% (95% uncertainty: 4.2–4.7%) in 1990, 4.4% (4.1–4.7%) in 2005 and 2010), but higher in females (5.5%, 5.0–6.0%) than in males (3.2%, 3.0–3.6%) in 2010. Regions in conflict had higher prevalence than those with no conflict. The annual incidence of an episode of MDD followed a similar age and regional pattern to prevalence but was about one and a half times higher, consistent with an average episode duration of 37.7 weeks. Conclusion We were able to integrate available data, including those from high-quality surveys and sub-optimal studies, into a model adjusting for known methodological sources of heterogeneity. We were also able to estimate the epidemiology of MDD in regions with no available data. This informed GBD 2010 and the public health field with a clearer understanding of the global distribution of MDD.
Abstract:
A comprehensive revision of the Global Burden of Disease (GBD) study is expected to be completed in 2012. This study utilizes a broad range of improved methods for assessing burden, including closer attention to empirically derived estimates of disability. The aim of this paper is to describe how GBD health states were derived for schizophrenia and bipolar disorder. These will be used in deriving health state-specific disability estimates. A literature review was first conducted to settle on a parsimonious set of health states for schizophrenia and bipolar disorder. A second review was conducted to investigate the proportion of schizophrenia and bipolar disorder cases experiencing these health states. These were pooled using a quality-effects model to estimate the overall proportion of cases in each state. The two schizophrenia health states were acute (predominantly positive symptoms) and residual (predominantly negative symptoms). The three bipolar disorder health states were depressive, manic, and residual. Based on estimates from six studies, 63% (38%-82%) of schizophrenia cases were in an acute state and 37% (18%-62%) were in a residual state. Another six studies were identified from which 23% (10%-39%) of bipolar disorder cases were in a manic state, 27% (11%-47%) were in a depressive state, and 50% (30%-70%) were in a residual state. This literature review revealed salient gaps in the literature that need to be addressed in future research. The pooled estimates are indicative only and more data are required to generate more definitive estimates. That said, rather than deriving burden estimates that fail to capture the changes in disability within schizophrenia and bipolar disorder, the derived proportions and their wide uncertainty intervals will be used in deriving disability estimates.
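The pooling step described above can be illustrated with a simplified quality-weighted inverse-variance combination of logit-transformed proportions, shown below with hypothetical study counts and quality scores; this is an approximation for illustration, not the exact quality-effects estimator used in the paper.

```python
# Simplified sketch: pool study proportions using quality-adjusted
# inverse-variance weights on the logit scale. Illustrative approximation only,
# not the exact quality-effects model; data are hypothetical.
import numpy as np

def pool_proportions(events, totals, quality):
    """events/totals: per-study counts; quality: scores in (0, 1]."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    p = events / totals
    y = np.log(p / (1.0 - p))                    # logit-transformed proportions
    v = 1.0 / events + 1.0 / (totals - events)   # approximate variance of logit(p)
    w = np.asarray(quality) / v                  # quality-scaled inverse-variance weights
    y_bar = np.sum(w * y) / np.sum(w)
    return 1.0 / (1.0 + np.exp(-y_bar))          # pooled proportion

# Hypothetical data for six studies reporting the acute-state proportion.
events  = [30, 55, 12, 40, 25, 60]
totals  = [50, 80, 20, 70, 40, 90]
quality = [0.9, 0.7, 0.5, 0.8, 0.6, 1.0]
print(round(pool_proportions(events, totals, quality), 2))
```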
Abstract:
Background We used data from the Global Burden of Diseases, Injuries, and Risk Factors Study 2010 (GBD 2010) to estimate the burden of disease attributable to mental and substance use disorders in terms of disability-adjusted life years (DALYs), years of life lost to premature mortality (YLLs), and years lived with disability (YLDs). Methods For each of the 20 mental and substance use disorders included in GBD 2010, we systematically reviewed epidemiological data and used a Bayesian meta-regression tool, DisMod-MR, to model prevalence by age, sex, country, region, and year. We obtained disability weights from representative community surveys and an internet-based survey to calculate YLDs. We calculated premature mortality as YLLs from cause of death estimates for 1980–2010 for 20 age groups, both sexes, and 187 countries. We derived DALYs from the sum of YLDs and YLLs. We adjusted burden estimates for comorbidity and present them with 95% uncertainty intervals. Findings In 2010, mental and substance use disorders accounted for 183·9 million DALYs (95% UI 153·5 million–216·7 million), or 7·4% (6·2–8·6) of all DALYs worldwide. Such disorders accounted for 8·6 million YLLs (6·5 million–12·1 million; 0·5% [0·4–0·7] of all YLLs) and 175·3 million YLDs (144·5 million–207·8 million; 22·9% [18·6–27·2] of all YLDs). Mental and substance use disorders were the leading cause of YLDs worldwide. Depressive disorders accounted for 40·5% (31·7–49·2) of DALYs caused by mental and substance use disorders, with anxiety disorders accounting for 14·6% (11·2–18·4), illicit drug use disorders for 10·9% (8·9–13·2), alcohol use disorders for 9·6% (7·7–11·8), schizophrenia for 7·4% (5·0–9·8), bipolar disorder for 7·0% (4·4–10·3), pervasive developmental disorders for 4·2% (3·2–5·3), childhood behavioural disorders for 3·4% (2·2–4·7), and eating disorders for 1·2% (0·9–1·5). DALYs varied by age and sex, with the highest proportion of total DALYs occurring in people aged 10–29 years. The burden of mental and substance use disorders increased by 37·6% between 1990 and 2010, which for most disorders was driven by population growth and ageing. Interpretation Despite the apparently small contribution of YLLs—with deaths in people with mental disorders coded to the physical cause of death and suicide coded to the category of injuries under self-harm—our findings show the striking and growing challenge that these disorders pose for health systems in developed and developing regions. In view of the magnitude of their contribution, improvement in population health is only possible if countries make the prevention and treatment of mental and substance use disorders a public health priority.
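The DALY bookkeeping underlying these figures is simply the sum of the mortality and disability components; the snippet below checks the headline 2010 numbers quoted in the abstract.

```python
# Worked check of the DALY accounting described above (DALY = YLL + YLD),
# using the headline 2010 figures quoted in the abstract (millions of years).
ylls = 8.6     # years of life lost to premature mortality
ylds = 175.3   # years lived with disability
dalys = ylls + ylds
print(dalys)   # 183.9 million DALYs for mental and substance use disorders

# The abstract attributes 40.5% of these DALYs to depressive disorders,
# which corresponds to roughly 0.405 * 183.9 ≈ 74.5 million DALYs.
print(round(0.405 * dalys, 1))
```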
Abstract:
This study evaluated the complexity of calcium ion exchange with sodium-exchanged weak acid cation resin (DOW MAC-3). Exchange equilibria recorded for a range of solution normalities revealed profiles represented by conventional “L”- or “H”-type isotherms at low values of the equilibrium concentration (Ce) of calcium ions, plus a superimposed region of increasing calcium uptake at high Ce values. The loading of calcium ions was determined to be ca. 53.5 to 58.7 g/kg of resin when modelling only the sorption curve obtained at low Ce values, which exhibited a well-defined plateau. The calculated calcium ion loading capacity for DOW MAC-3 resin appeared to correlate with the manufacturer's recommendation. The phenomenon of super equivalent ion exchange (SEIX) was observed when the “driving force” for the exchange process was increased beyond 2.25 mmol of calcium ions per gram of resin in the starting solution. This latter event was explained in terms of displacement of sodium ions from sodium hydroxide solution that remained in the resin bead following the initial conversion of the as-supplied “H+”-exchanged resin sites to the “Na+” form required for softening studies. Evidence for hydrolysis of a small fraction of the sites on the sodium-exchanged resin surface was noted. The importance of carefully choosing experimental parameters was discussed, especially in relation to application of the Langmuir–Vageler expression. This latter model, which compares the ratio of the initial calcium ion concentration in solution to resin mass against the final equilibrium loading of calcium ions on the resin, was found to be an excellent means of following the progress of the calcium–sodium ion exchange process. Moreover, the Langmuir–Vageler model facilitated standardization of the various calcium–sodium ion exchange experiments, which allowed systematic experimental design.
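To make the capacity estimate concrete, the sketch below fits a Langmuir isotherm to the low-Ce portion of a calcium uptake curve with SciPy; the data points are hypothetical and the fit is illustrative rather than a reproduction of the authors' analysis.

```python
# Illustrative sketch (hypothetical data, not the authors' analysis): fit a
# Langmuir isotherm, q = q_max * K * Ce / (1 + K * Ce), to the low-Ce region
# of a calcium uptake curve to estimate the resin loading capacity q_max.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k):
    return q_max * k * ce / (1.0 + k * ce)

# Hypothetical equilibrium data: Ce in mmol/L, q in g Ca per kg resin.
ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
q = np.array([18.0, 30.0, 42.0, 50.0, 54.0, 56.0])

(q_max, k), _ = curve_fit(langmuir, ce, q, p0=[55.0, 1.0])
print(f"q_max ≈ {q_max:.1f} g/kg, K ≈ {k:.2f} L/mmol")
```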
Abstract:
This special issue of Cultural Science Journal is devoted to the report of a groundbreaking experiment in re-coordinating global markets for specialist scholarly books and enabling the knowledge commons: the Knowledge Unlatched proof-of-concept pilot. The pilot took place between January 2012 and September 2014. It involved libraries, publishers, authors, readers and research funders in the process of developing and testing a global library consortium model for supporting Open Access books. The experiment established that authors, librarians, publishers and research funding agencies can work together in powerful new ways to enable open access; that doing so is cost effective; and that a global library consortium model has the potential dramatically to widen access to the knowledge and ideas contained in book-length scholarly works.
Abstract:
In an estuary, mixing and dispersion result from a combination of large-scale advection and small-scale turbulence, which are complex to estimate. Predictions of scalar transport and mixing are often inferred and rarely accurate, owing to an inadequate understanding of the contributions of these different scales to estuarine recirculation. A multi-device field study was conducted in a small sub-tropical estuary under neap tide conditions with near-zero freshwater discharge for about 48 hours. During the study, acoustic Doppler velocimeters (ADVs) were sampled at high frequency (50 Hz), while an acoustic Doppler current profiler (ADCP) and global positioning system (GPS)-tracked drifters were used to obtain lower-frequency spatial distributions of the flow parameters within the estuary. The velocity measurements were complemented with continuous measurements of water depth, conductivity, temperature and other physicochemical parameters. Thorough quality control was carried out by applying relevant error-removal filters to each data set to intercept spurious data. A triple decomposition (TD) technique was introduced to assess the contributions of tides, resonance and ‘true’ turbulence to the flow field. The time series of mean flow measurements for both the ADCP and the drifters were consistent with the mean ADV data when sampled within a similar spatial domain. The tidal-scale fluctuations of velocity and water level were used to examine the response of the estuary to tidal inertial currents. The channel exhibited a mixed-type wave with a typical phase lag between 0.035π and 0.116π. A striking feature of the ADV velocity data was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude, particularly in slack waters. Such slow fluctuations were simultaneously observed in a number of physicochemical properties of the channel. The ensuing turbulence field showed some degree of anisotropy. For all ADV units, the horizontal turbulence ratio ranged between 0.4 and 0.9 and decreased towards the bed, while the vertical turbulence ratio was on average unity at z = 0.32 m and approximately 0.5 for the upper ADV (z = 0.55 m). The statistical analysis suggested that the ebb-phase turbulence field was dominated by eddies that evolved from ejection-type processes, while that of the flood phase contained mixed eddies with a significant proportion related to sweep-type processes. Over 65% of the skewness values fell within the range expected of a finite Gaussian distribution, and the bulk of the excess kurtosis values (over 70%) fell between -0.5 and +2. The TD technique described herein allowed the characterisation of a broader temporal scale of fluctuations from the high-frequency data sampled over a few tidal cycles. The study characterises the ranges of fluctuation required for accurate modelling of shallow-water dispersion and mixing in a sub-tropical estuary.
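One common way to implement such a triple decomposition is to separate the record with two running-mean cut-offs, as in the sketch below; the window lengths and the synthetic record are illustrative assumptions, not the values used in the study.

```python
# Illustrative sketch of a triple decomposition of a velocity time series:
# u(t) = tidal component + slow (resonance-scale) fluctuation + turbulence.
# Implemented here with two running-mean cut-offs; the window lengths are
# assumptions for illustration, not those used in the study.
import numpy as np

def running_mean(x, window):
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def triple_decompose(u, fs_hz, tidal_cutoff_s=3600.0, slow_cutoff_s=300.0):
    """Split u into (tidal, slow, turbulent) components using running means."""
    tidal = running_mean(u, int(tidal_cutoff_s * fs_hz))        # very-low-frequency (tidal) trend
    slow = running_mean(u - tidal, int(slow_cutoff_s * fs_hz))  # resonance-scale slow fluctuations
    turbulent = u - tidal - slow                                # remaining high-frequency part
    return tidal, slow, turbulent

# Synthetic 48-hour record, down-sampled here to 1 Hz for brevity
# (the field ADVs sampled at 50 Hz).
fs = 1.0
t = np.arange(0, 48 * 3600, 1 / fs)
u = (0.3 * np.sin(2 * np.pi * t / (12.4 * 3600))   # tidal signal
     + 0.05 * np.sin(2 * np.pi * t / 600)          # slow fluctuation
     + 0.02 * np.random.randn(t.size))             # "turbulence"
tidal, slow, turb = triple_decompose(u, fs)
print(turb.std())  # rough turbulence intensity of the synthetic record
```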
Abstract:
Background: Studies have examined the effects of temperature on mortality in a single city, country, or region. However, less evidence is available on the variation in the associations between temperature and mortality in multiple countries, analyzed simultaneously. Methods: We obtained daily data on temperature and mortality in 306 communities from 12 countries/regions (Australia, Brazil, Thailand, China, Taiwan, Korea, Japan, Italy, Spain, United Kingdom, United States, and Canada). Two-stage analyses were used to assess the nonlinear and delayed relation between temperature and mortality. In the first stage, a Poisson regression allowing overdispersion with distributed lag nonlinear model was used to estimate the community-specific temperature-mortality relation. In the second stage, a multivariate meta-analysis was used to pool the nonlinear and delayed effects of ambient temperature at the national level, in each country. Results: The temperatures associated with the lowest mortality were around the 75th percentile of temperature in all the countries/regions, ranging from 66th (Taiwan) to 80th (UK) percentiles. The estimated effects of cold and hot temperatures on mortality varied by community and country. Meta-analysis results show that both cold and hot temperatures increased the risk of mortality in all the countries/regions. Cold effects were delayed and lasted for many days, whereas heat effects appeared quickly and did not last long. Conclusions: People have some ability to adapt to their local climate type, but both cold and hot temperatures are still associated with increased risk of mortality. Public health strategies to alleviate the impact of ambient temperatures are important, in particular in the context of climate change.
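The first-stage community model has the general structure of an overdispersed Poisson time-series regression of daily deaths on current and lagged temperature; the sketch below shows that structure with simple lag terms only, as a stand-in for the full distributed lag non-linear model used in the study, and with hypothetical file and column names.

```python
# Simplified sketch of a first-stage community model: Poisson regression of
# daily deaths on current and lagged temperature plus season/trend controls.
# This shows the structure only; the study used a distributed lag non-linear
# model (DLNM) with smooth temperature and lag dimensions. File and column
# names are hypothetical.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("community_daily.csv", parse_dates=["date"])
for lag in range(1, 8):                       # one week of simple lags
    df[f"tmean_lag{lag}"] = df["tmean"].shift(lag)
df["dow"] = df["date"].dt.dayofweek
df["doy"] = df["date"].dt.dayofyear
df["time"] = range(len(df))
df = df.dropna()

lag_terms = " + ".join(f"tmean_lag{l}" for l in range(1, 8))
formula = f"deaths ~ tmean + {lag_terms} + C(dow) + doy + time"
model = smf.glm(formula, data=df, family=sm.families.Poisson())
result = model.fit(scale="X2")                # Pearson-scaled fit to allow overdispersion
print(result.summary())
```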
Abstract:
An important uncertainty when estimating per capita consumption of, for example, illicit drugs by means of wastewater analysis (sometimes referred to as “sewage epidemiology”) relates to the size and variability of the de facto population in the catchment of interest. In the absence of a day-specific direct population count, any indirect surrogate model used to estimate population size lacks a standard against which to assess the associated uncertainties. Therefore, the objective of this study was to collect wastewater samples at a unique opportunity, that is, on a census day, as a basis for a model to estimate the number of people contributing to a given wastewater sample. Mass loads for a wide range of pharmaceuticals and personal care products were quantified in the influents of ten sewage treatment plants (STPs) serving populations ranging from approximately 3500 to 500 000 people. Separate linear models for population size were estimated with the mass loads of the different chemicals as the explanatory variable: 14 chemicals showed good linear relationships, with the highest correlations for acesulfame and gabapentin. De facto population was then estimated through Bayesian inference, by updating the population size provided by STP staff (prior knowledge) with the measured chemical mass loads. Cross-validation showed that large populations can be estimated fairly accurately from a few chemical mass loads quantified in 24-h composite samples. In contrast, the prior knowledge for small population sizes cannot be improved substantially despite the information from multiple chemical mass loads. In the future, observations other than chemical mass loads may address this limitation, since Bayesian inference allows the inclusion of any kind of information relating to population size.
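The Bayesian updating step can be illustrated with a simple conjugate-normal formulation in which each chemical's measured daily mass load is treated as a per-capita excretion rate times population plus Gaussian noise; the per-capita rates, uncertainties and STP figures below are hypothetical, and the paper's actual model may differ.

```python
# Assumed conjugate-normal sketch (not the paper's exact model): update a prior
# population estimate from STP staff with measured chemical mass loads, where
# each load ≈ per-capita excretion rate × population.
import numpy as np

def posterior_population(prior_mean, prior_sd, loads_mg_day,
                         per_capita_mg_day, load_sd_mg_day):
    """Gaussian posterior for population size given several chemical mass loads."""
    prior_prec = 1.0 / prior_sd**2
    post_prec = prior_prec
    post_mean_num = prior_mean * prior_prec
    for load, rate, sd in zip(loads_mg_day, per_capita_mg_day, load_sd_mg_day):
        # Each load gives a noisy population "measurement": load / rate.
        prec = (rate / sd) ** 2
        post_prec += prec
        post_mean_num += (load / rate) * prec
    post_mean = post_mean_num / post_prec
    return post_mean, np.sqrt(1.0 / post_prec)

# Hypothetical STP: staff estimate 45 000 ± 10 000 people; two chemicals measured.
mean, sd = posterior_population(
    prior_mean=45_000, prior_sd=10_000,
    loads_mg_day=[1.6e6, 9.0e5],          # measured 24-h mass loads (mg/day)
    per_capita_mg_day=[30.0, 20.0],       # assumed per-capita excretion (mg/person/day)
    load_sd_mg_day=[3.0e5, 2.5e5],        # assumed measurement uncertainty
)
print(f"posterior population ≈ {mean:,.0f} ± {sd:,.0f}")
```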
Abstract:
Background Unlike other indicators of cardiac function, such as ejection fraction and transmitral early diastolic velocity, myocardial strain is a promising measure for capturing the subtle alterations that result from early diseases of the myocardium. In order to extract the left ventricular (LV) myocardial strain and strain rate from cardiac cine-MRI, a modified hierarchical transformation model was proposed. Methods A hierarchical transformation model comprising global and local LV deformations was employed to analyse the strain and strain rate of the left ventricle through cine-MRI image registration. Endocardial and epicardial contour information was introduced to enhance registration accuracy by combining the original hierarchical algorithm with an Iterative Closest Points using Invariant Features algorithm. The hierarchical model was first validated on a normal volunteer and then applied to two clinical cases (the normal volunteer and a diabetic patient) to evaluate their respective function. Results In the two clinical cases, comparison of the displacement fields of two selected landmarks in the normal volunteer showed that the proposed method performed better than the original, unmodified model. Meanwhile, comparison of the radial strain between the volunteer and the patient demonstrated an apparent functional difference between them. Conclusions The present method can be used to estimate LV myocardial strain and strain rate over a cardiac cycle and thus to quantitatively analyse LV motion.
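Once registration yields a displacement field, strain follows from the displacement gradient; the sketch below computes the Green-Lagrange strain tensor on a 2-D grid with NumPy as a generic illustration, not the paper's implementation.

```python
# Generic sketch: given a 2-D displacement field (ux, uy) from image
# registration, compute the Green-Lagrange strain tensor at each pixel,
# E = 0.5 * (F^T F - I) with F = I + grad(u). Radial/circumferential strain
# would then be obtained by projecting E onto directions defined relative to
# the LV centre. Illustrative only, not the paper's implementation.
import numpy as np

def green_lagrange_strain(ux, uy, spacing=1.0):
    """ux, uy: displacement components on a regular grid; returns Exx, Eyy, Exy."""
    dux_dy, dux_dx = np.gradient(ux, spacing)
    duy_dy, duy_dx = np.gradient(uy, spacing)
    # Deformation gradient F = I + du/dX, assembled component-wise.
    f11, f12 = 1.0 + dux_dx, dux_dy
    f21, f22 = duy_dx, 1.0 + duy_dy
    exx = 0.5 * (f11 * f11 + f21 * f21 - 1.0)
    eyy = 0.5 * (f12 * f12 + f22 * f22 - 1.0)
    exy = 0.5 * (f11 * f12 + f21 * f22)
    return exx, eyy, exy

# Toy example: a uniform 5% stretch in x gives Exx ≈ 0.05 + 0.5 * 0.05**2.
y, x = np.mgrid[0:32, 0:32].astype(float)
exx, eyy, exy = green_lagrange_strain(0.05 * x, np.zeros_like(x))
print(exx.mean(), eyy.mean())
```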
Abstract:
The application of multilevel control strategies for load-frequency control (LFC) of interconnected power systems is gaining importance. A large multi-area power system may be viewed as an interconnection of several lower-order subsystems, with a possible change of interconnection pattern during operation. The solution of the control problem involves the design of a set of local optimal controllers for the individual areas, in a completely decentralised environment, plus a global controller that provides a corrective signal to account for interconnection effects. A global controller based on the least-square-error principle suggested by Siljak and Sundareshan has been applied to the LFC problem. A more recent approach exploits certain potentially beneficial aspects of interconnection to permit more desirable system performance. The paper reports the application of the latter strategy to LFC of a two-area power system. The power-system model studied includes the effects of excitation-system and governor controls. A comparison of the two strategies is also made.
Abstract:
Recovering the motion of a non-rigid body from a set of monocular images permits the analysis of dynamic scenes in uncontrolled environments. However, the extension of factorisation algorithms for rigid structure from motion to the low-rank non-rigid case has proved challenging. This stems from the comparatively hard problem of finding a linear “corrective transform” which recovers the projection and structure matrices from an ambiguous factorisation. We elucidate that this greater difficulty is due to the need to find multiple solutions to a non-trivial problem, casting a number of previous approaches as alleviating this issue by either a) introducing constraints on the basis, making the problems nonidentical, or b) incorporating heuristics to encourage a diverse set of solutions, making the problems inter-dependent. While it has previously been recognised that finding a single solution to this problem is sufficient to estimate cameras, we show that it is possible to bootstrap this partial solution to find the complete transform in closed-form. However, we acknowledge that our method minimises an algebraic error and is thus inherently sensitive to deviation from the low-rank model. We compare our closed-form solution for non-rigid structure with known cameras to the closed-form solution of Dai et al. [1], which we find to produce only coplanar reconstructions. We therefore make the recommendation that 3D reconstruction error always be measured relative to a trivial reconstruction such as a planar one.
Abstract:
The paper deals with the basic problem of adjusting a matrix gain in a discrete-time linear multivariable system. The objective is to obtain a global convergence criterion, i.e. conditions under which a specified error signal asymptotically approaches zero and other signals in the system remain bounded for arbitrary initial conditions and for any bounded input to the system. It is shown that, for a class of updating algorithms for the adjustable gain matrix, global convergence depends crucially on a transfer matrix G(z) that has a simple block-diagram interpretation. When w(z)G(z) is strictly discrete positive real for a scalar w(z) such that w⁻¹(z) is strictly proper with poles and zeros within the unit circle, an augmented error scheme is suggested and is proved to result in global convergence. The solution avoids the feedback of a quadratic term that is recommended in other schemes for single-input single-output systems.
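For readers unfamiliar with the terminology, one commonly used definition of discrete strict positive realness, the condition imposed on w(z)G(z) above, is stated below as background (not quoted from the paper).

```latex
% A commonly used statement of discrete strict positive realness, the
% condition required of w(z)G(z) in the convergence result summarised above.
A square transfer matrix $H(z)$ is strictly positive real if
\[
  H(z) \text{ is analytic in } |z| \ge 1,
  \qquad
  H(e^{j\omega}) + H^{*}(e^{j\omega}) > 0 \quad \text{for all } \omega \in [0, 2\pi],
\]
where $H^{*}$ denotes the conjugate transpose and $>0$ means positive definite.
```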
Abstract:
The need for reexamination of the standard model of strong, weak, and electromagnetic interactions is discussed, especially with regard to 't Hooft's criterion of naturalness. It has been argued that theories with fundamental scalar fields tend to be unnatural at relatively low energies. There are two solutions to this problem: (i) a global supersymmetry, which ensures the absence of all the naturalness-violating effects associated with scalar fields, and (ii) composite structure of the scalar fields, which starts showing up at the energy scales where unnatural effects would otherwise have appeared. With reference to the second solution, this article reviews the case for dynamical breaking of the gauge symmetry and the technicolor scheme for the composite Higgs boson. This new interaction, a scaled-up version of quantum chromodynamics, binds the new set of fermions, the technifermions, into the Higgs particles. It also provides masses for the electroweak gauge bosons W± and Z0 through technifermion condensate formation. In order to give masses to the ordinary fermions, a new interaction, the extended technicolor interaction, which connects the ordinary fermions to the technifermions, is required. The extended technicolor group breaks down spontaneously to the technicolor group, possibly as a result of the "tumbling" mechanism, which is discussed here. In addition, the author presents schemes for the isospin breaking of the mass matrices of ordinary quarks in technicolor models. In generalized technicolor models with more than one doublet of technifermions or with more than one technicolor sector, there are additional low-lying degrees of freedom, the pseudo-Goldstone bosons. The pseudo-Goldstone bosons in the technicolor model of Dimopoulos are reviewed and their masses computed. In this context the vacuum alignment problem is also discussed. An effective Lagrangian describing the colorless low-lying degrees of freedom of models with two technicolor sectors is derived in the combined limits of chiral symmetry and a large number of colors and technicolors. Finally, the author discusses the suppression of flavor-changing neutral currents in extended technicolor models.