88 results for Conditional mean
Abstract:
In this study, the stiffness of an array of mineralized collagen fibrils modeled with a mean-field method was validated experimentally at two site-matched levels of tissue hierarchy using mineralized turkey leg tendons (MTLT). The applied modeling approaches made it possible to describe the properties of this unidirectional tissue from the nanoscale (mineralized collagen fibrils) to the macroscale (mineralized tendon). At the microlevel, the indentation moduli obtained with a mean-field homogenization scheme were compared to the experimental moduli obtained with microindentation. At the macrolevel, the macroscopic stiffness predicted with micro finite element (μFE) models was compared to the experimental stiffness measured with uniaxial tensile tests. The elastic properties of the elements in the μFE models were assigned from the mean-field model or from two-directional microindentations. Quantitatively, the indentation moduli can be properly predicted with the mean-field models. Local stiffness trends within specific tissue morphologies are very weak, suggesting that additional factors are responsible for the stiffness variations. At the macrolevel, the μFE models underestimate the macroscopic stiffness compared to tensile tests, but the correlations are strong.
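The abstract does not spell out the homogenization scheme; as a rough illustration of the kind of mean-field estimate involved, the following sketch computes Mori-Tanaka effective moduli for an isotropic two-phase composite with spherical inclusions. This is a generic textbook scheme, not the study's anisotropic model, and the collagen/mineral moduli are order-of-magnitude placeholders.

```python
# Mori-Tanaka mean-field sketch for an isotropic two-phase composite with
# spherical inclusions. Phase moduli are hypothetical placeholder values,
# not the study's parameters.

def kg_from_e_nu(E, nu):
    """Bulk and shear moduli from Young's modulus and Poisson's ratio."""
    return E / (3.0 * (1.0 - 2.0 * nu)), E / (2.0 * (1.0 + nu))

def mori_tanaka(K_m, G_m, K_i, G_i, f):
    """Effective bulk/shear moduli for inclusion volume fraction f."""
    K_star = 4.0 * G_m / 3.0                      # Eshelby term for spheres
    G_star = G_m * (9.0 * K_m + 8.0 * G_m) / (6.0 * (K_m + 2.0 * G_m))
    K_eff = K_m + f * (K_i - K_m) * (K_m + K_star) / (
        K_m + K_star + (1.0 - f) * (K_i - K_m))
    G_eff = G_m + f * (G_i - G_m) * (G_m + G_star) / (
        G_m + G_star + (1.0 - f) * (G_i - G_m))
    return K_eff, G_eff

K_col, G_col = kg_from_e_nu(1.5, 0.30)    # hypothetical wet collagen matrix, GPa
K_min, G_min = kg_from_e_nu(100.0, 0.28)  # hypothetical mineral inclusions, GPa
K, G = mori_tanaka(K_col, G_col, K_min, G_min, f=0.4)
E_eff = 9.0 * K * G / (3.0 * K + G)       # effective Young's modulus
print(f"effective E ~ {E_eff:.1f} GPa")
```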
Abstract:
INTRODUCTION: The objective of this study was to evaluate the effects of two different mean arterial blood pressure (MAP) targets on needs for resuscitation, organ dysfunction, mitochondrial respiration and inflammatory response in a long-term model of fecal peritonitis. METHODS: Twenty-four anesthetized and mechanically ventilated pigs were randomly assigned (n = 8/group) to a septic control group (septic-CG) without resuscitation until death, or to one of two groups with resuscitation performed after 12 hours of untreated sepsis for 48 hours, targeting MAP 50-60 mmHg (low-MAP) or 75-85 mmHg (high-MAP). RESULTS: MAP at the end of resuscitation was 56 ± 13 mmHg (mean ± SD) and 76 ± 17 mmHg, respectively, for the low-MAP and high-MAP groups. One animal each in the high- and low-MAP groups, and all animals in septic-CG, died (median survival time: 21.8 hours, inter-quartile range: 16.3-27.5 hours). Norepinephrine was administered to all animals of the high-MAP group (0.38 (0.21-0.56) mcg/kg/min) and to three animals of the low-MAP group (0.00 (0.00-0.25) mcg/kg/min; P = 0.009). The high-MAP group had a more positive fluid balance (3.3 ± 1.0 mL/kg/h vs. 2.3 ± 0.7 mL/kg/h; P = 0.001). Inflammatory markers, skeletal muscle ATP content and hemodynamics other than MAP did not differ between the low- and high-MAP groups. The incidence of acute kidney injury (AKI) after 12 hours of untreated sepsis was, respectively for the low- and high-MAP groups, 50% (4/8) and 38% (3/8), and at the end of the study 57% (4/7) and 0% (P = 0.026). In septic-CG, maximal isolated skeletal muscle mitochondrial Complex I, State 3 respiration increased from 1357 ± 149 pmol/s/mg to 1822 ± 385 pmol/s/mg (P = 0.020). In the high- and low-MAP groups, Complex IV, State 3 respiration of permeabilized skeletal muscle fibers increased during resuscitation (P = 0.003). CONCLUSIONS: The MAP targets during resuscitation did not alter the inflammatory response, nor did they affect skeletal muscle ATP content or mitochondrial respiration. While targeting a lower MAP was associated with an increased incidence of AKI, targeting a higher MAP resulted in an increased net positive fluid balance and vasopressor load during resuscitation. The long-term effects of different MAP targets need to be evaluated in further studies.
Abstract:
On the Limits of Greenwich Mean Time, or The Failure of a Modernist Revolution. From the introduction of World Standard Time in 1884 to Einstein’s theory of relativity, the nature and regulation of time was a highly contested issue in modernism, with profound political, social and epistemological consequences. Modernist aesthetic sensibilities widely revolted against the increasingly strict rule of the clock, which, as Georg Simmel observed in “The Metropolis and Mental Life,” was established as the necessary basis of capitalist, urban life. This paper will focus on the contending conceptions of time arising in key modernist texts by authors such as Joyce, Woolf and Conrad. I will argue that the uniformity and regularity of time necessary to a rising capitalist society came under attack in similar ways from both modernist literary aesthetics and new scientific discoveries. However, while Einstein’s theory of relativity may have led to a subsequent change of paradigm in scientific thought, it has failed to significantly alter social and popular conceptions of time. Although alternative ways of thinking and living with time are proposed by modernist authors, they remain isolated aesthetic experiments, ineffectual against the regulatory pressure of economic and social structures. In this struggle over the nature of time, I suggest, science and literature joined forces against a society increasingly governed by economic reason. The fact that they lost this struggle can serve as a striking illustration of an increasing shift of social influence from science and art towards the economy.
Abstract:
The International Surface Temperature Initiative (ISTI) is striving towards substantively improving our ability to robustly understand historical land surface air temperature change at all scales. A key recently completed first step has been collating all available records into a comprehensive open access, traceable and version-controlled databank. The crucial next step is to maximise the value of the collated data through a robust international framework of benchmarking and assessment for product intercomparison and uncertainty estimation. We focus on uncertainties arising from the presence of inhomogeneities in monthly mean land surface temperature data and the varied methodological choices made by various groups in building homogeneous temperature products. The central facet of the benchmarking process is the creation of global-scale synthetic analogues to the real-world database where both the "true" series and inhomogeneities are known (a luxury the real-world data do not afford us). Hence, algorithmic strengths and weaknesses can be meaningfully quantified and conditional inferences made about the real-world climate system. Here we discuss the necessary framework for developing an international homogenisation benchmarking system on the global scale for monthly mean temperatures. The value of this framework is critically dependent upon the number of groups taking part and so we strongly advocate involvement in the benchmarking exercise from as many data analyst groups as possible to make the best use of this substantial effort.
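As a toy illustration of the benchmarking idea described above, where the "true" series and the inhomogeneity are known by construction, the sketch below inserts a known step change into a synthetic monthly temperature series and checks how close a naive CUSUM changepoint estimate gets to it. The ISTI benchmarks are of course far more elaborate; all numbers here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 600                                           # 50 years of monthly means
t = np.arange(n)
seasonal = 8.0 * np.sin(2.0 * np.pi * t / 12.0)   # annual cycle, degC
trend = (0.1 / 120.0) * t                         # ~0.1 degC per decade
series = 10.0 + seasonal + trend + rng.normal(0.0, 0.5, n)

true_break = 300                                  # known, by construction
series[true_break:] -= 0.8                        # e.g. an undocumented station move

# Naive detection: remove the (here, known) seasonal cycle and locate the
# largest cumulative-sum excursion, which peaks at a single mean shift.
anom = series - seasonal
cusum = np.cumsum(anom - anom.mean())
detected = int(np.argmax(np.abs(cusum)))
print(f"true break at month {true_break}, detected near month {detected}")
```

A real benchmarking exercise would score many such detections (hit rate, timing error, adjustment bias) across a global network of analogues rather than a single series.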
Abstract:
Since 2010, the client base of online-trading service providers has grown significantly. Such companies enable small investors to access the stock market at advantageous rates. Because small investors buy and sell stocks in moderate amounts, they should consider fixed transaction costs, integral transaction units, and dividends when selecting their portfolio. In this paper, we consider the small investor’s problem of investing capital in stocks in a way that maximizes the expected portfolio return and guarantees that the portfolio risk does not exceed a prescribed risk level. Portfolio-optimization models known from the literature are in general designed for institutional investors and do not consider the specific constraints of small investors. We therefore extend four well-known portfolio-optimization models to make them applicable for small investors. We consider one nonlinear model that uses variance as a risk measure and three linear models that use the mean absolute deviation from the portfolio return, the maximum loss, and the conditional value-at-risk as risk measures. We extend all models to consider piecewise-constant transaction costs, integral transaction units, and dividends. In an out-of-sample experiment based on Swiss stock-market data and the cost structure of the online-trading service provider Swissquote, we apply both the basic models and the extended models; the former represent the perspective of an institutional investor, and the latter the perspective of a small investor. The basic models compute portfolios that yield on average a slightly higher return than the portfolios computed with the extended models. However, all generated portfolios yield on average a higher return than the Swiss performance index. There are considerable differences between the four risk measures with respect to the mean realized portfolio return and the standard deviation of the realized portfolio return.
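The paper's optimization models are not reproduced here, but the four risk measures it compares are easy to illustrate. A minimal sketch, assuming toy return scenarios, hypothetical prices, integral share units and a flat per-position fee, evaluates the mean absolute deviation, maximum loss and conditional value-at-risk for a fixed small-investor portfolio:

```python
import numpy as np

rng = np.random.default_rng(1)
# 250 daily return scenarios for 4 hypothetical stocks (toy data, not the
# Swiss stock-market data or Swissquote cost structure used in the paper)
scenarios = rng.normal(0.0005, 0.01, size=(250, 4))
prices = np.array([52.0, 110.0, 18.5, 240.0])   # hypothetical share prices
units = np.array([40, 15, 120, 8])              # integral numbers of shares held
fixed_fee = 9.0                                 # hypothetical flat fee per position

value = units * prices
cost = value.sum() + fixed_fee * np.count_nonzero(units)
weights = value / value.sum()
port_ret = scenarios @ weights                  # portfolio return per scenario

mad = np.mean(np.abs(port_ret - port_ret.mean()))   # mean absolute deviation
max_loss = -port_ret.min()                          # maximum loss
alpha = 0.95
var = -np.quantile(port_ret, 1 - alpha)             # 95% value-at-risk
cvar = -port_ret[port_ret <= -var].mean()           # 95% conditional value-at-risk
print(f"cost {cost:.2f}, MAD {mad:.4%}, max loss {max_loss:.4%}, CVaR {cvar:.4%}")
```

In the paper these measures appear as constraints or objectives in optimization models (MAD, maximum loss and CVaR admit linear formulations); the sketch only shows how each is computed from scenario returns.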
Abstract:
RATIONALE In biomedical journals, authors sometimes use the standard error of the mean (SEM) for data description, which has been called inappropriate or incorrect. OBJECTIVE To assess the frequency of incorrect use of the SEM in articles in three selected cardiovascular journals. METHODS AND RESULTS All original journal articles published in 2012 in Cardiovascular Research, Circulation: Heart Failure and Circulation Research were assessed by two assessors for inappropriate use of the SEM when providing descriptive information on empirical data. We also assessed whether the authors state in the methods section that the SEM will be used for data description. Of 441 articles included in this survey, 64% (282 articles) contained at least one instance of incorrect use of the SEM, with two journals having a prevalence above 70% and "Circulation: Heart Failure" having the lowest value (27%). In 81% of articles with incorrect use of the SEM, the authors had explicitly stated that they used the SEM for data description, and in 89% SEM error bars were used instead of 95% confidence intervals. Basic science studies had a 7.4-fold higher level of inappropriate SEM use (74%) than clinical studies (10%). LIMITATIONS The selection of the three cardiovascular journals was based on a subjective initial impression of observing inappropriate SEM use. The observed results are not representative of all cardiovascular journals. CONCLUSION In three selected cardiovascular journals we found a high level of inappropriate SEM use, along with explicit methods statements announcing its use for data description, especially in basic science studies. To improve this situation, these and other journals should provide clear instructions to authors on how to report descriptive information on empirical data.
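The statistical point is easy to demonstrate: the SD describes the spread of the data and is stable in n, while SEM = SD/√n describes the precision of the estimated mean and shrinks with sample size, which is why it understates variability when used descriptively. A minimal sketch with simulated data:

```python
import numpy as np

rng = np.random.default_rng(42)
for n in (10, 100, 1000):
    x = rng.normal(loc=5.0, scale=2.0, size=n)
    sd = x.std(ddof=1)           # standard deviation: spread of the data
    sem = sd / np.sqrt(n)        # standard error: precision of the mean
    print(f"n={n:5d}  SD={sd:.2f}  SEM={sem:.3f}")
# The SD stays near 2 however large n gets, while the SEM shrinks with
# sqrt(n); quoting the SEM as a descriptive statistic therefore makes the
# data look far less variable than they are.
```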
Abstract:
Two groundwater bodies, Grazer Feld and Leibnitzer Feld, with surface areas of 166 and 103 km² respectively, are characterised for the first time by measuring the combination of δ18O/δ2H, 3H/3He, 85Kr, CFC-11, CFC-12 and hydrochemistry in 34 monitoring wells in 2009/2010. The timescales of groundwater recharge were characterised by 131 δ18O measurements of well and surface water sampled on a seasonal basis. Most monitoring wells show a seasonal variation or indicate variable contributions of the main river Mur (0–30%, max. 70%) and/or other rivers with recharge areas at higher altitudes. Combined δ18O/δ2H measurements indicate that 65–75% of groundwater recharge in the unusually wet year of 2009 came from summer precipitation, based on values from the Graz meteorological station. Monitoring wells downstream of gravel pit lakes show a clear evaporation trend. A boron–nitrate differentiation plot shows more frequent boron-rich water in the more urbanised Grazer Feld and more frequent nitrate-rich water in the more agriculturally used Leibnitzer Feld, indicating that some of the nitrate load in the Grazer Feld comes from urban sewer water. Several lumped parameter models based on tritium input data from Graz and monthly data from the river Mur (Spielfeld) since 1977 yield a Mean Residence Time (MRT) for the Mur water itself of between 3 and 4 years in this area. Data from δ18O and 3H/3He measurements at the Wagna lysimeter station support the conclusion that 90% of the groundwaters in the Grazer Feld and 73% in the Leibnitzer Feld have MRTs of <5 years. Only a few groundwaters had MRTs of 6–10 or 11–25 years, as a result of either long-distance water inflow into the basins or longer flow paths in somewhat deeper wells (>20 m) with relatively thicker unsaturated zones. The young MRT of groundwater from two monitoring wells in the Leibnitzer Feld was confirmed by 85Kr measurements. Most CFC-11 and CFC-12 concentrations in the groundwater exceed the modern atmospheric equilibrium concentrations in water and are therefore unsuitable for dating purposes. An enrichment factor of up to 100 compared to atmospheric equilibrium concentrations, and the obvious correlation of CFC-12 with SO4, Na, Cl and B in the groundwaters of the Grazer Feld, suggest that waste water in contact with CFC-containing material above and below ground is the source of the contamination. The dominance of very young groundwater (<5 years) indicates a recent origin of the contamination by nitrate and many other components observed in parts of the groundwater bodies. Rapid measures to reduce those sources are needed to mitigate further deterioration of these waters.
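The lumped parameter approach mentioned above can be illustrated with the classical exponential model, in which the output is the tritium input convolved with an exponential transit-time distribution and radioactive decay. The sketch below uses a synthetic input series, not the Graz or Spielfeld records:

```python
import numpy as np

HALF_LIFE = 12.32                       # tritium half-life, years
lam = np.log(2.0) / HALF_LIFE
DT = 1.0 / 12.0                         # monthly time step, years

def exponential_model(c_in, mrt):
    """Output concentration for an exponential transit-time distribution
    with mean residence time `mrt` (years), including tritium decay."""
    tau = np.arange(c_in.size) * DT
    g = (np.exp(-tau / mrt) / mrt) * np.exp(-lam * tau) * DT
    return np.convolve(c_in, g)[: c_in.size]

# Synthetic tritium input in TU (a smooth peak; NOT the measured record)
years = np.arange(0.0, 50.0, DT)
c_in = 10.0 + 90.0 * np.exp(-0.5 * ((years - 5.0) / 2.0) ** 2)

for mrt in (3.0, 10.0, 25.0):
    c_out = exponential_model(c_in, mrt)
    print(f"MRT = {mrt:4.1f} yr -> output after 30 yr: {c_out[360]:.1f} TU")
```

In practice the MRT is the parameter fitted so that the modeled output matches the observed tritium (or 3H/3He) series in each well.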
Abstract:
Introduction: Organisational changes in sports federations are often associated with a drift from a volunteer-driven to an increasingly business-like phenomenon (Shilbury & Ferkins, 2011). This process of transformation is called “professionalization”. Accordingly, professionalization seems to be an appropriate strategy for sport organisations to adequately meet the organisational pressure arising from the challenges of a more complex and dynamically changing environment. Despite the increasing research interest in, and attempts at systematizing, the phenomenon of professionalization, it still remains unclear what exactly the term means (Dowling et al., 2014). Thus, a consistent concept of professionalization, needed in order to explore the different facets and perspectives of this phenomenon more validly, is lacking. Against this background, the following question emerged: What is a suitable concept of professionalization for analysing the ongoing processes of change, adaptation or transformation in sport federations? Methods: To address this question, the following two-step approach was chosen: (1) In a first step, the scholarly perspective on the professionalisation of sport organisations is presented in order to explore the common ground as well as the divergences and inconsistencies in previous approaches; a literature review serves this purpose. (2) In a second step, and in contrast to previous studies, we consider a practical point of view through a so-called second-order observation of experts to gain valuable insights into current thinking and acting towards professionalization in sport federations. In doing so, a hermeneutical approach is used, which aims at understanding the meaning of contexts by grasping the everyday world and drawing insight and meaning from it (Shilbury et al., 2013). In accordance with hermeneutics, the explorative, interpretive knowledge of expert interviews was used. The interviews were conducted with a sample of six selected experts who have both dedicated insider knowledge and an overall view of all Swiss sport federations. Results and discussion: The literature review could be categorized into two research currents. One defines professionalization as a structural process towards the professional status of occupations; the other defines it in a broader sense as an organisational change towards a business-like approach. Whereas there is broad scientific consensus on the first perspective, the second is less clearly delineated. Explorative analysis of the interview data reveals different themes in relation to the professionalization of sport federations. The first theme deals with a changed philosophy, namely a more strategic alignment towards for-profit, efficiency and quality orientation. The second theme refers to paid work, associated with more competence orientation and a balanced governance between paid and voluntary work. The third theme deals with a shift towards more rationalization and efficiency through the implementation of innovative management and communication tools. Based on the findings of both our review of the scholarly perspective and the insights from experts, we provide, by way of synthesis, a clearer understanding of what professionalization means, which can be useful for further studies. References: Dowling, M., Edwards, J., & Washington, M. (2014). Understanding the concept of professionalisation in sport management research. Sport Management Review, 17(4), 520–529. Shilbury, D., Ferkins, L., & Smythe, L. (2013). Sport governance encounters: Insights from lived experiences. Sport Management Review, 16, 349–363. Shilbury, D., & Ferkins, L. (2011). Professionalisation, sport governance and strategic capability. Managing Leisure, 16, 108–127.
Abstract:
Multi-objective optimization algorithms aim at finding Pareto-optimal solutions. Recovering Pareto fronts or Pareto sets from a limited number of function evaluations is a challenging problem. A popular approach in the case of expensive-to-evaluate functions is to appeal to metamodels. Kriging has been shown to be efficient as a basis for sequential multi-objective optimization, notably through infill sampling criteria that balance exploitation and exploration, such as the Expected Hypervolume Improvement. Here we consider Kriging metamodels not only for selecting new points, but as a tool for estimating the whole Pareto front and quantifying how much uncertainty remains on it at any stage of Kriging-based multi-objective optimization algorithms. Our approach relies on the Gaussian process interpretation of Kriging and builds on conditional simulations. Using concepts from random set theory, we propose to adapt the Vorob’ev expectation and deviation to capture the variability of the set of non-dominated points. Numerical experiments illustrate the potential of the proposed workflow, and it is shown on examples how Gaussian process simulations and the estimated Vorob’ev deviation can be used to monitor the ability of Kriging-based multi-objective optimization algorithms to accurately learn the Pareto front.
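To give a flavor of the approach, the sketch below draws conditional Gaussian process simulations of two toy objectives on a 1-D design space and computes, for each candidate point, the frequency with which it is non-dominated across simulations; thresholding this coverage function at a quantile is the spirit of the Vorob’ev expectation of the random Pareto-optimal set, though not the paper's exact construction. Kernel, observations and simulation counts are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def sq_exp(a, b, ell=0.3):
    """Squared-exponential covariance between 1-D input vectors."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def conditional_sims(x_obs, y_obs, x_new, n_sims):
    """Draw conditional (posterior) GP sample paths at x_new."""
    K = sq_exp(x_obs, x_obs) + 1e-8 * np.eye(x_obs.size)
    Ks = sq_exp(x_new, x_obs)
    mu = Ks @ np.linalg.solve(K, y_obs)
    cov = sq_exp(x_new, x_new) - Ks @ np.linalg.solve(K, Ks.T)
    w, V = np.linalg.eigh(cov)                  # robust square root of cov
    A = V * np.sqrt(np.clip(w, 0.0, None))
    return mu[:, None] + A @ rng.standard_normal((x_new.size, n_sims))

def non_dominated(Y):
    """Mask of non-dominated rows of Y (componentwise minimization)."""
    mask = np.ones(Y.shape[0], dtype=bool)
    for i in range(Y.shape[0]):
        dominated_by = np.all(Y <= Y[i], axis=1) & np.any(Y < Y[i], axis=1)
        mask[i] = not dominated_by.any()
    return mask

x_obs = np.array([0.05, 0.30, 0.55, 0.80, 0.95])     # toy design points
x_new = np.linspace(0.0, 1.0, 60)
S1 = conditional_sims(x_obs, np.sin(3.0 * x_obs), x_new, 200)          # objective 1
S2 = conditional_sims(x_obs, np.cos(2.0 * x_obs) + x_obs, x_new, 200)  # objective 2

# Coverage function: how often each candidate is non-dominated across the
# joint simulations of both objectives.
freq = np.mean([non_dominated(np.column_stack([S1[:, s], S2[:, s]]))
                for s in range(200)], axis=0)
print(np.round(freq, 2))
```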
Abstract:
AIMS To assess incidence rates (IRs) of, and identify risk factors for, incident severe hypoglycaemia in patients with type 2 diabetes newly treated with antidiabetic drugs. METHODS Using the UK-based General Practice Research Database, we performed a retrospective cohort study between 1994 and 2011 and a nested case-control analysis. Ten controls from the population at risk were matched to each case with recorded severe hypoglycaemia during follow-up on general practice, years of history in the database and calendar time. Using multivariate conditional logistic regression analyses, we adjusted for potential confounders. RESULTS Of 130,761 patients with newly treated type 2 diabetes (mean age 61.7 ± 13.0 years), 690 (0.5%) had an incident episode of severe hypoglycaemia recorded [estimated IR 11.97 (95% confidence interval, CI, 11.11-12.90) per 10,000 person-years (PYs)]. The IR was markedly higher in insulin users [49.64 (95% CI, 44.08-55.89) per 10,000 PYs] than in patients not using insulin [8.03 (95% CI, 7.30-8.84) per 10,000 PYs]. Based on the results of the nested case-control analysis, increasing age [≥ 75 vs. 20-59 years; adjusted odds ratio (OR), 2.27; 95% CI, 1.65-3.12], cognitive impairment/dementia (adjusted OR, 2.00; 95% CI, 1.37-2.91), renal failure (adjusted OR, 1.34; 95% CI, 1.04-1.71), current use of sulphonylureas (adjusted OR, 4.45; 95% CI, 3.53-5.60) and current insulin use (adjusted OR, 11.83; 95% CI, 9.00-15.54) were all associated with an increased risk of severe hypoglycaemia. CONCLUSIONS Severe hypoglycaemia was recorded in 12 cases per 10,000 PYs. Risk factors for severe hypoglycaemia included increasing age, renal failure, cognitive impairment/dementia, and current use of insulin or sulphonylureas.
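A crude incidence rate and an exact (Garwood) Poisson confidence interval of the kind quoted above can be computed as in the sketch below; the person-years figure is not reported in the abstract and is back-calculated from IR ≈ 11.97 per 10,000 PYs, so it is only an assumption for this illustration.

```python
from scipy.stats import chi2

def incidence_rate(events, person_years, per=10_000, alpha=0.05):
    """Crude IR with an exact (Garwood) Poisson confidence interval."""
    rate = events / person_years * per
    lo = chi2.ppf(alpha / 2.0, 2 * events) / 2.0 / person_years * per
    hi = chi2.ppf(1.0 - alpha / 2.0, 2 * (events + 1)) / 2.0 / person_years * per
    return rate, lo, hi

# 690 events as reported; ~576,400 PYs is back-calculated from the
# published IR and is a hypothetical input, not a figure from the study.
rate, lo, hi = incidence_rate(690, 576_400)
print(f"IR = {rate:.2f} (95% CI {lo:.2f}-{hi:.2f}) per 10,000 PYs")
```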