917 results for two-factor models


Relevance: 80.00%

Publisher:

Abstract:

Flood prediction systems rely on good-quality precipitation input data and forecasts to drive hydrological models. Most precipitation data come from daily stations with good spatial coverage. However, some flood events occur on sub-daily time scales, and flood prediction systems could benefit from using models calibrated on the same time scale. This study compares precipitation data aggregated from hourly stations (HP) and data disaggregated from daily stations (DP) with 6-hourly forecasts from ECMWF over the period 1 October 2006–31 December 2009. The HP and DP data sets were then used to calibrate two hydrological models, LISFLOOD-RR and HBV, and the latter was used in a flood case study. The HP scored better than the DP when evaluated against the forecast for lead times up to 4 days. However, this advantage did not carry over directly to the hydrological modelling, where the models gave similar scores for simulated runoff with the two datasets. The flood forecasting study showed that both datasets gave similar hit rates, whereas the HP data set gave much smaller false alarm rates (FAR). This indicates that using sub-daily precipitation in the calibration and initialisation of hydrological models can improve flood forecasting.
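The hit rates and false alarm rates used to verify the flood forecasts are standard categorical scores computed from a contingency table of forecast versus observed threshold exceedances. A minimal Python sketch with illustrative counts (note that definitions of "FAR" vary; here it is false alarms divided by total forecast events):

```python
def hit_rate(hits, misses):
    """Fraction of observed flood events that were correctly forecast."""
    return hits / (hits + misses)

def false_alarm_ratio(false_alarms, hits):
    """Fraction of forecast events that did not occur (one common FAR definition)."""
    return false_alarms / (false_alarms + hits)

# Hypothetical contingency-table counts: 18 hits, 2 misses, 2 false alarms
print(hit_rate(18, 2))           # 0.9
print(false_alarm_ratio(2, 18))  # 0.1
```

A data set that lowers FAR while keeping the hit rate unchanged, as the HP data did here, improves the forecast without sacrificing event detection.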

Relevance: 80.00%

Publisher:

Abstract:

Cancer cachexia is a multifactorial syndrome that includes muscle wasting and inflammation. As gut microbes influence host immunity and metabolism, we investigated the role of the gut microbiota in the therapeutic management of cancer and associated cachexia. A community-wide analysis of the caecal microbiome in two mouse models of cancer cachexia (acute leukaemia or subcutaneous transplantation of colon cancer cells) identified common microbial signatures, including decreased Lactobacillus spp. and increased Enterobacteriaceae and Parabacteroides goldsteinii/ASF 519. Building on this information, we administered a synbiotic containing inulin-type fructans and live Lactobacillus reuteri 100-23 to leukaemic mice. This treatment restored the Lactobacillus population and reduced the Enterobacteriaceae levels. It also reduced hepatic cancer cell proliferation, muscle wasting and morbidity, and prolonged survival. Administration of the synbiotic was associated with restoration of the expression of antimicrobial proteins controlling intestinal barrier function and gut immunity markers, but did not impact the portal metabolomics imprinting of energy demand. In summary, this study provided evidence that the development of cancer outside the gut can impact intestinal homeostasis and the gut microbial ecosystem and that a synbiotic intervention, by targeting some alterations of the gut microbiota, confers benefits to the host, prolonging survival and reducing cancer proliferation and cachexia.

Relevance: 80.00%

Publisher:

Abstract:

A numerical model embodying the concepts of the Cowley–Lockwood paradigm (Cowley and Lockwood, 1992, 1997) has been used to produce a simple Cowley–Lockwood-type expanding flow pattern and to calculate the resulting change in ion temperature. Cross-correlation, fixed-threshold analysis and a threshold relative to peak are used to determine the phase speed of the change in the convection pattern in response to a change in applied reconnection. Each of these methods fails to fully recover the expansion of the onset of the convection response that is inherent in the simulations. The results of this study indicate that any expansion of the convection pattern will be best observed in time-series data using a threshold that is a fixed fraction of the peak response. We show that these methods for determining the expansion velocity can be used to discriminate between the two main models for the convection response to a change in reconnection.
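The "threshold relative to peak" method favoured above can be sketched in a few lines: the onset of the convection response is taken as the first time the response exceeds a fixed fraction of its own peak value. The time series below is synthetic and purely illustrative:

```python
def onset_time(times, response, fraction=0.5):
    """Return the first time at which `response` exceeds `fraction` of its peak."""
    peak = max(response)
    for t, r in zip(times, response):
        if r >= fraction * peak:
            return t
    return None

# Synthetic, normalised convection response sampled at unit intervals
times = [0, 1, 2, 3, 4, 5]
response = [0.0, 0.1, 0.3, 0.6, 0.9, 1.0]
print(onset_time(times, response, fraction=0.5))  # 3
```

Because the threshold scales with each site's own peak, onset times remain comparable between sites with different response amplitudes, which is why this method recovers the expansion better than a fixed absolute threshold.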

Relevance: 80.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to examine the reasons for the lack of research attention paid to the Middle East (ME) and Africa regions. In particular, this study seeks to identify the reasons for and implications of the paucity of ME- and Africa-based studies in high-quality international journals in the marketing field with a specific focus on the challenges in conducting and publishing research on these regions. Design/methodology/approach – The authors conducted a systematic review of the literature on the ME and Africa regions to identify papers published in 23 high-quality marketing, international business, and advertising journals. This search resulted in 301 articles, among which 125 articles were based on primary or secondary data collected from a local source in those regions. The authors of these 125 articles constitute the Delphi study sample. These academics provided input in an effort to reach a consensus regarding the two proposed models of academic research in both regions. Findings – This paper differs from previous studies, where academic freedom emerged as the most important inhibitor to conducting and publishing research. The most frequently mentioned challenges in conducting research in Africa were access to data, data collection issues, diversity of the region, and lack of research support infrastructure. For the ME, the most often described challenges included validity and reliability of data, language barriers, data collection issues, and availability of a network of researchers. Editors’ and reviewers’ low interest and limited knowledge were ranked high in both regions. South Africa, Israel, and Turkey emerged as outliers, in which research barriers were less challenging than in the rest of the two regions. The authors attribute this difference to the high incidence of US-trained or US-based scholars originating from these countries. 
Originality/value – To the best of the authors' knowledge, no marketing studies have discussed the problems of publishing research on either region in high-quality international journals of marketing, international business, and advertising. Thus, most of the issues the authors discuss in this paper offer new, insightful results while supplementing previous research on the challenges of conducting and publishing research on specific world regions.

Relevance: 80.00%

Publisher:

Abstract:

A new global synthesis and biomization of long (>40 kyr) pollen-data records is presented, and used with simulations from the HadCM3 and FAMOUS climate models to analyse the dynamics of the global terrestrial biosphere and carbon storage over the last glacial–interglacial cycle. Global modelled (BIOME4) biome distributions over time generally agree well with those inferred from pollen data. The two climate models show good agreement in global net primary productivity (NPP). NPP is strongly influenced by atmospheric carbon dioxide (CO2) concentrations through CO2 fertilization. The combined effects of modelled changes in vegetation and (via a simple model) soil carbon result in a global terrestrial carbon storage at the Last Glacial Maximum that is 210–470 Pg C less than in pre-industrial time. Without the contribution from exposed glacial continental shelves the reduction would be larger, 330–960 Pg C. Other intervals of low terrestrial carbon storage include stadial intervals at 108 and 85 ka BP, and between 60 and 65 ka BP during Marine Isotope Stage 4. Terrestrial carbon storage, determined by the balance of global NPP and decomposition, influences the stable carbon isotope composition (δ13C) of seawater because terrestrial organic carbon is depleted in 13C. Using a simple carbon-isotope mass balance equation we find agreement in trends between modelled ocean δ13C based on modelled land carbon storage, and palaeo-archives of ocean δ13C, confirming that terrestrial carbon storage variations may be important drivers of ocean δ13C changes.
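The simple carbon-isotope mass balance invoked above can be illustrated with round numbers: moving isotopically light terrestrial carbon (δ13C near -25 ‰) into the ocean reservoir lowers the reservoir's mean δ13C. A Python sketch, with reservoir sizes and δ13C values that are generic textbook figures rather than the study's:

```python
def mixed_d13c(m_ocean, d_ocean, dm_land, d_land):
    """d13C (permil) of the ocean reservoir after adding dm_land Pg C of land carbon."""
    return (m_ocean * d_ocean + dm_land * d_land) / (m_ocean + dm_land)

# Transfer ~400 Pg C of glacially released terrestrial carbon (d13C = -25 permil)
# into a ~38,000 Pg C ocean reservoir starting at d13C = 0 permil:
shift = mixed_d13c(38000.0, 0.0, 400.0, -25.0)
print(round(shift, 2))  # -0.26 permil
```

Shifts of a few tenths of a permil are resolvable in palaeo-archives of ocean δ13C, which is what makes terrestrial carbon storage a plausible driver of the observed trends.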

Relevance: 80.00%

Publisher:

Abstract:

Increases in cloud optical depth and liquid water path (LWP) are robust features of global warming model simulations in high latitudes, yielding a negative shortwave cloud feedback, but the mechanisms are still uncertain. We assess the importance of microphysical processes for the negative optical depth feedback by perturbing temperature in the microphysics schemes of two aquaplanet models, both of which have separate prognostic equations for liquid water and ice. We find that most of the LWP increase with warming is caused by a suppression of ice microphysical processes in mixed-phase clouds, resulting in reduced conversion efficiencies of liquid water to ice and precipitation. Perturbing the temperature-dependent phase partitioning of convective condensate also yields a small LWP increase. Together, the perturbations in large-scale microphysics and convective condensate partitioning explain more than two-thirds of the LWP response relative to a reference case with increased SSTs, and capture all of the vertical structure of the liquid water response. In support of these findings, we show the existence of a very robust positive relationship between monthly-mean LWP and temperature in CMIP5 models and observations in mixed-phase cloud regions only. In models, the historical LWP sensitivity to temperature is a good predictor of the forced global warming response poleward of about 45°, although models appear to overestimate the LWP response to warming compared to observations. We conclude that in climate models, the suppression of ice-phase microphysical processes that deplete cloud liquid water is a key driver of the LWP increase with warming and of the associated negative shortwave cloud feedback.
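The historical LWP sensitivity to temperature used as a predictor above is, in essence, a regression slope over monthly means. A minimal Python sketch with synthetic data (all values illustrative, not from CMIP5):

```python
def ols_slope(x, y):
    """Ordinary least-squares slope dy/dx."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Synthetic monthly means in a mixed-phase cloud region:
temp = [268.0, 270.0, 272.0, 274.0, 276.0]  # K
lwp = [60.0, 64.0, 68.0, 72.0, 76.0]        # g m-2, increasing with warming
print(ols_slope(temp, lwp))  # 2.0 (g m-2 per K)
```

Comparing this historical slope across models against their forced LWP responses is the emergent-constraint logic described in the abstract.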

Relevance: 80.00%

Publisher:

Abstract:

Magnetic clouds (MCs) are a subset of interplanetary coronal mass ejections (ICMEs) characterised primarily by a smooth rotation in the magnetic field direction, indicative of the presence of a magnetic flux rope. Energetic particle signatures suggest MC flux ropes remain magnetically connected to the Sun at both ends, leading to a widely used model of global MC structure as an extended flux rope, with a loop-like axis stretching out from the Sun into the heliosphere and back to the Sun. The time of flight of energetic particles, however, suggests shorter magnetic field line lengths than such a continuous twisted flux rope would produce. In this study, two simple models are compared with the observed flux rope axis orientations of 196 MCs to show that the flux rope structure is confined to the MC leading edge. The magnetic cloud “legs,” which magnetically connect the flux rope to the Sun, are not recognisable as MCs and thus are unlikely to contain twisted flux rope fields. Spacecraft encounters with these non-flux-rope legs may provide an explanation for the frequent observation of non-magnetic-cloud ICMEs.

Relevance: 80.00%

Publisher:

Abstract:

The retrieval of aerosol optical depth (τa) over land by satellite remote sensing is still a challenge when a high spatial resolution is required. This study presents a tool that uses satellite measurements to dynamically identify the aerosol optical model that best represents the optical properties of the aerosol present in the atmosphere. We use aerosol critical reflectance to identify the single scattering albedo of the aerosol layer. Two case studies show that the Sao Paulo region can have different aerosol properties and demonstrate how the dynamic methodology identifies those differences to obtain a better τa retrieval. The methodology assigned the high single scattering albedo aerosol model (ϖ0(λ = 0.55) = 0.90) to the case where the aerosol source was dominated by biomass burning and the lower ϖ0 model (ϖ0(λ = 0.55) = 0.85) to the case where the local urban aerosol had the dominant influence on the region, as expected. The dynamic methodology was applied using cloud-free data from 2002 to 2005 in order to retrieve τa with the Moderate Resolution Imaging Spectroradiometer (MODIS). These results were compared with collocated data measured by AERONET in Sao Paulo. The comparison shows better results when the dynamic methodology using two aerosol optical models is applied (slope 1.06 ± 0.08, offset 0.01 ± 0.02, r² = 0.6) than when a single, fixed aerosol model is used (slope 1.48 ± 0.11, offset -0.03 ± 0.03, r² = 0.6). In conclusion, the dynamic methodology is shown to work well with two aerosol models. Further studies are necessary to evaluate the methodology in other regions and under different conditions.
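The dynamic selection step can be caricatured as a threshold decision on the single scattering albedo inferred from the critical reflectance. A hypothetical Python sketch (the midpoint threshold and the model labels are assumptions for illustration, not details from the study):

```python
def pick_aerosol_model(inferred_ssa, threshold=0.875):
    """Choose the candidate aerosol optical model whose single scattering
    albedo (SSA) best matches the value inferred for the scene.

    The two candidate SSAs (0.90: biomass-burning-dominated, 0.85: urban)
    come from the study; the midpoint cut at 0.875 is a hypothetical rule.
    """
    return "ssa_0.90_biomass" if inferred_ssa >= threshold else "ssa_0.85_urban"

print(pick_aerosol_model(0.91))  # ssa_0.90_biomass
print(pick_aerosol_model(0.84))  # ssa_0.85_urban
```

Selecting the model per scene, rather than fixing one a priori, is what lets the retrieval track changes in the dominant aerosol source over the region.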

Relevance: 80.00%

Publisher:

Abstract:

We expect to observe parton saturation in a future electron-ion collider. In this Letter we discuss this expectation in more detail, considering two different models which are in good agreement with the existing experimental data on nuclear structure functions. In particular, we study the predictions of saturation effects in electron-ion collisions at high energies, using a generalization for nuclear targets of the b-CGC model, which describes the ep HERA data quite well. We estimate the total, longitudinal and charm structure functions in the dipole picture and compare them with the predictions obtained using collinear factorization and modern sets of nuclear parton distributions. Our results show that inclusive observables are not very useful in the search for saturation effects. In the small-x region they are very difficult to disentangle from the predictions of the collinear approaches. This happens mainly because of the large uncertainties in the determination of the nuclear parton distribution functions. On the other hand, our results indicate that the contribution of diffractive processes to the total cross section is about 20% at large A and small Q², allowing for a detailed study of diffractive observables. The study of diffractive processes thus becomes essential to observe parton saturation.

Relevance: 80.00%

Publisher:

Abstract:

In this paper we present a novel approach for multispectral image contextual classification by combining iterative combinatorial optimization algorithms. The pixel-wise decision rule is defined using a Bayesian approach to combine two MRF models: a Gaussian Markov Random Field (GMRF) for the observations (likelihood) and a Potts model for the a priori knowledge, to regularize the solution in the presence of noisy data. Hence, the classification problem is stated according to a Maximum a Posteriori (MAP) framework. In order to approximate the MAP solution we apply several combinatorial optimization methods using multiple simultaneous initializations, making the solution less sensitive to the initial conditions and reducing both computational cost and time in comparison to Simulated Annealing, which is often unfeasible in many real image processing applications. Markov Random Field model parameters are estimated by a Maximum Pseudo-Likelihood (MPL) approach, avoiding manual adjustment of the regularization parameters. Asymptotic evaluations assess the accuracy of the proposed parameter estimation procedure. To test and evaluate the proposed classification method, we adopt metrics for quantitative performance assessment (Cohen's Kappa coefficient), allowing a robust and accurate statistical analysis. The obtained results clearly show that combining sub-optimal contextual algorithms significantly improves the classification performance, indicating the effectiveness of the proposed methodology.
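One widely used sub-optimal combinatorial optimiser of the kind such schemes combine is Iterated Conditional Modes (ICM): each pixel's label is repeatedly set to the one minimising a Gaussian data term plus a Potts disagreement penalty over its 4-neighbourhood. A self-contained Python sketch (the class means, β, and the toy image are illustrative, not the paper's setup):

```python
def icm(obs, means, beta=1.0, iters=5):
    """MAP-style labelling: minimise (obs - mean)^2 + beta * label disagreements."""
    h, w = len(obs), len(obs[0])
    # Initialise with the pixel-wise maximum-likelihood label
    lab = [[min(range(len(means)), key=lambda k: (obs[i][j] - means[k]) ** 2)
            for j in range(w)] for i in range(h)]
    for _ in range(iters):
        for i in range(h):
            for j in range(w):
                def energy(k):
                    e = (obs[i][j] - means[k]) ** 2  # Gaussian data term
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if 0 <= ni < h and 0 <= nj < w and lab[ni][nj] != k:
                            e += beta                # Potts smoothness term
                    return e
                lab[i][j] = min(range(len(means)), key=energy)
    return lab

# Noisy two-class image: left half near 0, right half near 1,
# with one noisy pixel (0.8) inside the "0" region.
obs = [[0.1, 0.0, 0.9, 1.0],
       [0.0, 0.8, 1.0, 0.9],
       [0.1, 0.2, 0.9, 1.1]]
print(icm(obs, [0.0, 1.0], beta=1.0))
# [[0, 0, 1, 1], [0, 0, 1, 1], [0, 0, 1, 1]]
```

ICM converges fast but only to a local optimum, which is exactly why the paper runs several such optimisers from multiple simultaneous initializations instead of relying on any single one.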

Relevance: 80.00%

Publisher:

Abstract:

In this paper we study the accumulated claim in some fixed time period, dropping the classical assumption of mutual independence between the variables involved. Two basic models are considered: Model 1 assumes that any pair of claims is equally correlated, which means that the corresponding square-integrable sequence is an exchangeable one. Model 2 states that the correlations between adjacent claims are the same. Recurrence relations and explicit expressions for the joint probability generating function are derived, and the impact of the dependence parameter (correlation coefficient) in both models is examined. The Markov binomial distribution is obtained as a particular case under the assumptions of Model 2.
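The Markov binomial special case of Model 2 is easy to compute by dynamic programming: 0/1 claim indicators form a two-state Markov chain, and we track the distribution of the accumulated number of claims. A Python sketch; the parameterisation of the transition probabilities below (chosen so that the marginal is stationary at p and adjacent indicators have correlation ρ) is one standard convention, and the parameter values are illustrative:

```python
def markov_binomial_pmf(n, p, rho):
    """P(S_n = k) where S_n counts 1s in a stationary two-state Markov chain
    with marginal P(X_t = 1) = p and corr(X_t, X_{t+1}) = rho."""
    p11 = p + rho * (1 - p)   # P(1 -> 1)
    p01 = p * (1 - rho)       # P(0 -> 1)
    # dp[(state, k)] = probability of being in `state` with k claims so far
    dp = {(1, 1): p, (0, 0): 1 - p}
    for _ in range(n - 1):
        nxt = {}
        for (s, k), pr in dp.items():
            up = p11 if s == 1 else p01
            nxt[(1, k + 1)] = nxt.get((1, k + 1), 0.0) + pr * up
            nxt[(0, k)] = nxt.get((0, k), 0.0) + pr * (1 - up)
        dp = nxt
    pmf = [0.0] * (n + 1)
    for (s, k), pr in dp.items():
        pmf[k] += pr
    return pmf

pmf = markov_binomial_pmf(3, 0.3, 0.2)
print(round(sum(pmf), 10))                               # 1.0
print(round(sum(k * q for k, q in enumerate(pmf)), 10))  # 0.9 (= n * p)
```

With ρ = 0 the construction collapses to independent Bernoulli trials and the pmf reduces to the ordinary binomial distribution.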

Relevance: 80.00%

Publisher:

Abstract:

This thesis is an application of the Almost Ideal Demand System approach of Deaton and Muellbauer (1980) to a particular pharmaceutical, Citalopram, in which Gorman's (1971) multi-stage budgeting approach is applied, since it is one of the most useful approaches for estimating demand for differentiated products. Citalopram is an antidepressant drug used in the treatment of major depression. As for most other pharmaceuticals whose patent has expired, there exist branded and generic versions of Citalopram. This thesis aims to define its demand system with two-stage models for the branded version and five generic versions, and to show whether the generic versions are able to compete with the branded version. I calculated the own-price elasticities, which made it possible to compare and draw conclusions about consumers' choices between the branded and generic drugs. Even though the models need to be developed further with some additional variables, the estimation results and uncompensated price elasticities indicate that the branded version still has power in the market, and that generics are able to compete with lower prices. One important point that has to be taken into consideration is that the Swedish pharmaceutical market underwent a reform on October 1, 2002, aimed at making consumers better informed about prices and decreasing overall expenditure on pharmaceuticals. Since there were not enough generic sales before the reform to include in the calculation, this thesis covers sales after the reform.
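The uncompensated own-price elasticities mentioned above can be recovered from the estimated AIDS share-equation coefficients. A Python sketch using a common linear approximation (the formula assumes approximately constant budget shares, and the coefficient values below are hypothetical, not the thesis estimates):

```python
def aids_own_price_elasticity(gamma_ii, beta_i, w_i):
    """Approximate uncompensated (Marshallian) own-price elasticity of good i
    in an AIDS model: e_ii = -1 + gamma_ii / w_i - beta_i, where w_i is the
    budget share, gamma_ii the own-price coefficient and beta_i the
    expenditure coefficient."""
    return -1.0 + gamma_ii / w_i - beta_i

# Hypothetical branded-version share equation:
# budget share 0.4, own-price coefficient -0.08, expenditure coefficient 0.05
print(round(aids_own_price_elasticity(-0.08, 0.05, 0.4), 2))  # -1.25
```

An elasticity below -1 (elastic demand) would indicate that consumers do switch away when the branded price rises, which is the kind of evidence the thesis uses to judge whether generics can compete.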

Relevance: 80.00%

Publisher:

Abstract:

The aim of this study was to investigate what gives employees well-being and positive emotions in the workplace. The study was carried out through qualitative interviews with six employees at four banking and insurance offices in central Sweden. The results were analysed with the help of previous research and theory. The results show that what creates positive emotions and well-being is, to a large extent, relationships, both with colleagues and with customers. Positive emotions can spread widely among colleagues, customers and throughout the organisation. The study also showed that a balance between work and private life, and between demands and control, is important. Viewing well-being at work from an interactional perspective, integrating both the individual and the organisation, also proved important for working successfully with these issues.

Relevance: 80.00%

Publisher:

Abstract:

Objectives: To translate and evaluate the psychometric properties of the Swedish version of the Fear of Complications Questionnaire. Design: Cross-sectional study design and scale development. Setting: In total, 469 adults (response rate 63.5%) with Type 1 diabetes completed the questionnaires. Participants were recruited from two university hospitals in Sweden. Participants: Eligible patients were those who met the following inclusion criteria: diagnosed with Type 1 diabetes, diabetes duration of at least 1 year and aged at least 18 years. Methods: The Fear of Complications Questionnaire was translated using the forward-backward translation method. Factor analyses of the questionnaire were performed in two steps, using both exploratory and confirmatory factor analysis. Convergent validity was examined using the Hospital Anxiety and Depression Scale and the Hypoglycaemia Fear Survey. Internal consistency was estimated using Cronbach's alpha. Results: Exploratory factor analysis supported a two-factor solution. One factor contained three items having to do with fear of kidney-related complications, and one factor included the rest of the items, concerning fear of other diabetes-related complications as well as fear of complications in general. Internal consistency was high (Cronbach's alpha = 0.96). The findings also gave support for convergent validity, with significant positive correlations between measures (r = 0.51 to 0.54). Conclusion: The clinical relevance of the identified two-factor model, with a structure of one dominant subdomain, may be considered. We suggest, however, a one-factor model covering all the items as a relevant basis for assessing fear of complications among people with Type 1 diabetes.
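The internal-consistency statistic reported above, Cronbach's alpha, is k/(k-1) times one minus the ratio of the sum of item variances to the variance of the total score. A minimal Python sketch with a tiny, purely illustrative item-response matrix:

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of per-item score lists,
    one inner list per questionnaire item (same respondent order)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Total score per respondent across all items
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three items answered by four respondents (higher = more fear):
items = [[1, 2, 3, 4],
         [1, 2, 3, 4],
         [2, 2, 3, 5]]
print(round(cronbach_alpha(items), 2))  # 0.98
```

Values this close to 1, like the 0.96 reported, indicate that the items measure a single underlying construct very consistently, which is also consistent with the authors' suggestion of a one-factor model.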

Relevance: 80.00%

Publisher:

Abstract:

A three-dimensional, time-dependent hydrodynamic and heat transport model of Lake Binaba, a shallow and small dam reservoir in Ghana, has been developed, emphasizing the simulation of dynamics and thermal structure. Most numerical studies of temperature dynamics in reservoirs are based on one- or two-dimensional models. These models are not applicable to reservoirs characterized by complex flow patterns and unsteady heat exchange between the atmosphere and the water surface. Continuity, momentum and temperature transport equations have been solved. Proper assignment of boundary conditions, especially surface heat fluxes, has been found crucial in simulating the lake's hydrothermal dynamics. The model is based on the Reynolds-averaged Navier-Stokes equations, using a Boussinesq approach, with a standard k−ε turbulence closure to solve the flow field. The thermal model includes a heat source term that accounts for short-wave radiation, as well as heat convection at the free surface, which is a function of air temperature, wind velocity and the stability conditions of the atmospheric boundary layer over the water surface. The governing equations of the model have been solved with OpenFOAM, an open-source, freely available CFD toolbox. At its core, OpenFOAM has a set of efficient C++ modules that are used to build solvers. It uses collocated, polyhedral numerics that can be applied on unstructured meshes and can easily be extended to run in parallel. A new solver has been developed to solve the hydrothermal model of the lake. The simulated temperature was compared against a 15-day field data set. Simulated and measured temperature profiles at the probe locations show reasonable agreement. The model might be able to compute the total heat storage of water bodies to estimate evaporation from the water surface.
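The wind- and temperature-dependent surface heat convection described above is commonly parameterised with a bulk-aerodynamic formula; a minimal Python sketch under that assumption (the neutral-stability transfer coefficient and the sample conditions are typical illustrative values, not the study's):

```python
def sensible_heat_flux(u_wind, t_air, t_surface,
                       rho_air=1.2, cp_air=1005.0, c_h=1.3e-3):
    """Bulk-aerodynamic sensible heat flux in W m-2 (positive = heats the water):
    H = rho_air * cp_air * C_h * U * (T_air - T_surface)."""
    return rho_air * cp_air * c_h * u_wind * (t_air - t_surface)

# 3 m/s wind, air 2 K warmer than the lake surface:
print(round(sensible_heat_flux(3.0, 303.0, 301.0), 1))  # 9.4 W m-2
```

In the full model this term enters the free-surface boundary condition alongside the short-wave radiation source, and the transfer coefficient would additionally depend on the stability of the atmospheric boundary layer.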