929 results for Distributed lag model


Relevance:

90.00%

Publisher:

Abstract:

The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at http://www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found suitable for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and that avoids over-channelization, and therefore produces more realistic extents. The choices of datasets and algorithms are open to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly required for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained on the basis of lower-quality DEMs with 25 m resolution.
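As an illustration of the spreading step, here is a minimal sketch of Holmgren's multiple-flow-direction weighting, the base algorithm that Flow-R modifies. The function name, the 3x3 window convention, and the exponent value are illustrative assumptions, not Flow-R's actual API:

```python
import numpy as np

def holmgren_weights(dem_window, cell_size=10.0, x=4.0):
    """Holmgren (1994) multiple-flow-direction weights for the centre cell
    of a 3x3 DEM window: each lower neighbour receives flow in proportion
    to tan(slope)**x; x=1 disperses flow widely, large x approaches a
    single flow direction."""
    centre = dem_window[1, 1]
    weights = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            if i == 1 and j == 1:
                continue
            # diagonal neighbours are sqrt(2) cells away, cardinal ones 1
            dist = cell_size * (np.sqrt(2.0) if i != 1 and j != 1 else 1.0)
            drop = centre - dem_window[i, j]
            if drop > 0:                  # only downslope cells receive flow
                weights[i, j] = (drop / dist) ** x
    total = weights.sum()
    return weights / total if total > 0 else weights

# Example: a plane dipping to the east; most flow goes due east
dem = np.array([[12.0, 10.0, 8.0],
                [12.0, 10.0, 8.0],
                [12.0, 10.0, 8.0]])
w = holmgren_weights(dem)
```

Raising the exponent x concentrates the weights on the steepest neighbour, which is one way such an algorithm can limit over-dispersion on flat DEM cells.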

Relevance:

90.00%

Publisher:

Abstract:

In this paper, we present a top-down approach for integrated process modelling and distributed process execution. The integrated process model can be utilized for global monitoring and visualization, and the distributed process models for local execution. Our main focus in this paper is how the approach supports the automatic generation and linking of distributed process models from an integrated process definition.

Relevance:

90.00%

Publisher:

Abstract:

This paper contributes to the literature by empirically examining whether the influence of public debt on economic growth differs between the short and the long run, and whether it presents different patterns across euro-area countries. To this end, we use annual data from both central and peripheral countries of the European Economic and Monetary Union (EMU) for the 1960-2012 period and estimate a growth model augmented with public debt using the Autoregressive Distributed Lag (ARDL) bounds testing approach. Our findings tend to support the view that public debt always has a negative impact on the long-run performance of EMU countries, whilst its short-run effect may be positive depending on the country.
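The short-run versus long-run distinction in an ARDL model can be sketched on synthetic data. The coefficients below are hypothetical, and plain OLS stands in for the full bounds-testing procedure; the point is only how a positive short-run coefficient can coexist with a negative long-run multiplier:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate ARDL(1,1): y_t = c + phi*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t
c, phi, b0, b1 = 1.0, 0.5, -0.4, 0.1    # hypothetical coefficients
T = 5000
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = c + phi * y[t-1] + b0 * x[t] + b1 * x[t-1] + 0.1 * rng.normal()

# OLS on the ARDL regression (lagged y, current and lagged x)
X = np.column_stack([np.ones(T - 1), y[:-1], x[1:], x[:-1]])
coef, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
c_hat, phi_hat, b0_hat, b1_hat = coef

# Long-run multiplier of x on y: (b0 + b1) / (1 - phi) = -0.6 here,
# negative in the long run even though b1 alone is positive
lrm = (b0_hat + b1_hat) / (1 - phi_hat)
```

The long-run multiplier divides the summed x coefficients by one minus the autoregressive coefficient, which is why short-run and long-run effects can differ in sign and magnitude.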

Relevance:

80.00%

Publisher:

Abstract:

Background: Patients with chronic obstructive pulmonary disease (COPD) can have recurrent disease exacerbations triggered by several factors, including air pollution. Visits to the emergency respiratory department can be a direct result of short-term exposure to air pollution. The aim of this study was to investigate the relationship between the daily number of COPD emergency department visits and the daily ambient air concentrations of PM10, SO2, NO2, CO and O3 in the city of Sao Paulo, Brazil. Methods: The sample data were collected between 2001 and 2003 and are categorised by gender and age. Generalised linear Poisson regression models were adopted to control for both short- and long-term seasonal changes as well as for temperature and relative humidity. The non-linear dependencies were controlled using a natural cubic spline function. Third-degree polynomial distributed lag models were adopted to estimate both lag structures and the cumulative effects of air pollutants. Results: PM10 and SO2 readings showed both acute and lagged effects on COPD emergency department visits. Interquartile range increases in their concentrations (28.3 µg/m³ and 7.8 µg/m³, respectively) were associated with cumulative 6-day increases of 19% and 16% in COPD admissions, respectively. An effect on women was observed at lag 0, and among the elderly the lag period was noted to be longer. Increases in CO concentration showed impacts in the female and elderly groups. NO2 and O3 presented mild effects on the elderly and on women, respectively. Conclusion: These results indicate that air pollution affects health in a gender- and age-specific manner and should be considered a relevant risk factor that exacerbates COPD in urban environments.
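The third-degree polynomial (Almon) distributed lag construction can be sketched as follows. The lag length, coefficients and noise level are illustrative assumptions, not values from the study:

```python
import numpy as np

def almon_design(x, n_lags=6, degree=3):
    """Almon / polynomial distributed lag design: lag coefficients are
    constrained to a degree-`degree` polynomial in the lag index, so only
    degree+1 parameters are estimated instead of n_lags+1."""
    T = len(x)
    # lag matrix: column l holds x_{t-l}, rows start once all lags exist
    L = np.column_stack([x[n_lags - l : T - l] for l in range(n_lags + 1)])
    # polynomial basis evaluated at lags 0..n_lags: columns 1, l, l^2, l^3
    P = np.vander(np.arange(n_lags + 1), degree + 1, increasing=True)
    return L @ P, P

rng = np.random.default_rng(1)
x = rng.normal(size=500)
true_beta = 0.48 - 0.08 * np.arange(7)      # linearly decaying lag effects
Lmat = np.column_stack([x[6 - l : 500 - l] for l in range(7)])
y = Lmat @ true_beta + 0.1 * rng.normal(size=Lmat.shape[0])

Z, P = almon_design(x)
a, *_ = np.linalg.lstsq(Z, y, rcond=None)
beta_hat = P @ a               # recovered lag-by-lag coefficients
cumulative = beta_hat.sum()    # cumulative (6-day) effect, true value 1.68
```

Summing the recovered lag coefficients gives the cumulative multi-day effect, which is the quantity the abstract reports as a percentage increase per interquartile range.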

Relevance:

80.00%

Publisher:

Abstract:

Predicted area under the curve (AUC), mean transit time (MTT) and normalized variance (CV²) data have been compared for parent compound and generated metabolite following an impulse input into the liver. Models studied were the well-stirred (tank) model, the tube model, a distributed tube model, the dispersion model (Danckwerts and mixed boundary conditions) and the tanks-in-series model. It is well known that discrimination between models for a parent solute is greatest when the parent solute is highly extracted by the liver. With the metabolite, the greatest model differences for MTT and CV² occur when the parent solute is poorly extracted. In all cases the predictions of the distributed tube, dispersion, and tanks-in-series models lie between the predictions of the tank and tube models. The dispersion model with mixed boundary conditions yields identical predictions to those of the distributed tube model (assuming an inverse Gaussian distribution of tube transit times). The dispersion model with Danckwerts boundary conditions and the tanks-in-series model give predictions similar to those of the dispersion (mixed boundary conditions) and distributed tube models. The normalized variance for the parent compound is dependent upon hepatocyte permeability only within a distinct range of permeability values. This range is similar for each model, but the order of magnitude predicted for the normalized variance is model dependent. Only for a one-compartment system is the MTT for generated metabolite equal to the sum of the MTTs for the parent compound and for preformed metabolite administered as parent.
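For the tanks-in-series model specifically, the MTT and CV² predictions can be checked numerically from its impulse response, which is an Erlang (gamma) density. The parameter values below are illustrative:

```python
import math
import numpy as np

def tanks_in_series_moments(n_tanks, mtt, t_max=200.0, dt=0.001):
    """Impulse response of n equal well-stirred tanks in series is an
    Erlang (gamma) density; theory predicts a mean transit time of `mtt`
    and a normalized variance CV^2 of 1/n_tanks."""
    t = np.arange(dt, t_max, dt)
    k = n_tanks / mtt                      # rate constant of each tank
    f = k**n_tanks * t**(n_tanks - 1) * np.exp(-k * t) / math.factorial(n_tanks - 1)
    auc = f.sum() * dt                     # area under the curve (~1)
    mean = (t * f).sum() * dt / auc        # numerical MTT
    var = ((t - mean) ** 2 * f).sum() * dt / auc
    return mean, var / mean**2             # MTT, CV^2

mtt_num, cv2 = tanks_in_series_moments(n_tanks=5, mtt=10.0)
```

As the number of tanks grows, CV² = 1/n shrinks and the model's predictions move from tank-like toward tube-like behaviour, consistent with the intermediate position described above.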

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: Myocardial infarction is an acute and severe cardiovascular disease that generally leads to patient admissions to intensive care units, with few cases initially admitted to infirmaries. The objective of the study was to assess whether estimates of air pollution effects on myocardial infarction morbidity are modified by the source of health information. METHODS: The study was carried out in hospitals of the Brazilian Health System in the city of São Paulo, Southern Brazil. A time series study (1998-1999) was performed using two outcomes: infarction admissions to infirmaries and to intensive care units, both for people older than 64 years of age. Generalized linear models controlling for seasonality (long- and short-term trends) and weather were used. The eight-day cumulative effects of air pollutants were assessed using third-degree polynomial distributed lag models. RESULTS: Almost 70% of daily hospital admissions due to myocardial infarction were to infirmaries. Despite that, the effects of air pollutants on infarction were higher for intensive care unit admissions. All pollutants were positively associated with the study outcomes, but SO2 presented the strongest statistically significant association. An interquartile range increase in SO2 concentration was associated with increases of 13% (95% CI: 6-19) and 8% (95% CI: 2-13) in intensive care unit and infirmary infarction admissions, respectively. CONCLUSIONS: It may be assumed there is a misclassification of myocardial infarction admissions to infirmaries, leading to overestimation. Also, despite the smaller absolute number of events, intensive care unit admissions data provide a more adequate estimate of the magnitude of air pollution effects on infarction admissions.
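The reported percentage increases follow directly from the Poisson log-linear link. A minimal sketch, assuming a hypothetical SO2 interquartile range of 10 µg/m³ (the actual IQR is not stated in this abstract):

```python
import math

def pct_increase(beta, delta):
    """Percent increase in daily admissions for a `delta` increase in
    pollutant concentration under a Poisson log-linear model:
    RR = exp(beta * delta), %increase = 100 * (RR - 1)."""
    return 100.0 * (math.exp(beta * delta) - 1.0)

# Back out the coefficient implied by the reported 13% rise in ICU
# admissions per interquartile increase in SO2, for a hypothetical
# IQR of 10 ug/m3 (the actual IQR is not stated in the abstract).
iqr = 10.0
beta = math.log(1.13) / iqr
increase = pct_increase(beta, iqr)   # recovers 13%
```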

Relevance:

80.00%

Publisher:

Abstract:

ABSTRACT OBJECTIVE To analyze the impact of air pollution on respiratory and cardiovascular morbidity of children and adults in the city of Vitoria, state of Espirito Santo. METHODS A study was carried out using time-series models via Poisson regression from hospitalization and pollutant data in Vitoria, ES, Southeastern Brazil, from 2001 to 2006. Particulate matter (PM10), sulfur dioxide (SO2), and ozone (O3) were tested as independent variables in simple and cumulative lags of up to five days. Temperature, humidity and variables indicating weekdays and city holidays were added as control variables in the models. RESULTS For each increment of 10 µg/m³ in the pollutants PM10, SO2, and O3, the percentage relative risk (%RR) for hospitalizations due to total respiratory diseases increased 9.67 (95%CI 7.54-11.84), 6.98 (95%CI 4.17-9.98) and 1.93 (95%CI 0.93-2.95), respectively. We found %RR = 6.60 (95%CI 3.75-9.53), %RR = 5.19 (95%CI 1.50-9.01), and %RR = 3.68 (95%CI 2.31-5.07) for respiratory diseases in children under the age of five years for PM10, SO2, and O3, respectively. Cardiovascular diseases showed a significant relationship with O3, with %RR = 2.11 (95%CI 1.06-3.18). CONCLUSIONS Respiratory diseases presented a stronger and more consistent relationship with the pollutants researched in Vitoria. A better dose-response relationship was observed when using cumulative lags in polynomial distributed lag models.

Relevance:

80.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies

Relevance:

80.00%

Publisher:

Abstract:

A network of 25 sonic stage sensors was deployed in the Squaw Creek basin upstream of Ames, Iowa, to determine whether the state-of-the-art distributed hydrological model CUENCAS can produce reliable information for all road crossings, including those that cross small creeks draining basins as small as 1 sq. mile. A hydraulic model was implemented for the major tributaries of the Squaw Creek where IFC sonic instruments were deployed, and it was coupled to CUENCAS to validate the predictions made at small tributaries in the basin. This study demonstrates that the predictions made by the hydrological model at internal locations in the basin are as accurate as the predictions made at the outlet of the basin. Final rating curves based on surveyed cross sections were developed for the 22 IFC bridge sites that are currently operating, and routine forecasts are provided at those locations (see IFIS). Rating curves were developed for 60 additional bridge locations in the basin; however, we do not use those rating curves for routine forecasts because the accuracy of the LiDAR-derived cross sections is insufficient. The results of our work form the basis for two papers that have been submitted for publication to the Journal of Hydrological Engineering. Peer review of our work will give a strong footing to our ability to expand our results from the pilot Squaw Creek basin to all basins in Iowa.
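A power-law rating curve of the kind developed for the bridge sites can be fitted by least squares in log space. The stage-discharge pairs and zero-flow stage below are synthetic, not survey data:

```python
import numpy as np

def fit_rating_curve(stage, discharge, h0=0.0):
    """Fit a power-law rating curve Q = a * (h - h0)**b by linear least
    squares in log space; h0 is the stage of zero flow, assumed known."""
    lx = np.log(np.asarray(stage) - h0)
    ly = np.log(np.asarray(discharge))
    b, log_a = np.polyfit(lx, ly, 1)       # slope = b, intercept = log(a)
    return np.exp(log_a), b

# Synthetic stage-discharge pairs generated from Q = 2.5*(h - 0.3)^1.8
h = np.array([0.5, 0.8, 1.2, 1.7, 2.3, 3.0])
q = 2.5 * (h - 0.3) ** 1.8
a, b = fit_rating_curve(h, q, h0=0.3)      # recovers a = 2.5, b = 1.8
```

The fit quality hinges on the accuracy of the underlying cross sections, which is why surveyed sections could support routine forecasts while the LiDAR-derived ones could not.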

Relevance:

80.00%

Publisher:

Abstract:

Already in ancient Greece, Hippocrates postulated that disease showed a seasonal pattern characterised by excess winter mortality. Since then, several studies have confirmed this finding, and it was generally accepted that the increase in winter mortality was mostly due to respiratory infections and seasonal influenza. More recently, it was shown that cardiovascular disease (CVD) mortality also displays such seasonality, and that the magnitude of the seasonal effect increases from the poles to the equator. The recent study by Yang et al assessed CVD mortality attributable to ambient temperature using daily data from 15 cities in China for the years 2007-2013, including nearly two million CVD deaths. A high temperature variability between and within cities can be observed (figure 1). They used sophisticated statistical methodology to account for the complex temperature-mortality relationship: first, distributed lag non-linear models combined with quasi-Poisson regression to obtain city-specific estimates, taking into account temperature, relative humidity and atmospheric pressure; then, a meta-analysis to obtain the pooled estimates. The results confirm the winter excess mortality as reported by the Eurowinter and other groups, but they show that the magnitude of ambient temperature.
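The cross-basis idea behind distributed lag non-linear models can be sketched with polynomial bases standing in for the splines actually used; the dimensions and degrees below are illustrative assumptions:

```python
import numpy as np

def cross_basis(temp, max_lag=3, var_degree=2, lag_degree=2):
    """Tensor-product cross-basis of a polynomial basis in temperature and
    a polynomial basis in lag, a simplified stand-in for the spline
    cross-bases of distributed lag non-linear models."""
    T = len(temp)
    # lagged temperature matrix: column l holds temp at lag l
    Q = np.column_stack([temp[max_lag - l : T - l] for l in range(max_lag + 1)])
    # polynomial basis over the lag dimension: columns 1, l, l^2
    C = np.vander(np.arange(max_lag + 1), lag_degree + 1, increasing=True)
    blocks = []
    for p in range(1, var_degree + 1):     # temperature basis: temp, temp^2
        blocks.append((Q ** p) @ C)        # cross every power with every lag column
    return np.hstack(blocks)

temps = np.random.default_rng(2).normal(20, 5, size=365)
X = cross_basis(temps)                     # regressors for a quasi-Poisson fit
```

Each column of the cross-basis captures one combination of a non-linear exposure shape and a lag shape, so a single regression fit describes the whole temperature-lag-mortality surface.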

Relevance:

80.00%

Publisher:

Abstract:

We investigate the importance of the labour mobility of inventors, as well as the scale, extent and density of their collaborative research networks, for regional innovation outcomes. To do so, we apply a knowledge production function framework at the regional level and include inventors’ networks and their labour mobility as regressors. Our empirical approach takes full account of spatial interactions by estimating a spatial lag model, combined where necessary with a spatial error model. In addition, standard errors are calculated using spatial heteroskedasticity and autocorrelation consistent estimators to ensure their robustness in the presence of spatial error autocorrelation and heteroskedasticity of unknown form. Our results point to the existence of a robust positive correlation between intraregional labour mobility and regional innovation, whilst the relationship with networks is less clear. However, networking across regions positively correlates with a region’s innovation intensity.
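A spatial lag model can be written y = ρWy + Xβ + ε, with reduced form y = (I − ρW)⁻¹(Xβ + ε). A minimal simulation sketch, with a hypothetical random weight matrix standing in for real regional contiguity:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50

# Hypothetical sparse spatial weight matrix, row-standardised so each
# row sums to one (real applications use regional contiguity or distance)
W = rng.random((n, n)) * (rng.random((n, n)) < 0.1)
np.fill_diagonal(W, 0.0)
W = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)

rho, beta = 0.4, 2.0                 # spatial autocorrelation and slope
x = rng.normal(size=n)               # e.g. intraregional labour mobility
eps = 0.1 * rng.normal(size=n)

# Spatial lag model y = rho*W*y + x*beta + eps has the reduced form
# y = (I - rho*W)^{-1} (x*beta + eps), which is what we simulate from
y = np.linalg.solve(np.eye(n) - rho * W, x * beta + eps)
```

Because y appears on both sides, OLS on the structural equation is inconsistent, which is why spatial lag models are estimated by maximum likelihood or instrumental variables rather than plain least squares.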

Relevance:

80.00%

Publisher:

Abstract:

Traditional psychometric theory and practice classify people according to broad ability dimensions but do not examine how these mental processes occur. Hunt and Lansman (1975) proposed a 'distributed memory' model of cognitive processes with emphasis on how to describe individual differences, based on the assumption that each individual possesses the same components. It is in the quality of these components that individual differences arise. Carroll (1974) expands Hunt's model to include a production system (after Newell and Simon, 1973) and a response system. He developed a framework of factor analytic (FA) factors for the purpose of describing how individual differences may arise from them. This scheme is to be used in the analysis of psychometric tests. Recent advances in the field of information processing are examined and include: 1) Hunt's development of differences between subjects designated as high or low verbal; 2) Miller's pursuit of the magic number seven, plus or minus two; 3) Ferguson's examination of transfer and abilities; and 4) Brown's discoveries concerning strategy teaching and retardates. In order to examine possible sources of individual differences arising from cognitive tasks, traditional psychometric tests were searched for a suitable perceptual task which could be varied slightly and administered to gauge learning effects produced by controlling independent variables. It also had to be suitable for analysis using Carroll's framework. The Coding Task (a symbol substitution test) found in the Performance Scale of the WISC was chosen. Two experiments were devised to test the following hypotheses: 1) High verbals should be able to complete significantly more items on the Symbol Substitution Task than low verbals (Hunt and Lansman, 1975). 2) Having previous practice on a task, where strategies involved in the task may be identified, increases the amount of output on a similar task (Carroll, 1974). 3) There should be a substantial decrease in the amount of output as the load on STM is increased (Miller, 1956). 4) Repeated measures should produce an increase in output over trials, and where individual differences in previously acquired abilities are involved, these should differentiate individuals over trials (Ferguson, 1956). 5) Teaching slow learners a rehearsal strategy would improve their learning such that it would resemble that of normals on the same task (Brown, 1974). In the first experiment, 60 subjects were divided into high and low verbal groups, each further divided randomly into a practice group and a nonpractice group. Five subjects in each group were assigned randomly to work on a five-, seven- or nine-digit code throughout the experiment. The practice group was given three trials of two minutes each on the practice code (designed to eliminate transfer effects due to symbol similarity) and then three trials of two minutes each on the actual SST task. The nonpractice group was given three trials of two minutes each on the same actual SST task. Results were analyzed using a four-way analysis of variance. In the second experiment, 18 slow learners were divided randomly into two groups, one group receiving planned strategy practice, the other receiving random practice. Both groups worked on the actual code to be used later in the actual task. Within each group, subjects were randomly assigned to work on a five-, seven- or nine-digit code throughout. Both practice and actual tests consisted of three trials of two minutes each. Results were analyzed using a three-way analysis of variance. It was found in the first experiment that: 1) High or low verbal ability by itself did not produce significantly different results; however, when in interaction with the other independent variables, a difference in performance was noted. 2) The previous practice variable was significant over all segments of the experiment: those who received previous practice were able to score significantly higher than those without it. 3) Increasing the size of the load on STM severely restricts performance. 4) The effect of repeated trials proved to be beneficial; generally, gains were made on each successive trial within each group. 5) In the second experiment, slow learners who were allowed to practice randomly performed better on the actual task than subjects who were taught the code by means of a planned strategy. Upon analysis using the Carroll scheme, individual differences were noted in the ability to develop strategies of storing, searching and retrieving items from STM, and in adopting necessary rehearsals for retention in STM. While these strategies may benefit some, it was found that for others they may be harmful. Temporal aspects and perceptual speed were also found to be sources of variance within individuals. Generally, it was found that the largest single factor influencing learning on this task was the repeated measures. What enables gains to be made varies with individuals. There are environmental factors, specific abilities, strategy development, previous learning, amount of load on STM, and perceptual and temporal parameters which influence learning, and these have serious implications for educational programs.

Relevance:

80.00%

Publisher:

Abstract:

The recent evolution of wavelength selective switches (WSS) is driving the development of multi-degree colorless and directionless reconfigurable optical add/drop multiplexers (ROADM), regarded as highly promising equipment for future wavelength division multiplexing (WDM) mesh networks. However, their asymmetric switching property complicates the routing and wavelength assignment (RWA) problem, and most existing RWA algorithms do not take this asymmetry into account. Service interruptions caused by equipment failures along lightpaths (the outcome of solving the RWA problem) result in the loss of large amounts of data. Research is therefore essential to ensure the survivability of optical networks, that is, the continuity of service, particularly in the event of equipment failures. Most previous publications have focused on protection schemes that guarantee the rerouting of traffic in case of a link failure. However, designing protection against a single link failure is not always sufficient for the survivability of WDM networks, given the many other types of failures that are becoming common nowadays, such as equipment breakdowns and failures of two or three links. In addition, there are considerable challenges in protecting large multi-domain optical networks composed of single-domain networks interconnected by inter-domain links, where the internal topological details of a domain are generally not shared externally.

This thesis aims to propose large-scale optimization models and solutions to the problems mentioned above. These models generate optimal or near-optimal solutions with mathematically proven optimality gaps. To do so, we rely on the column generation technique to solve the inherent large-scale linear programming problems. Regarding provisioning in optical networks, we propose a new integer linear programming (ILP) model for the RWA problem that maximizes the number of accepted requests (GoS - Grade of Service). The resulting model is a large-scale ILP, which makes it possible to obtain exact solutions for fairly large RWA instances, assuming that all nodes are asymmetric and come with a given switching connectivity matrix. We then modify the model and propose a solution to the RWA problem that finds the best switching matrix for a given number of ports and switching connections, while satisfying/maximizing the GoS. With respect to the protection of single-domain networks, we propose solutions for protection against multiple failures. Specifically, we develop the protection of a single-domain network against multiple failures using Failure Independent Path Protecting (FIPP) p-cycles and Failure Dependent Path Protecting (FDPP) protection. We then propose a new flow-based formulation for FDPP p-cycles subject to multiple failures.

The new model raises a scalability problem, as it has an exponential number of constraints due to certain subtour elimination constraints. Consequently, in order to solve it efficiently, we examine: (i) a hierarchical decomposition of the auxiliary (pricing) problem in the decomposition model, and (ii) heuristics to handle the large number of constraints efficiently. Regarding protection in multi-domain networks, we propose protection schemes against single link failures. First, an optimization model is proposed for a centralized protection scheme, assuming that the network management is aware of all details of the physical topologies of the domains. We then propose a distributed optimization model for protection in multi-domain optical networks, a much more realistic formulation because it is based on the assumption of distributed network management. Next, we add shared bandwidth in order to reduce the cost of protection. More precisely, the bandwidth of each intra-domain link is shared between FIPP p-cycles and p-cycles in a first study, and then between paths for link/path protection in a second study. Finally, we recommend parallel strategies for solving large multi-domain optical networks. The results of this work enable the efficient design of a protection scheme for a very large multi-domain network (45 domains), the largest examined in the literature, with both centralized and distributed schemes.

Relevance:

80.00%

Publisher:

Abstract:

This research quantitatively evaluates the water retention capacity and flood control function of forest catchments using hydrological data from large flood events that occurred after serious droughts. The study sites are the Oodo Dam and the Sameura Dam catchments in Japan. A kinematic wave model, which considers saturated and unsaturated sub-surface soil zones, is used for the rainfall-runoff analysis. The results show that the possible storage volume of the Oodo Dam catchment was 162.26 MCM in 2005, while that of the Sameura Dam catchment was 102.83 MCM in 2005 and 102.64 MCM in 2007. The flood control function of the Oodo Dam catchment is 173 mm in water depth in 2005, while that of the Sameura Dam catchment is 114 mm in 2005 and 126 mm in 2007. This indicates that the Oodo Dam catchment retains more than twice the storage capacity of the dam itself (78.4 mm), while the Sameura Dam catchment retains about one-fifth of the dam's storage capacity (693 mm).
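A kinematic wave scheme of the general kind used here can be sketched as an explicit finite-difference routing. The geometry, rating parameters and inflows below are illustrative assumptions, and this is not the study's actual model:

```python
import numpy as np

def kinematic_wave(q_lat, dx=100.0, dt=10.0, n_cells=20, alpha=1.5, m=5/3):
    """Explicit finite-difference kinematic wave routing,
    dA/dt + dQ/dx = q_lat with the rating Q = alpha * A**m;
    returns the outlet hydrograph. A crude sketch for illustration."""
    A = np.zeros(n_cells)                  # wetted area in each channel cell
    outlet = []
    for q in q_lat:                        # lateral inflow per unit length
        Q = alpha * A**m
        inflow = np.concatenate(([0.0], Q[:-1]))   # upstream discharge
        A = np.maximum(A + dt * (q + (inflow - Q) / dx), 0.0)
        outlet.append(alpha * A[-1]**m)
    return np.array(outlet)

# Uniform lateral inflow for 100 steps, then 200 steps of recession
hydro = kinematic_wave(np.concatenate([np.full(100, 1e-4), np.zeros(200)]))
```

The hydrograph rises while lateral inflow persists and recedes afterwards; in the study, sub-surface storage terms added to such a scheme are what allow the catchment's retention capacity to be quantified.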