870 results for Time Based Management (TBM)


Relevance: 40.00%

Abstract:

BACKGROUND: Vitamin D deficiency is prevalent in HIV-infected individuals, and vitamin D supplementation is proposed as part of standard care. This study aimed to characterize the kinetics of 25(OH)D in a cohort of HIV-infected individuals of European ancestry, in order to better define the influence of genetic and non-genetic factors on 25(OH)D levels. These data were used to optimize vitamin D supplementation so as to reach therapeutic targets. METHODS: 1,397 25(OH)D plasma levels and relevant clinical information were collected from 664 participants during routine medical follow-up visits. Participants were genotyped for 7 SNPs in 4 genes known to be associated with 25(OH)D levels. 25(OH)D concentrations were analysed using a population pharmacokinetic approach. The percentage of individuals with 25(OH)D concentrations within the recommended range of 20-40 ng/ml during 12 months of follow-up was evaluated by simulation for several dosage regimens. RESULTS: A one-compartment model with linear absorption and elimination, integrating endogenous baseline plasma concentrations, was used to describe 25(OH)D pharmacokinetics. Covariate analyses confirmed the effect of seasonality, body mass index, smoking habits, the analytical method, darunavir/ritonavir and the genetic variant in GC (rs2282679) on 25(OH)D concentrations. 11% of the inter-individual variability in 25(OH)D levels was explained by seasonality and other non-genetic covariates, and 1% by genetics. The optimal supplementation for severely vitamin D deficient patients was 300,000 IU twice per year. CONCLUSIONS: This analysis identified factors associated with 25(OH)D plasma levels in HIV-infected individuals. Improvements to the dosage regimen and timing of vitamin D supplementation are proposed based on these results.
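A one-compartment model with first-order absorption and elimination plus an endogenous baseline, as described above, can be sketched as follows. This is only an illustrative simulation: the parameter values (ka, ke, volume, baseline) and the IU-to-microgram conversion are hypothetical placeholders, not the fitted estimates from the study.

```python
import math

def conc_25ohd(t_days, dose_iu, ka=0.5, ke=0.02, vd=80.0, baseline=15.0):
    """Illustrative one-compartment model for 25(OH)D (ng/ml) after a single
    oral vitamin D dose, on top of an endogenous baseline concentration.
    All parameter values are hypothetical placeholders."""
    dose_ug = dose_iu / 40.0  # hypothetical 40 IU ~ 1 ug conversion
    # Bateman equation: first-order absorption (ka) and elimination (ke)
    term = (dose_ug * ka) / (vd * (ka - ke))
    return baseline + term * (math.exp(-ke * t_days) - math.exp(-ka * t_days))

# Peak concentration occurs at tmax = ln(ka/ke) / (ka - ke)
tmax = math.log(0.5 / 0.02) / (0.5 - 0.02)
```

A simulation of this kind, repeated across dosing intervals, is what allows comparing regimens such as the 300,000 IU twice-yearly schedule mentioned in the conclusions.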

Relevance: 40.00%

Abstract:

BACKGROUND: Recent neuroimaging studies suggest that value-based decision-making may rely on mechanisms of evidence accumulation. However, no studies have explicitly investigated the time at which single decisions are taken based on such an accumulation process. NEW METHOD: Here, we outline a novel electroencephalography (EEG) decoding technique which is based on accumulating the probability of appearance of prototypical voltage topographies and can be used for predicting subjects' decisions. We use this approach to study the time-course of single decisions during a task in which subjects were asked to compare reward vs. loss points for accepting or rejecting offers. RESULTS: We show that, based on this new method, we can accurately decode decisions for the majority of the subjects. The typical time-period for accurate decoding was modulated by task difficulty on a trial-by-trial basis. Typical latencies of when decisions are made were detected at ∼500 ms for 'easy' vs. ∼700 ms for 'hard' decisions, well before subjects' response (∼340 ms). Importantly, this decision time correlated with the drift rates of a diffusion model, evaluated independently at the behavioral level. COMPARISON WITH EXISTING METHOD(S): We compare the performance of our algorithm with logistic regression and support vector machines, and show that we obtain significant results for a higher number of subjects than with these two approaches. We also carry out analyses at the average event-related potential level, for comparison with previous studies on decision-making. CONCLUSIONS: We present a novel approach for studying the timing of value-based decision-making, by accumulating patterns of topographic EEG activity at the single-trial level.
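The core idea of accumulating per-timepoint evidence for one decision over the other until a threshold is crossed can be sketched as below. This is a generic evidence-accumulation sketch under assumed inputs, not the authors' pipeline: the prototype-matching probabilities, the log-ratio accumulation and the threshold value are all hypothetical.

```python
import numpy as np

def accumulate_decision(topo_probs, threshold=5.0):
    """Sketch of single-trial evidence accumulation over EEG topographies.

    topo_probs: array of shape (n_timepoints, 2), where each row holds the
    probability that the observed topography matches the 'accept' vs. the
    'reject' prototype (hypothetical inputs). Accumulates the log-probability
    ratio over time and returns (decision, timepoint) at the first crossing
    of +/- threshold, or (None, None) if no decision is reached."""
    evidence = np.cumsum(np.log(topo_probs[:, 0] / topo_probs[:, 1]))
    crossed = np.nonzero(np.abs(evidence) >= threshold)[0]
    if crossed.size == 0:
        return None, None
    t = int(crossed[0])
    return ("accept" if evidence[t] > 0 else "reject"), t
```

Under this reading, harder trials produce weaker per-timepoint evidence, so the threshold is crossed later, matching the ∼500 ms vs. ∼700 ms latencies reported above.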

Relevance: 40.00%

Abstract:

BACKGROUND: Diabetes represents an increasing health burden worldwide. In 2010, the Public Health Department of the canton of Vaud (Switzerland) launched a regional diabetes programme entitled "Programme cantonal Diabète" (PcD), with the objectives of both decreasing the incidence of diabetes and improving care for patients with diabetes. The cohort entitled CoDiab-VD emerged from that programme. It specifically aims to follow the quality of diabetes care over time, to evaluate the coverage of the PcD within the canton, and to assess the impact of the PcD on the care of patients with diabetes. METHODS/DESIGN: CoDiab-VD is a prospective population-based cohort study. Patients with diabetes were recruited in two waves (autumn 2011 to summer 2012) through community pharmacies. Eligible participants were non-institutionalised adult patients (≥ 18 years) with diabetes diagnosed for at least one year, residing in the canton of Vaud and coming to a participating pharmacy with a diabetes-related prescription. Women with gestational diabetes and people with obvious cognitive impairment or an insufficient command of French were not eligible. Self-reported data included the following primary outcomes: processes-of-care indicators (annual checks) and outcomes of care such as HbA1C, (health-related) quality of life measures (Short Form-12 Health Survey (SF-12), Audit of Diabetes-Dependent Quality of Life 19 (ADDQoL)) and the Patient Assessment of Chronic Illness Care (PACIC). Data on diabetes, health status, healthcare utilisation, health behaviour, self-management activities and support, knowledge of or participation in campaigns/activities proposed by the PcD, and socio-demographics were also obtained. For consenting participants, physicians provided a few additional pieces of information about processes and laboratory results. Participants will be followed once a year via a mailed self-report questionnaire. 
The core of the follow-up questionnaires will be similar to the baseline questionnaire, with the addition of thematic modules adapted to the development of the PcD. Physicians will be contacted every 2 years. DISCUSSION: CoDiab-VD will provide a broad picture of the care of patients with diabetes, as well as of their needs regarding their chronic condition. The data will be used to evaluate the PcD and help prioritise targeted actions. TRIAL REGISTRATION: This study is registered with ClinicalTrials.gov, identifier NCT01902043, July 9, 2013.

Relevance: 40.00%

Abstract:

The time required to image large samples is an important limiting factor in SPM-based systems. In multiprobe setups, especially when working with biological samples, this drawback can make it impossible to conduct certain experiments. In this work, we present a feedforward controller based on bang-bang and adaptive controls. The controls are based on the difference between the maximum speeds that can be used for imaging depending on the flatness of the sample zone. Topographic images of Escherichia coli bacteria samples were acquired using the implemented controllers. Results show that scanning faster in the flat zones, rather than using a constant scanning speed for the whole image, speeds up the imaging of large samples by up to a factor of 4.
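The bang-bang part of such a controller reduces to switching between two speeds based on a local flatness measure. The following is a minimal sketch of that idea only; the flatness threshold, the two speeds, and the use of a simple height range as the roughness measure are hypothetical choices, not the paper's implementation.

```python
def select_scan_speed(heights, flat_threshold=2.0, v_flat=40.0, v_rough=10.0):
    """Bang-bang speed selection for SPM scanning (illustrative sketch).

    heights: recent topography samples (nm) along the fast-scan axis.
    If the local height range stays below flat_threshold (nm), the zone is
    treated as flat and the high speed (um/s) is used; otherwise the safe
    low speed is used. All values are hypothetical placeholders."""
    roughness = max(heights) - min(heights)
    return v_flat if roughness < flat_threshold else v_rough
```

An adaptive variant would additionally tune `v_flat` and `v_rough` from the tracking error of the feedback loop, which is the direction the abstract's "adaptive controls" suggests.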

Relevance: 40.00%

Abstract:

AIMS: We aimed to assess the prevalence and management of clinical familial hypercholesterolaemia (FH) among patients with acute coronary syndrome (ACS). METHODS AND RESULTS: We studied 4778 patients with ACS from a multi-centre cohort study in Switzerland. Based on personal and familial history of premature cardiovascular disease and LDL-cholesterol levels, two validated algorithms for the diagnosis of clinical FH were used: the Dutch Lipid Clinic Network algorithm to assess possible (score 3-5 points) or probable/definite FH (>5 points), and the Simon Broome Register algorithm to assess possible FH. At the time of hospitalization for ACS, 1.6% had probable/definite FH [95% confidence interval (CI) 1.3-2.0%, n = 78] and 17.8% possible FH (95% CI 16.8-18.9%, n = 852), respectively, according to the Dutch Lipid Clinic algorithm. The Simon Broome algorithm identified 5.4% (95% CI 4.8-6.1%, n = 259) patients with possible FH. Among 1451 young patients with premature ACS, the Dutch Lipid Clinic algorithm identified 70 (4.8%, 95% CI 3.8-6.1%) patients with probable/definite FH, and 684 (47.1%, 95% CI 44.6-49.7%) patients had possible FH. Excluding patients with secondary causes of dyslipidaemia such as alcohol consumption, acute renal failure, or hyperglycaemia did not change the prevalence. One year after ACS, among 69 survivors with probable/definite FH and available follow-up information, 64.7% were using high-dose statins, 69.0% had decreased LDL-cholesterol by at least 50%, and 4.6% had LDL-cholesterol ≤1.8 mmol/L. CONCLUSION: A phenotypic diagnosis of possible FH is common in patients hospitalized with ACS, particularly among those with premature ACS. Optimized long-term lipid treatment of patients with FH after ACS is required.
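The score-to-category mapping stated in the abstract (possible FH for 3-5 points, probable/definite FH for >5) can be written directly. Computing the Dutch Lipid Clinic Network score itself from clinical criteria is outside this sketch; the "unlikely" label for scores below 3 is an added convention for completeness.

```python
def dlcn_category(score):
    """Map a Dutch Lipid Clinic Network score to the FH categories used in
    the abstract: possible FH for 3-5 points, probable/definite FH for >5.
    Scores below 3 are labeled 'unlikely FH' here for completeness."""
    if score > 5:
        return "probable/definite FH"
    if 3 <= score <= 5:
        return "possible FH"
    return "unlikely FH"
```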

Relevance: 40.00%

Abstract:

This paper presents the current state and development of a prototype web-GIS (Geographic Information System) decision support platform intended for application in natural hazard and risk management, mainly for floods and landslides. The platform uses open-source geospatial software and technologies, particularly the Boundless (formerly OpenGeo) framework and its client-side software development kit (SDK). Its main purpose is to assist experts and stakeholders in the decision-making process for the evaluation and selection of risk management strategies through an interactive participation approach, integrating a web-GIS interface with a decision support tool based on compromise programming. The access rights and functionality of the platform vary depending on the roles and responsibilities of stakeholders in managing the risk. The application of the prototype platform is demonstrated on an example case study site: the Malborghetto Valbruna municipality of north-eastern Italy, where flash floods and landslides are frequent, with major events having occurred in 2003. Preliminary feedback collected from stakeholders in the region is discussed to understand their perspectives on the proposed prototype platform.
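Compromise programming, as used in the decision support tool above, ranks alternatives by their weighted distance from an ideal point. The sketch below is a generic textbook formulation under assumed inputs; the criteria, weights, and strategy names are hypothetical, not the platform's actual configuration.

```python
def compromise_rank(alternatives, weights, p=2):
    """Rank alternatives by compromise programming (illustrative sketch).

    alternatives: dict name -> list of criterion scores (higher = better).
    weights: importance weight per criterion. Each alternative's distance
    to the ideal point (best observed value per criterion) is computed with
    an L_p metric after range normalisation; smaller distance ranks first."""
    names = list(alternatives)
    n_crit = len(weights)
    best = [max(alternatives[a][i] for a in names) for i in range(n_crit)]
    worst = [min(alternatives[a][i] for a in names) for i in range(n_crit)]

    def distance(name):
        total = 0.0
        for i in range(n_crit):
            rng = best[i] - worst[i] or 1.0  # avoid division by zero
            total += (weights[i] * (best[i] - alternatives[name][i]) / rng) ** p
        return total ** (1.0 / p)

    return sorted(names, key=distance)

# Hypothetical example: three flood-mitigation strategies scored on
# effectiveness and cost-acceptability (both higher = better).
ranking = compromise_rank(
    {"dike": [0.9, 0.3], "early-warning": [0.6, 0.9], "no-action": [0.1, 1.0]},
    weights=[0.6, 0.4])
```

In a participatory setting, different stakeholder groups would typically supply different weight vectors, and the platform compares the resulting rankings.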

Relevance: 40.00%

Abstract:

The primary objective is to identify the critical factors that affect the performance measurement system. It is important to make correct decisions about measurement systems in a complex business environment, as the performance measurement system involves highly complex, non-linear factors. The Six Sigma methodology is seen as one potential approach at every organisational level; it is linked to performance and financial measurement, as well as to the analytical thinking on which the management viewpoint depends. The complex systems are connected to the customer relationship study. As the primary throughput, a new, well-defined performance measurement structure will be facilitated, as will an analytical multifactor system. These critical factors should also be seen as a business innovation opportunity. This master's thesis is divided into two theoretical parts. The empirical part combines action-oriented and constructive research approaches with an empirical case study. The secondary objective is to seek a competitive advantage factor using a new analytical tool and Six Sigma thinking. Process and product capabilities are linked to the contribution of the complex system, and the critical barriers are identified by the performance measurement system. The secondary throughput can be recognised as product and process cost efficiencies, which are achieved through a management advantage. The performance measurement potential is related to different productivity analyses, and productivity can be seen as an essential part of the competitive advantage factor.

Relevance: 40.00%

Abstract:

BACKGROUND AND AIMS: Data from prospective cohorts describing dyslipidaemia prevalence and treatment trends are lacking. Using data from the prospective CoLaus study, we aimed to examine changes in serum lipid levels, dyslipidaemia prevalence and management in a population-based sample of Swiss adults. METHODS AND RESULTS: Cardiovascular risk was assessed using PROCAM. Dyslipidaemia and low-density lipoprotein cholesterol (LDL-C) target levels were defined according to the Swiss Group for Lipids and Atherosclerosis. Complete baseline and follow-up (FU) data were available for n = 4863 subjects over a mean FU time of 5.6 years. Overall, 32.1% of participants were dyslipidaemic at baseline vs. 46.3% at FU (p < 0.001). During this time, lipid-lowering medication (LLM) rates among dyslipidaemic subjects increased from 34.0% to 39.2% (p < 0.001). In secondary prevention, LLM rates were 42.7% at baseline and 53.2% at FU (p = 0.004). In multivariate analysis, LLM use among dyslipidaemic subjects between baseline and FU was positively associated with a personal history of CVD, older age, hypertension, higher BMI and diabetes, and negatively associated with a higher educational level. Among treated subjects, LDL-C target achievement was positively associated with diabetes and negatively associated with a personal history of CVD and higher BMI. Among subjects treated at baseline, LLM discontinuation was negatively associated with older age, male sex, smoking, hypertension and a parental history of CVD. CONCLUSIONS: In Switzerland, the increase over time in dyslipidaemia prevalence was not paralleled by a similar increase in LLM use. In a real-life setting, dyslipidaemia management remains far from optimal, both in primary and secondary prevention.

Relevance: 40.00%

Abstract:

INTRODUCTION: Dispatch-assisted cardiopulmonary resuscitation (DA-CPR) plays a key role in out-of-hospital cardiac arrests. We sought to measure dispatchers' performance in a criteria-based system in recognizing cardiac arrest and delivering DA-CPR. Our secondary purpose was to identify the factors that hampered dispatchers' identification of cardiac arrests, the factors that prevented them from proposing DA-CPR, and the factors that prevented bystanders from performing CPR. METHODS AND RESULTS: We reviewed dispatch recordings for 1254 out-of-hospital cardiac arrests occurring between January 1, 2011 and December 31, 2013. Dispatchers correctly identified cardiac arrests in 71% of the reviewed cases, and in 84% of the cases in which they were able to assess patient consciousness and breathing. The median time to recognition of the arrest was 60 s. The median time to the start of chest compressions was 220 s. CONCLUSIONS: This study demonstrates that the performance of a criteria-based dispatch system can be similar to that of a medical-priority dispatch system regarding out-of-hospital cardiac arrest (OHCA) recognition time and DA-CPR delivery. Agonal breathing recognition remains the weakest link in this sensitive task in both systems. It is of prime importance that all dispatch centers strive not only to implement DA-CPR but also to acquire tools that help them reach this objective, as today it should be mandatory to offer this service to the community. In order to improve benchmarking opportunities, we propose additions to previously proposed performance standards.

Relevance: 40.00%

Abstract:

This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of space, time and space-time clusters of environmental point data. The developed clustering methods were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Normally, environmental phenomena can be modelled as stochastic point processes where each event, e.g. the forest fire ignition point, is characterised by its spatial location and occurrence in time. Additionally, information such as burned area, ignition causes, land use, topographic, climatic and meteorological features, etc., can also be used to characterise the studied phenomenon. Thereby, the space-time pattern characterisation represents a powerful tool to understand the distribution and behaviour of the events and their correlation with underlying processes, for instance, socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global (e.g. the Morisita Index, the Box-counting fractal method, the multifractal formalism and the Ripley's K-function) and local (e.g. Scan Statistics) analysis. Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are of global character and do not consider complex spatial constraints, high variability and the multivariate nature of the events. 
Therefore, we proposed a statistical framework that takes into account the complexities of the geographical space where phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with different constrained geographical spaces, hence assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructures, and for predicting fire ignition susceptibility. In this regard, the main objective of this Thesis was to carry out basic statistical/geospatial research with a strong applied component, to analyse and describe complex phenomena as well as to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. Thus, this Thesis provides a response to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in 5 scientific journals. National and international collaborations were also established and successfully accomplished.
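As a concrete instance of the global measures named above, the Morisita index of dispersion can be sketched as follows. This is an illustrative implementation of the standard quadrat-count index, not the thesis's adapted version with Validity Domain constraints.

```python
def morisita_index(counts):
    """Morisita index of dispersion over Q quadrat counts (sketch).

    I = Q * sum(n_i * (n_i - 1)) / (N * (N - 1)), where n_i is the number
    of points in quadrat i and N the total point count. Values near 1
    indicate a random pattern, >1 a clustered pattern, <1 a regular one."""
    q = len(counts)
    n = sum(counts)
    if n < 2:
        raise ValueError("need at least two points")
    return q * sum(c * (c - 1) for c in counts) / (n * (n - 1))
```

Applied at a range of quadrat sizes, the index profile characterises the scale dependence of clustering, which is the kind of global space-time diagnostic the framework builds on.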

Relevance: 40.00%

Abstract:

The extension of traditional data mining methods to time series has been effectively applied to a wide range of domains such as finance, econometrics, biology, security, and medicine. Many existing mining methods deal with the task of change-point detection, but very few provide a flexible approach. Querying specific change points with linguistic variables is particularly useful in crime analysis, where intuitive, understandable, and appropriate detection of changes can significantly improve the allocation of resources for timely and concise operations. In this paper, we propose an on-line method for detecting and querying change points in crime-related time series with the use of a meaningful representation and a fuzzy inference system. Change-point detection is based on a shape space representation, and linguistic terms describing geometric properties of the change points are used to express queries, offering the advantage of intuitiveness and flexibility. An empirical evaluation is first conducted on a crime data set to confirm the validity of the proposed method, and then on a financial data set to test its general applicability. A comparison to a similar change-point detection algorithm and a sensitivity analysis are also conducted. Results show that the method is able to accurately detect change points at very low computational cost. More broadly, the detection of specific change points within time series of virtually any domain is made more intuitive and more understandable, even for experts not related to data mining.
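For readers unfamiliar with on-line change-point detection, a minimal generic detector is sketched below using CUSUM. This is not the paper's method (the paper uses a shape space representation with fuzzy linguistic queries, not reproduced here); the target, threshold and drift values are hypothetical.

```python
def cusum_changepoints(series, target, threshold=5.0, drift=0.5):
    """Generic on-line CUSUM change detector (illustrative only).

    Maintains positive and negative cumulative sums of the deviation of
    each observation from `target`, with `drift` as slack against noise,
    and records an alarm index whenever either sum exceeds `threshold`.
    Sums are reset after each alarm."""
    pos = neg = 0.0
    alarms = []
    for i, x in enumerate(series):
        pos = max(0.0, pos + (x - target) - drift)
        neg = max(0.0, neg + (target - x) - drift)
        if pos > threshold or neg > threshold:
            alarms.append(i)
            pos = neg = 0.0  # restart detection after each alarm
    return alarms
```

A fuzzy layer like the paper's would then describe each detected point with linguistic terms (e.g. "abrupt", "gradual") derived from its local geometry, enabling queries over the alarms.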

Relevance: 40.00%

Abstract:

Purchasing and supply management (PSM) has become increasingly important for companies to survive in the current highly competitive market. Increased outsourcing has extended the role of PSM, making external resource management and supplier relationships critical success factors in business. However, recent research has mainly concentrated on large enterprises; the PSM issues related to medium-sized enterprises therefore represent a significant research area. The thesis aims to explore the status and role of PSM in Finnish medium-sized firms, understand how strategic companies consider PSM to be, clarify the competence requirements for PSM professionals, and increase the understanding of the PSM capabilities needed from the points of view of individual competence and organisational capabilities. The study uses data collected in 2007 from purchasing executives at the director/CEO level, representing a sample of 94 Finnish firms. 54% of the respondent enterprises had a supply strategy. The total supply cost was on average 60% of firms' turnover. Centralisation of PSM and outsourcing of logistics will increase in Finnish medium-sized enterprises. The findings point out that Finnish medium-sized enterprises had strategic features of PSM; however, Finnish firms have not concentrated on making strategies that relate to PSM. The elements that explain the existence of a supply strategy could be identified in this study. It can be concluded that there is an advantageous base for the development of strategic PSM, because nearly all the enterprises were of the opinion that PSM capabilities have an effect on business success. When reviewing the organisational capabilities, the five most important development elements were supplier relationships, operational processes, strategic processes, time management, and personnel's competence. Training in internationalisation, strategic management, and communication could help to improve the competences of PSM personnel.

Relevance: 40.00%

Abstract:

Coastal birds are an integral part of coastal ecosystems, which nowadays are subject to severe environmental pressures. Effective measures for the management and conservation of seabirds and their habitats call for insight into their population processes and the factors affecting their distribution and abundance. Central to national and international management and conservation measures is the availability of accurate data and information on bird populations, as well as on environmental trends and on measures taken to solve environmental problems. In this thesis I address different aspects of the occurrence, abundance, population trends and breeding success of waterbirds breeding on the Finnish coast of the Baltic Sea, and discuss the implications of the results for seabird monitoring, management and conservation. In addition, I assess the position and prospects of coastal bird monitoring data, in the processing and dissemination of biodiversity data and information in accordance with the Convention on Biological Diversity (CBD) and other national and international commitments. I show that important factors for seabird habitat selection are island area and elevation, water depth, shore openness, and the composition of island cover habitats. Habitat preferences are species-specific, with certain similarities within species groups. The occurrence of the colonial Arctic Tern (Sterna paradisaea) is partly affected by different habitat characteristics than its abundance. Using long-term bird monitoring data, I show that eutrophication and winter severity have reduced the populations of several Finnish seabird species. A major demographic factor through which environmental changes influence bird populations is breeding success. Breeding success can function as a more rapid indicator of sublethal environmental impacts than population trends, particularly for long-lived and slow-breeding species, and should therefore be included in coastal bird monitoring schemes. 
Among my target species, local breeding success can be shown to affect the populations of the Mallard (Anas platyrhynchos), the Eider (Somateria mollissima) and the Goosander (Mergus merganser) after a time lag corresponding to their species-specific recruitment age. For some of the target species, the number of individuals in late summer can be used as an easier and more cost-effective indicator of breeding success than brood counts. My results highlight that the interpretation and application of habitat and population studies require solid background knowledge of the ecology of the target species. In addition, the special characteristics of coastal birds, their habitats, and coastal bird monitoring data have to be considered in the assessment of their distribution and population trends. According to the results, the relationships between the occurrence, abundance and population trends of coastal birds and environmental factors can be quantitatively assessed using multivariate modelling and model selection. Spatial data sets widely available in Finland can be utilised in the calculation of several variables that are relevant to the habitat selection of Finnish coastal species. Concerning some habitat characteristics, field work is still required, due to a lack of remotely sensed data or the low resolution of readily available data in relation to the fine scale of the habitat patches in the archipelago. While long-term data sets exist for water quality and weather, the lack of data concerning, for instance, the food resources of birds hampers more detailed studies of environmental effects on bird populations. Intensive studies of coastal bird species in different archipelago areas should be encouraged. The provision and free delivery of high-quality coastal data concerning bird populations and their habitats would greatly increase the capability of ecological modelling, as well as the management and conservation of coastal environments and communities. 
International initiatives that promote open spatial data infrastructures and sharing are therefore highly regarded. To function effectively, international information networks, such as the biodiversity Clearing House Mechanism (CHM) under the CBD, need to be rooted at regional and local levels. Attention should also be paid to the processing of data for higher levels of the information hierarchy, so that data are synthesized and developed into high-quality knowledge applicable to management and conservation.

Relevance: 40.00%

Abstract:

Network virtualisation is considerably gaining attention as a solution to the ossification of the Internet. However, the success of network virtualisation will depend in part on how efficiently the virtual networks utilise substrate network resources. In this paper, we propose a machine learning-based approach to virtual network resource management. We propose to model the substrate network as a decentralised system and introduce a learning algorithm in each substrate node and substrate link, providing self-organization capabilities. We propose a multiagent learning algorithm that carries out the substrate network resource management in a coordinated and decentralised way. The task of these agents is to use evaluative feedback to learn an optimal policy so as to dynamically allocate network resources to virtual nodes and links. The agents ensure that while the virtual networks have the resources they need at any given time, only the required resources are reserved for this purpose. Simulations show that our dynamic approach significantly improves the virtual network acceptance ratio and the maximum number of accepted virtual network requests at any time, while ensuring that virtual network quality of service requirements such as packet drop rate and virtual link delay are not affected.
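The idea of a per-node agent learning from evaluative feedback can be sketched with a minimal epsilon-greedy learner. This is an illustrative stand-in, not the paper's algorithm: the discrete action set, the reward definition, and all parameter values are hypothetical.

```python
import random

class NodeAgent:
    """Sketch of a substrate-node learning agent (illustrative only).

    Actions are fractions of node capacity to reserve for a hosted virtual
    node. Evaluative feedback rewards reservations that track the observed
    demand, so over time the agent reserves only what is required."""
    ACTIONS = [0.25, 0.5, 0.75, 1.0]  # reservable capacity fractions

    def __init__(self, epsilon=0.1, alpha=0.2):
        self.q = {a: 0.0 for a in self.ACTIONS}  # action-value estimates
        self.epsilon, self.alpha = epsilon, alpha

    def choose(self):
        if random.random() < self.epsilon:  # explore
            return random.choice(self.ACTIONS)
        return max(self.q, key=self.q.get)  # exploit best-known action

    def learn(self, action, demand):
        # Reward penalises both unused reservation and unmet demand.
        reward = -abs(action - demand)
        self.q[action] += self.alpha * (reward - self.q[action])
```

In the decentralised setting described above, one such agent would run per substrate node and link, with coordination added on top so that jointly embedded virtual networks keep their quality-of-service guarantees.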

Relevance: 40.00%

Abstract:

Real-time predictions are an indispensable requirement for traffic management in order to evaluate the effects of the different available strategies or policies. Combining predictions of the network state with the evaluation of different traffic management strategies in the short-term future allows system managers to anticipate the effects of traffic control strategies ahead of time, in order to mitigate the effects of congestion. This paper presents the current framework of decision support systems for traffic management based on short- and medium-term predictions, and includes some reflections on their likely evolution, based on current scientific research and on the evolving availability of new types of data and their associated methodologies.