997 results for Traffic Estimation
Abstract:
The degree of fusion at the anterior aspect of the sacral vertebrae has been scored in 242 male and female skeletons from the Lisbon documented collection, ranging in age from 16 to 59 years. Statistical tests indicate a sex difference towards earlier fusion in young females compared with young males, as well as a clear association between degree of fusion and age. Similar results have been found in documented skeletal samples from Coimbra and Sassari, and the recommendations stated by these authors regarding age estimation have been positively tested in the Lisbon collection. Although more research from geographically diverse samples is required, a general picture of the pattern of sacral fusion and its associations with age and sex is emerging. We also provide a practical example of the usefulness of the sacrum in age estimation in a forensic setting, a mass grave from the Spanish Civil War. It is concluded that scoring the degree of fusion of the sacral vertebrae, especially of S1-S2, can be a simple tool for assigning skeletons to broad age groups, and it should be implemented as another resource for age estimation in the study of human skeletal remains.
Abstract:
Context: Among nosocomial infections, methicillin-resistant Staphylococcus aureus (MRSA) is the pathogen most commonly identified in hospitals worldwide. The MRSA control strategy at the CHUV involves screening patients at risk. With culture-based screening, the waiting time is several days. This causes problems in the management of patient flows, mainly because of the isolation measures. To reduce the waiting time, the hospital is considering a rapid diagnostic method based on the polymerase chain reaction (PCR). Methodology: Data on the screenings performed in three departments during 2007 were used. The number of isolation days was first determined per patient and per department. A cost analysis was then carried out to assess the cost difference between the two methods for each department. Results: The main economic impact of the PCR method depends chiefly on the number of isolation days avoided compared with the culture method. In the care units, the analysis covered 192 screenings. When the difference in isolation days is two days, screening costs decrease by more than 12 kCHF and the number of isolation days decreases by 384. In the interdisciplinary emergency centre, over 96 screenings, the potential saving with the PCR method is 6 kCHF, with a reduction of 192 isolation days. In the adult intensive care unit, PCR screening is the most cost-effective method, with a cost reduction of between 4 kCHF and 20 kCHF and a reduction in isolation days of between 170 and 310. For the three departments analysed, the results show a favourable cost-effectiveness ratio for the PCR method when the reduction in isolation days exceeds 1.3 days. When the difference in isolation days is below 1.3, other parameters must be taken into account, such as the material cost, which must be above 45.5 CHF, and the number of analyses per screening, which must be below 3, for PCR to remain the most attractive alternative. Conclusions: The PCR method shows substantial potential advantages, both economic and organisational, which limit or reduce the constraints associated with the MRSA control strategy at the CHUV. [Author, p. 3]
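As a rough illustration of the break-even arithmetic described above, the following sketch compares culture-based and PCR-based screening costs as a function of the isolation days avoided per patient. The number of screenings is taken from the abstract, but the per-test and per-isolation-day costs are hypothetical placeholders, so the output does not reproduce the CHUV figures.

```python
# Rough culture-vs-PCR break-even sketch. N follows the abstract; all unit
# costs are hypothetical placeholders, not CHUV figures.

def total_cost(n_screenings, test_cost, isolation_days, isolation_day_cost):
    """Testing cost plus the cost of isolation days spent waiting for results."""
    return n_screenings * (test_cost + isolation_days * isolation_day_cost)

N = 192                               # screenings in the care units (from the abstract)
DAY_COST = 30.0                       # assumed cost of one isolation day (CHF)
CULTURE_TEST, PCR_TEST = 15.0, 45.5   # assumed material cost per test (CHF)

for days_saved in (1.0, 1.3, 2.0):    # isolation days avoided per patient with PCR
    saving = (total_cost(N, CULTURE_TEST, days_saved, DAY_COST)
              - total_cost(N, PCR_TEST, 0.0, DAY_COST))
    print(f"{days_saved:.1f} days saved per patient -> "
          f"PCR saves {saving:7.0f} CHF and {N * days_saved:5.0f} isolation days")
```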
Abstract:
[Table of contents] 1. Why look at the inappropriate occupancy of acute-care beds at the CHUV?. - 1.1. Current situation. - 1.1.1. The CHUV figures. - 1.1.2. The patient-flow management unit. - 1.1.3. The unit for patients awaiting placement. - 1.1.4. The shortage of nursing-home beds in the canton of Vaud. - 1.1.5. The ageing of the Vaud population. - 1.2. National and international evidence. - - 2. Cost estimation. - 2.1. Quantifiable costs. - 2.1.1. Direct financial loss. - 2.1.2. Costs of transfers due to overcrowding. - 2.1.3. Opportunity cost. - 2.2. Non-quantifiable costs. - 2.2.1. Patients. - 2.2.2. Medical staff. - 2.2.3. CHUV. - - 3. Proposals. - 3.1. Alternative forms of care. - 3.1.1. Integrated service networks for older people. - 3.1.2. Short geriatric stays. - 3.1.3. Other solutions. - 3.2. Prevention. - 3.2.1. Fall prevention. - 3.2.2. Prevention through information for older people. - 3.2.3. Prevention through information for the whole population
Abstract:
Real-time predictions are an indispensable requirement for traffic management in order to evaluate the effects of the different available strategies or policies. Combining predictions of the network state with the evaluation of different traffic management strategies over the short-term future allows system managers to anticipate the effects of traffic control strategies ahead of time and thus mitigate the effects of congestion. This paper presents the current framework of decision support systems for traffic management based on short- and medium-term predictions, and includes some reflections on their likely evolution, based on current scientific research, the growing availability of new types of data, and their associated methodologies.
Abstract:
In mathematical modeling the estimation of model parameters is one of the most common problems. The goal is to seek parameters that fit the measurements as well as possible. There is always error in the measurements, which implies uncertainty in the model estimates. In Bayesian statistics all the unknown quantities are represented as probability distributions. If there is prior knowledge about the parameters, it can be formulated as a prior distribution. Bayes' rule combines the prior and the measurements into a posterior distribution. Mathematical models are typically nonlinear, so producing statistics for them requires efficient sampling algorithms. This thesis introduces the Metropolis-Hastings (MH) and Adaptive Metropolis (AM) algorithms as well as Gibbs sampling. Different ways to specify prior distributions are also presented. The main issue is the estimation of measurement error and how to obtain prior knowledge of its variance or covariance. Variance and covariance sampling is combined with the algorithms above. Examples of hyperprior models are applied to the estimation of model parameters and measurement error in a case with outliers.
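The following is a minimal random-walk Metropolis-Hastings sketch of the kind of sampler discussed above, fitting the parameters of a nonlinear model with Gaussian measurement error. The model, the synthetic data, the flat prior and the proposal scale are illustrative assumptions, not taken from the thesis.

```python
# Minimal random-walk Metropolis-Hastings sketch for a nonlinear model
# y = a * exp(-b * t) + noise, with a flat prior on (a, b) restricted to positives.
# The model, synthetic data and proposal scale are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 5, 40)
true_params = np.array([2.0, 0.7])
sigma = 0.1                                    # assumed known measurement std
y = true_params[0] * np.exp(-true_params[1] * t) + rng.normal(0, sigma, t.size)

def log_post(theta):
    a, b = theta
    if a <= 0 or b <= 0:                       # flat prior on the positive quadrant
        return -np.inf
    resid = y - a * np.exp(-b * t)
    return -0.5 * np.sum(resid**2) / sigma**2

def metropolis_hastings(n_iter=20000, step=0.05):
    theta = np.array([1.0, 1.0])               # starting point
    lp = log_post(theta)
    chain = np.empty((n_iter, 2))
    for i in range(n_iter):
        prop = theta + rng.normal(0, step, 2)  # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # accept with prob. min(1, ratio)
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

chain = metropolis_hastings()
print("posterior means:", chain[5000:].mean(axis=0))   # discard burn-in
```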
Abstract:
This paper analyses the impact of using different correlation assumptions between lines of business when estimating the risk-based capital reserve, the Solvency Capital Requirement (SCR), under Solvency II regulations. A case study is presented and the SCR is calculated according to the Standard Model approach. The requirement is then calculated using an Internal Model based on a Monte Carlo simulation of the net underwriting result at a one-year horizon, with copulas being used to model the dependence between lines of business. To address the impact of these model assumptions on the SCR we conduct a sensitivity analysis, examining changes in the correlation matrix between lines of business and the choice of copulas. Drawing on aggregate historical data from the Spanish non-life insurance market between 2000 and 2009, we conclude that modifications of the correlation and dependence assumptions have a significant impact on SCR estimation.
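A minimal sketch of the Internal Model idea: simulate the one-year net underwriting result of two lines of business whose dependence is driven by a Gaussian copula, and take the SCR as the 99.5% shortfall relative to the expected result. The loss-ratio margins, correlation and premium volumes below are invented for illustration; they are not the Spanish market data used in the paper.

```python
# Monte Carlo sketch of an internal-model SCR with a Gaussian copula linking
# two lines of business. Margins, correlation and premium volumes are invented.
import numpy as np
from scipy.stats import norm, lognorm

rng = np.random.default_rng(1)
n_sim = 100_000
rho = 0.5                                     # assumed correlation in the copula

# Gaussian copula: correlated normals mapped to uniforms
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=n_sim)
u = norm.cdf(z)

# Lognormal loss-ratio margins for each line (shape/scale are placeholders)
loss_ratio_1 = lognorm.ppf(u[:, 0], s=0.15, scale=0.70)
loss_ratio_2 = lognorm.ppf(u[:, 1], s=0.25, scale=0.65)

premium = np.array([100.0, 60.0])             # premium volume per line (monetary units)
underwriting_result = premium[0] * (1 - loss_ratio_1) + premium[1] * (1 - loss_ratio_2)

# SCR as the 99.5% shortfall of the one-year result relative to its mean
scr = underwriting_result.mean() - np.quantile(underwriting_result, 0.005)
print(f"simulated SCR: {scr:.1f}")
```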
Abstract:
During the last few years, the discussion on the marginal social costs of transportation has been active. Applying the externalities as a tool to control transport would fulfil the polluter-pays principle and simultaneously create a fair control method across transport modes. This report presents the results of two calculation algorithms developed to estimate the marginal social costs based on the externalities of air pollution. The first algorithm calculates future scenarios of sea transport traffic externalities in the Gulf of Finland until 2015. The second algorithm calculates the externalities of Russian passenger car transit traffic via Finland by taking into account both sea and road transport. The algorithm estimates the ship-originated emissions of carbon dioxide (CO2), nitrogen oxides (NOx), sulphur oxides (SOx) and particulates (PM), and the externalities for each year from 2007 to 2015. The total NOx emissions in the Gulf of Finland from the six ship types were almost 75.7 kilotons (Table 5.2) in 2007. The ship types are: passenger (including cruisers and ROPAX vessels), tanker, general cargo, Ro-Ro, container and bulk vessels. Due to the increase in traffic, the NOx emissions estimate for 2015 is 112 kilotons. The NOx emission estimate for the whole of Baltic Sea shipping is 370 kilotons in 2006 (Stipa et al., 2007). The total marginal social costs due to ship-originated CO2, NOx, SOx and PM emissions in the GOF were calculated at almost 175 million euros in 2007. The costs will increase to nearly 214 million euros in 2015 due to the traffic growth. The major part of the externalities is due to CO2 emissions. If we exclude CO2 by subtracting its externalities from the results, the total externalities are 57 million euros in 2007. After eight years (2015), the externalities would be 28% lower, at 41 million euros (Table 8.1). This is the result of regulations reducing the sulphur content of marine fuels. The majority of new car transit goes through Finland to Russia due to the lack of port capacity in Russia. The number of cars was 339,620 vehicles in 2005 (Statistics of Finnish Customs, 2008). The externalities are calculated for the transportation of passenger vehicles as follows: by ship to a Finnish port and, after that, by truck to the Russian border checkpoint. The externalities are between 2 and 3 million euros (year-2000 cost level) for each route. The ports included in the calculations are Hamina, Hanko, Kotka and Turku. With Euro-3 standard trucks, the port of Hanko would be the best choice for transporting the vehicles, because of the lower emissions of newer trucks and the shorter shipping distance. If the trucks are more polluting Euro-1 trucks, the port of Kotka would be the best choice. This indicates that truck emissions have a considerable effect on the externalities and that transporting light cargo, such as passenger cars, by ship produces considerably high emission externalities. The emission externalities approach offers new insight into the valuation of multiple traffic modes. However, the calculation of marginal social costs based on air emission externalities should not be regarded as a ready-made calculation system. The system clearly needs further improvement, but it can already be considered a potential tool for political decision making.
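Structurally, the externality calculation amounts to multiplying the tonnes emitted per pollutant by a unit damage cost and summing, as in the sketch below. Apart from the 2007 NOx total quoted in the abstract, the emission figures and all unit costs are placeholders, so the result does not reproduce the report's estimates.

```python
# Sketch of the externality calculation: kilotonnes emitted per pollutant times
# an assumed unit damage cost (EUR/tonne), summed over pollutants. Only the NOx
# total comes from the abstract; the other figures and all unit costs are invented.
emissions_2007_kt = {"CO2": 3500.0, "NOx": 75.7, "SOx": 25.0, "PM": 2.0}
unit_cost_eur_per_tonne = {"CO2": 30.0, "NOx": 1000.0, "SOx": 2000.0, "PM": 10000.0}

externality_eur = sum(
    emissions_2007_kt[p] * 1000.0 * unit_cost_eur_per_tonne[p]   # kilotonnes -> tonnes
    for p in emissions_2007_kt
)
print(f"total externality: {externality_eur / 1e6:.0f} million EUR")
```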
Abstract:
Congestion costs are emerging as one of the most important challenges faced by metropolitan planners and transport authorities in first-world economies. In the US these costs were as high as 78 billion dollars in 2005 and are growing due to rapidly increasing travel delays. In order to address the current severe levels of congestion, the US Department of Transportation has recently started a program to initiate congestion pricing in five metropolitan areas. In this context it is important to identify the factors that help implementation and success, as well as the problems and difficulties associated with charging projects. In this article we analyze worldwide experiences with urban road charging in order to extract interesting and helpful lessons for policy makers engaged in congestion pricing projects and for those interested in introducing traffic management tools to regulate entry into large cities.
Abstract:
The use of tolls is becoming widespread around the world. Their ability to fund infrastructure projects and to ease budget constraints has been the main rationale behind this renewed interest. However, less attention has been paid to the safety effects of this policy at a time of increasing concern about road fatalities. Pricing the best infrastructure shifts some drivers onto worse alternative roads that are usually not prepared to carry heavy traffic at comparable safety standards. In this paper we provide evidence of this perverse consequence using an international European panel in a two-way fixed-effects estimation.
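The estimation behind this evidence is a two-way fixed-effects panel regression. The sketch below illustrates such a regression with country and year dummies on synthetic data; the variable names and the data-generating process are hypothetical.

```python
# Minimal sketch of a two-way fixed-effects panel regression: fatality rates on
# the tolled share of the motorway network, with country and year effects.
# Variable names and the synthetic data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
countries = [f"C{i}" for i in range(10)]
years = list(range(2000, 2010))
df = pd.DataFrame(
    [(c, y, rng.uniform(0, 1)) for c in countries for y in years],
    columns=["country", "year", "toll_share"],
)
df["fatality_rate"] = 8 + 1.5 * df["toll_share"] + rng.normal(0, 0.5, len(df))

# Two-way fixed effects via country and year dummies
model = smf.ols("fatality_rate ~ toll_share + C(country) + C(year)", data=df).fit()
print("estimated toll_share effect:", model.params["toll_share"])
```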
Abstract:
Automobile bodily injury (BI) claims remain unsettled for a long time after the accident. The estimation of an accurate reserve for Reported But Not Settled (RBNS) claims is therefore vital for insurers. In accordance with the recommendation included in the Solvency II project (CEIOPS, 2007), a statistical model is implemented here for RBNS reserve estimation. Lognormality is observed in empirical compensation cost data for different levels of BI severity. The individual claim provision is estimated by allocating the expected mean compensation for the predicted severity of the victim's injury, for which the upper bound is also computed. BI severity is predicted by means of a heteroscedastic multiple-choice model, because empirical evidence shows that the variability in the latent severity of injured individuals travelling by car is not constant. It is shown that this methodology can improve the accuracy of RBNS reserve estimation at all stages, compared to the subjective assessment traditionally made by practitioners.
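The reserving idea can be sketched as follows: the provision for each open claim is the vector of predicted severity-class probabilities weighted by the expected lognormal compensation of each class. The class probabilities and lognormal parameters below are invented for illustration; in the paper the probabilities would come from the fitted heteroscedastic multiple-choice model.

```python
# Sketch of the RBNS reserving idea: the provision for an open claim is the
# severity-class probabilities (here simply given) weighted by the expected
# lognormal compensation of each class. All numbers are invented.
import numpy as np

# Lognormal parameters (mu, sigma of log-cost) per severity class (placeholders)
severity_params = {"minor": (7.0, 0.6), "moderate": (8.5, 0.8), "severe": (10.0, 1.0)}

def expected_cost(mu, sigma):
    """Mean of a lognormal distribution."""
    return np.exp(mu + 0.5 * sigma**2)

def claim_provision(class_probs):
    """Expected compensation for one claim given predicted class probabilities."""
    return sum(p * expected_cost(*severity_params[c]) for c, p in class_probs.items())

open_claims = [
    {"minor": 0.7, "moderate": 0.25, "severe": 0.05},
    {"minor": 0.2, "moderate": 0.5, "severe": 0.3},
]
rbns_reserve = sum(claim_provision(probs) for probs in open_claims)
print(f"RBNS reserve: {rbns_reserve:,.0f}")
```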
Abstract:
Recent developments in optical communications have allowed simpler optical devices to improve network resource utilization. As such, we propose adding a lambda-monitoring device to a wavelength-routing switch (WRS), allowing better performance when traffic is routed and groomed. This device may allow a WRS to aggregate traffic over optical routes without incurring optical-electrical-optical (OEO) conversion for the existing traffic. In other words, optical routes can be used partially to route demands, creating a sort of "lighttour". In this paper, we compare the number of OEO conversions needed to route a complete given traffic matrix using either lighttours or lightpaths.
Abstract:
In this article, a new technique for grooming low-speed traffic demands into high-speed optical routes is proposed. This enhancement allows a transparent wavelength-routing switch (WRS) to aggregate traffic en route over existing optical routes without incurring expensive optical-electrical-optical (OEO) conversions. This implies that: a) an optical route may be considered as having more than one ingress node (all in line); and b) traffic demands can partially use optical routes to reach their destination. The proposed optical routes are named "lighttours" since the traffic originating from different sources can be forwarded together in a single optical route, i.e., as taking a "tour" over different sources towards the same destination. The possibility of creating lighttours is the consequence of a novel WRS architecture proposed in this article, named "enhanced grooming" (G+). The ability to groom more traffic in the middle of a lighttour is achieved with the support of a simple optical device named the lambda-monitor (previously introduced in the RingO project). In this article, we present the new WRS architecture and its advantages. To compare the advantages of lighttours with respect to classical lightpaths, an integer linear programming (ILP) model is proposed for the well-known multilayer problem: traffic grooming, routing and wavelength assignment. The ILP model may be used for several objectives; however, this article focuses on two: maximizing the network throughput, and minimizing the number of OEO conversions used. Experiments show that G+ can route all the traffic using only half of the total OEO conversions needed by classical grooming. A heuristic is also proposed, aiming at near-optimal results in polynomial time.
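In the spirit of the grooming/routing formulation, the toy integer program below assigns each demand to one candidate route while minimizing the total number of OEO conversions, subject to route capacities (modelled with the PuLP library). The instance, the capacities and the per-pair OEO costs are invented, and the model is far simpler than the multilayer ILP of the article.

```python
# Toy ILP in the spirit of the grooming/routing formulation: each demand picks
# one candidate route; routes have a capacity (in demand units) and each
# (demand, route) pair has a precomputed OEO-conversion cost. The instance is invented.
import pulp

demands = ["d1", "d2", "d3"]
routes = ["r1", "r2"]
capacity = {"r1": 2, "r2": 2}                    # demands per route
oeo_cost = {                                     # OEO conversions if demand d takes route r
    ("d1", "r1"): 0, ("d1", "r2"): 2,
    ("d2", "r1"): 1, ("d2", "r2"): 0,
    ("d3", "r1"): 2, ("d3", "r2"): 1,
}

prob = pulp.LpProblem("grooming_oeo", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (demands, routes), cat="Binary")

# Objective: total OEO conversions over all routed demands
prob += pulp.lpSum(oeo_cost[d, r] * x[d][r] for d in demands for r in routes)
for d in demands:                                # every demand must be routed exactly once
    prob += pulp.lpSum(x[d][r] for r in routes) == 1
for r in routes:                                 # route capacity
    prob += pulp.lpSum(x[d][r] for d in demands) <= capacity[r]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("total OEO conversions:", pulp.value(prob.objective))
```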