29 results for Traffic estimation.


Relevance: 20.00%

Abstract:

This research was prompted by an interest in the atmospheric processes of hydrogen. It is important to know the sources and sinks of hydrogen, particularly if hydrogen becomes a more common replacement for fossil fuels in combustion. Hydrogen deposition velocities (vd) were estimated by applying chamber measurements, a radon tracer method and a two-dimensional model, and the three approaches were compared with each other to discover the factors affecting the soil uptake rate. A static closed-chamber technique was introduced to determine hydrogen deposition velocities in an urban park in Helsinki and at a rural site at Loppi. Three-day chamber campaigns to estimate soil uptake were carried out at a remote site at Pallas in 2007 and 2008. The atmospheric mixing ratio of molecular hydrogen was also measured continuously in Helsinki in 2007–2008 and at Pallas from 2006 onwards.

The mean vd values measured in the chamber experiments in Helsinki and Loppi were between 0.0 and 0.7 mm s-1. The ranges of the results with the radon tracer method and the two-dimensional model in Helsinki were 0.13–0.93 mm s-1 and 0.12–0.61 mm s-1, respectively. The vd values in the three-day campaigns at Pallas were 0.06–0.52 mm s-1 (chamber) and 0.18–0.52 mm s-1 (radon tracer method and two-dimensional model). At Kumpula, the radon tracer method and the chamber measurements produced higher vd values than the two-dimensional model. The results of all three methods were close to each other between November and April, except for the chamber results from January to March, when the soil was frozen. The vd values of all three methods were compared with one-week cumulative rain sums: precipitation increases soil moisture, which decreases the soil uptake rate. Measurements made during snow seasons showed that a thick snow layer also hindered gas diffusion, lowering the vd values; comparing the H2 vd values with snow depth yielded a decaying exponential fit. During a prolonged drought in summer 2006, soil moisture values were lower than in the other summers between 2005 and 2008, and high chamber vd values were measured under these conditions. The mixing ratio of molecular hydrogen shows a seasonal variation; the lowest atmospheric mixing ratios were found in late autumn, when high deposition velocity values were still being measured.

The carbon monoxide (CO) mixing ratio was also measured. Hydrogen and carbon monoxide are highly correlated in an urban environment due to emissions originating from traffic. After correction for the soil deposition of H2, the slope was 0.49±0.07 ppb (H2) / ppb (CO). Using the corrected hydrogen-to-carbon monoxide ratio, the total hydrogen load emitted by Helsinki traffic in 2007 was estimated at 261 t (H2) a-1. Hydrogen, methane and carbon monoxide are connected with each other through the atmospheric methane oxidation process, in which formaldehyde is produced as an important intermediate; the photochemical degradation of formaldehyde produces hydrogen and carbon monoxide as end products. Examination of back-trajectories revealed long-range transport of carbon monoxide and methane. The trajectories can be grouped by applying cluster and source analysis methods, so that natural and anthropogenic emission sources can be separated by analysing trajectory clusters.
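As an illustration of how a traffic H2 load can be derived from a CO emission inventory and a measured H2/CO mixing-ratio slope (the 0.49 ppb/ppb figure above), the following minimal Python sketch scales a CO inventory by the slope and the molar mass ratio. The CO inventory value is a hypothetical placeholder, not a number taken from the study.

    # Sketch: scaling a CO emission inventory to an H2 load using a measured
    # H2/CO mixing-ratio slope. The CO inventory is a hypothetical placeholder.
    M_H2 = 2.016   # g/mol
    M_CO = 28.010  # g/mol

    slope_ppb = 0.49               # ppb(H2) per ppb(CO), corrected for soil deposition
    co_emissions_t_per_a = 8000.0  # hypothetical traffic CO inventory, t a-1

    # Convert the molar (mixing-ratio) slope into a mass ratio before scaling.
    h2_emissions_t_per_a = co_emissions_t_per_a * slope_ppb * (M_H2 / M_CO)
    print(f"Estimated H2 load: {h2_emissions_t_per_a:.0f} t a-1")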

Relevance: 20.00%

Abstract:

This study examines the properties of Generalised Regression (GREG) estimators for domain class frequencies and proportions. The family of GREG estimators forms the class of design-based, model-assisted estimators, and all GREG estimators utilise auxiliary information via modelling. The classic GREG estimator with a linear fixed-effects assisting model (GREG-lin) is one example, but when estimating class frequencies the study variable is binary or polytomous, so logistic-type assisting models (e.g. logistic or probit models) should be preferred over the linear one. However, GREG estimators other than GREG-lin are rarely used, and knowledge about their properties is limited. This study examines the properties of L-GREG estimators, which are GREG estimators with fixed-effects logistic-type models. Three research questions are addressed. First, I study whether and when L-GREG estimators are more accurate than GREG-lin. Theoretical results and Monte Carlo experiments, covering both equal and unequal probability sampling designs and a wide variety of model formulations, show that in standard situations the difference between L-GREG and GREG-lin is small. In the case of a strong assisting model, however, two interesting situations arise: if the domain sample size is reasonably large, L-GREG is more accurate than GREG-lin, whereas if the domain sample size is very small, estimation of the assisting model parameters may be inaccurate, resulting in bias for L-GREG. Second, I study variance estimation for the L-GREG estimators. The standard variance estimator (S) for all GREG estimators resembles the Sen-Yates-Grundy variance estimator, but it is a double sum of prediction errors rather than of the observed values of the study variable. Monte Carlo experiments show that S underestimates the variance of L-GREG, especially if the domain sample size is small or the assisting model is strong. Third, since the standard variance estimator S often fails for the L-GREG estimators, I propose a new augmented variance estimator (A). The difference between S and A is that the latter takes into account the difference between the sample-fit model and the census-fit model. In Monte Carlo experiments, the new estimator A outperformed the standard estimator S in terms of bias, root mean square error and coverage rate, so it provides a good alternative to the standard estimator.
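To make the construction concrete, here is a minimal Python sketch of a GREG estimator of a domain class frequency assisted by a logistic model, i.e. an L-GREG in the above terminology. It follows the generic model-assisted form (frame-level predictions plus design-weighted sample residuals) under simple random sampling with a single auxiliary variable; the simulated data and model specification are assumptions for illustration, not the thesis's exact estimators.

    # L-GREG sketch: fit a design-weighted logistic assisting model on the
    # sample, sum its predictions over the frame, and add weighted residuals.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Hypothetical frame: auxiliary variable x, domain indicator, binary study variable.
    N = 10_000
    x = rng.normal(size=(N, 1))
    domain = rng.integers(0, 2, size=N).astype(bool)
    y = (rng.random(N) < 1 / (1 + np.exp(-(0.5 + 1.2 * x[:, 0])))).astype(float)

    # Simple random sample without replacement; inclusion probability pi = n/N.
    n = 500
    s = rng.choice(N, size=n, replace=False)
    pi = n / N

    # Study variable for the domain class frequency: y restricted to the domain.
    y_d = y * domain

    # Assisting logistic model fitted on the sample with design weights 1/pi.
    model = LogisticRegression().fit(x[s], y[s], sample_weight=np.full(n, 1 / pi))
    y_hat = model.predict_proba(x)[:, 1] * domain   # predictions over the frame

    # GREG estimator: frame-level predictions plus weighted sample residuals.
    t_lgreg = y_hat.sum() + ((y_d[s] - y_hat[s]) / pi).sum()
    print(f"L-GREG estimate of the domain frequency: {t_lgreg:.1f} (true: {y_d.sum():.0f})")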

Relevance: 20.00%

Abstract:

This paper estimates the extent of income underreporting by the self-employed in Finland using the expenditure-based approach developed by Pissarides & Weber (1989). Household spending data are for the years 1994 to 1996. The results suggest that self-employment income in Finland is underreported by some 27% on average. Since income of the self-employed is about 8% of all income in Finland, the size of this part of the black economy in Finland is estimated to be about 2.3% of GDP.
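For readers unfamiliar with the expenditure-based approach, the sketch below illustrates the Pissarides & Weber idea in Python: log food expenditure is regressed on log reported income and a self-employment dummy, and (assuming employees report truthfully) the dummy coefficient relative to the income elasticity implies an underreporting factor. The simulated data and the simplified factor exp(delta/beta) are illustrative assumptions, not the paper's exact specification.

    # Stylised Pissarides & Weber (1989) sketch on simulated data: the ratio of
    # true to reported self-employment income is recovered as exp(delta / beta).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 2_000
    self_emp = rng.random(n) < 0.1
    true_income = np.exp(rng.normal(10.2, 0.5, n))
    reported = np.where(self_emp, 0.73 * true_income, true_income)  # 27% of income hidden
    log_food = 1.0 + 0.4 * np.log(true_income) + rng.normal(0, 0.2, n)

    X = sm.add_constant(np.column_stack([np.log(reported), self_emp.astype(float)]))
    res = sm.OLS(log_food, X).fit()
    beta, delta = res.params[1], res.params[2]
    k = np.exp(delta / beta)  # implied ratio of true to reported self-employment income
    print(f"Implied true/reported income ratio for the self-employed: {k:.2f}")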

Relevance: 20.00%

Abstract:

This study contributes to the neglect-effect literature by looking at relative trading volume in terms of value. The results for the Swedish market show a significant positive relationship between the accuracy of estimation and the relative trading volume. Market capitalisation and analyst coverage have been used as proxies for neglect in prior studies; these measures, however, do not take into account the effort analysts put in when estimating corporate pre-tax profits. I also find evidence that the industry of the firm influences the accuracy of estimation. In addition, supporting earlier findings, loss-making firms are associated with larger forecasting errors. Further, I find that the average forecast error in Sweden increased in the year 2000.
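A hypothetical sketch of the kind of test this implies is shown below: an analyst forecast error for pre-tax profit is regressed on the log of relative trading volume together with industry and loss dummies. The toy data, column names and error definition are assumptions for illustration, not the study's actual variables or model.

    # Illustrative regression of forecast accuracy on relative trading volume.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical firm-year data: actual and forecast pre-tax profit, value of
    # shares traded relative to market capitalisation, industry classification.
    df = pd.DataFrame({
        "actual": [120.0, -15.0, 48.0, 260.0, 33.0, 9.0],
        "forecast": [110.0, -5.0, 55.0, 240.0, 30.0, 14.0],
        "rel_volume": [0.8, 0.1, 0.4, 1.5, 0.3, 0.05],
        "industry": ["tech", "retail", "tech", "bank", "retail", "bank"],
    })
    df["loss"] = (df["actual"] < 0).astype(int)
    df["abs_error"] = (df["actual"] - df["forecast"]).abs() / df["actual"].abs()

    model = smf.ols("abs_error ~ np.log(rel_volume) + C(industry) + loss", data=df).fit()
    print(model.params)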

Relevance: 20.00%

Abstract:

This study presents a population projection for Namibia for the years 2011–2020. In many countries of sub-Saharan Africa, including Namibia, population growth is still continuing even though fertility rates have declined. However, many of these countries suffer from a large HIV epidemic that is slowing down population growth, and in Namibia the epidemic has been severe. It is therefore important to assess the effect of HIV/AIDS on the future population of Namibia. Demographic research on Namibia has not been very extensive, and population data are not widely available. According to existing studies, fertility has generally been declining and mortality has increased significantly due to AIDS. Previous population projections predict population growth for Namibia in the near future, yet HIV/AIDS will affect future population development. For the projection constructed in this study, population data are taken from the two most recent censuses, from 1991 and 2001. Data on HIV are available from the HIV Sentinel Surveys 1992–2008, which test pregnant women for HIV in antenatal clinics. Additional data are collected from different sources and recent studies. The projection is made with software (EPP and Spectrum) specially designed for developing countries with scarce data. The projection includes two main scenarios with different assumptions concerning the development of the HIV epidemic. In addition, two hypothetical scenarios are made: the first assuming that the HIV epidemic had never existed and the second assuming that HIV treatment had never existed. The results indicate population growth for Namibia. The population in the 2001 census was 1.83 million and is projected to reach 2.38–2.39 million in 2020 in the two main scenarios. Without HIV, the population would be 2.61 million in 2020, and without treatment 2.30 million. The urban population is growing faster than the rural one. Even though AIDS is increasing mortality, past high fertility rates still keep the young adult age groups quite large. The HIV epidemic appears to be slowing down, but it is still increasing the mortality of the working-age population. The initiation of HIV treatment in the public sector in 2004 seems to have affected many projected indicators, diminishing the impact of HIV on the population; for example, the rise in mortality is slowing down.
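As background on the projection machinery (the study itself uses the EPP and Spectrum packages), here is a heavily simplified one-step cohort-component projection in Python with an additive AIDS mortality term. The age groups, rates and the mortality adjustment are hypothetical illustration values, not inputs or outputs of the study.

    # Minimal one-step cohort-component projection with an additive AIDS
    # mortality term; all rates and the 5-year age groups are illustrative.
    import numpy as np

    pop = np.array([260, 240, 220, 180, 150, 120, 90, 60], dtype=float)  # thousands
    survival = np.array([0.97, 0.99, 0.98, 0.97, 0.96, 0.94, 0.90, 0.80])
    aids_mort = np.array([0.01, 0.0, 0.01, 0.03, 0.04, 0.03, 0.02, 0.01])
    fertility = np.array([0.0, 0.0, 0.15, 0.30, 0.25, 0.10, 0.0, 0.0])  # births per 5 years

    def project_5yr(pop):
        """Advance the population by one 5-year step."""
        survived = pop * np.clip(survival - aids_mort, 0.0, 1.0)
        new_pop = np.empty_like(pop)
        new_pop[1:] = survived[:-1]           # age cohorts up one group (open-ended last group omitted)
        new_pop[0] = (pop * fertility).sum()  # births from age-specific fertility
        return new_pop

    print(project_5yr(pop).sum())  # projected total population (thousands)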

Relevance: 20.00%

Abstract:

The purpose of this Master's thesis is to determine the social costs of waste collection and transport in a selected study area in Punavuori, Helsinki. Collection and transport account for a significant share of the total costs of waste management, which is why there is demand for studying and examining these costs. In addition, collection and transport costs may vary greatly depending on urban structure, collection methods and technologies, so a case study makes it possible to determine the area's waste collection and transport costs in detail. The study area in Punavuori, Helsinki, is one of the most densely populated areas in Finland, and waste collection there is complicated by narrow streets, numerous waste rooms located in inner courtyards and busy traffic. Because of these special characteristics, waste collection and transport in the study area cause, in addition to private costs, several externalities, for example in the form of air pollution and reduced amenity. In this thesis, the social costs of waste collection and transport are calculated for four waste fractions by taking into account both private cost factors and the external costs of the resulting emissions. The material consists of literature sources on cost calculations, expert assessments and time studies carried out in the study area. Using a time-based calculation method built on these time studies, the average cost of waste collection and transport was estimated at 73 €/t. However, large differences were observed between waste fractions, with collection and transport costs ranging from 49 to 125 €/t. These differences are largely explained by waste composition, as the per-tonne costs were highest for light, bulky waste fractions. The results, based on the theoretical background and source material, also show that the share of the external costs considered here is vanishingly small compared with the level of private costs.
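As an illustration of the time-based costing logic described above, a small Python sketch follows; the hourly rates, emission factors and unit damage costs are hypothetical placeholders, not the values used in the thesis.

    # Time-based social cost per tonne of collected waste: private costs from the
    # measured collection time and a combined vehicle/crew hourly rate, plus
    # external costs from emissions. All numeric inputs are hypothetical.
    collection_time_h = 1.6        # time spent collecting one load
    hourly_rate_eur = 95.0         # vehicle + crew cost per hour
    load_tonnes = 3.2              # waste collected per load

    fuel_l = 12.0                  # fuel used per load
    emission_factors = {"CO2": 2.66, "NOx": 0.03}   # kg emitted per litre of fuel
    damage_costs = {"CO2": 0.05, "NOx": 10.0}       # EUR of damage per kg emitted

    private_cost = collection_time_h * hourly_rate_eur
    external_cost = sum(fuel_l * emission_factors[p] * damage_costs[p]
                        for p in emission_factors)

    social_cost_per_tonne = (private_cost + external_cost) / load_tonnes
    print(f"Social cost: {social_cost_per_tonne:.0f} EUR/t")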

Relevance: 20.00%

Abstract:

Two methods of pre-harvest inventory were designed and tested on three cutting sites containing a total of 197 500 m3 of wood. The sites were flat-ground boreal forests in northwestern Quebec. Both methods involve scaling the trees harvested to clear the road path one year (or more) prior to harvest of the adjacent cut-blocks. The first method (ROAD) divides the total road right-of-way volume by the total road area cleared; the resulting volume per hectare is then multiplied by the total cut-block area scheduled for harvest during the following year to obtain the total estimated cutting volume. The second method (STRATIFIED) also involves scaling the trees cleared from the road, but the log scaling data are stratified by forest stand location. A volume per hectare is calculated for each stretch of road that crosses a single forest stand, and this volume per hectare is multiplied by the remaining area of the same forest stand scheduled for harvest one year later. The sum of the resulting estimated volumes per stand gives the total estimated cutting volume for all cut-blocks adjacent to the studied road. A third method (MNR), representing the technique currently used to estimate cutting volume in the province of Quebec, was also applied to the sites studied. It sums, over all forest stands, the cut volume estimated by multiplying each stand's area by its volume per hectare obtained from standard stock tables provided by the government. The resulting total estimated volume per cut-block for all three methods was then compared with the actual measured cut-block volume (MEASURED). This analysis revealed a significant difference between the MEASURED and MNR methods, with the MNR volume estimate being 30% higher than MEASURED. No significant difference from MEASURED was observed for the ROAD and STRATIFIED estimates, which were respectively 19% and 5% lower than MEASURED. The ROAD and STRATIFIED methods are therefore good ways to estimate cut-block volumes after road right-of-way harvest under conditions similar to those examined in this study.
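The two proposed methods reduce to simple per-hectare scaling; the Python sketch below makes the difference explicit. The road-scaling volumes, cleared areas and cut-block areas are invented illustration values, not data from the study.

    # Sketch of the ROAD and STRATIFIED pre-harvest volume estimates.
    # Per-stand data: volume scaled on the road segment crossing the stand (m3),
    # road area cleared within the stand (ha), remaining cut-block area (ha).
    stands = [
        {"road_volume": 420.0, "road_area": 3.0, "block_area": 55.0},
        {"road_volume": 180.0, "road_area": 2.0, "block_area": 40.0},
        {"road_volume": 600.0, "road_area": 4.0, "block_area": 70.0},
    ]

    # ROAD: one pooled volume per hectare applied to the total cut-block area.
    total_road_volume = sum(s["road_volume"] for s in stands)
    total_road_area = sum(s["road_area"] for s in stands)
    total_block_area = sum(s["block_area"] for s in stands)
    road_estimate = total_road_volume / total_road_area * total_block_area

    # STRATIFIED: a separate volume per hectare for each stand, applied to that
    # stand's remaining cut-block area, then summed over stands.
    stratified_estimate = sum(
        s["road_volume"] / s["road_area"] * s["block_area"] for s in stands
    )

    print(f"ROAD estimate:       {road_estimate:,.0f} m3")
    print(f"STRATIFIED estimate: {stratified_estimate:,.0f} m3")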

Relevance: 20.00%

Abstract:

The majority of Internet traffic uses the Transmission Control Protocol (TCP) as the transport-level protocol. TCP provides a reliable, ordered byte stream for applications. However, applications such as live video streaming place an emphasis on timeliness over reliability, and a smooth sending rate can be preferable to sharp changes in the sending rate. For these applications TCP is not necessarily suitable, and rate control attempts to address their demands. An important design goal for all rate control mechanisms is TCP friendliness: they should not degrade the performance of TCP, which is still the dominant protocol. Rate control mechanisms are classified into two groups: window-based mechanisms and rate-based mechanisms. Window-based mechanisms increase their sending rate after the successful transfer of a window of packets, similarly to TCP, and typically decrease their sending rate sharply after a packet loss. Rate-based solutions control their sending rate in some other way. A large subset of rate-based solutions, called equation-based solutions, use a control equation which provides an allowed sending rate. Typically these rate-based solutions react more slowly to both packet losses and increases in available bandwidth, making their sending rate smoother than that of window-based solutions. This report contains a survey of rate control mechanisms and a discussion of their relative strengths and weaknesses; a section is dedicated to enhancements in wireless environments. Another topic in the report is bandwidth estimation, which is divided into capacity estimation and available bandwidth estimation. We describe techniques that enable the calculation of a fair sending rate, which can be used to create novel rate control mechanisms.
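As a concrete example of a control equation, the TCP throughput equation used by TFRC (RFC 5348) gives the allowed sending rate from the loss event rate and round-trip time. The sketch below simply evaluates that equation for illustrative parameter values; it is not taken from the report.

    # TCP throughput equation used as the control equation in TFRC (RFC 5348):
    # allowed sending rate X (bytes/s) from segment size s, round-trip time R,
    # loss event rate p and retransmission timeout t_RTO.
    from math import sqrt

    def tfrc_allowed_rate(s, R, p, t_rto=None, b=1):
        """Allowed sending rate in bytes per second."""
        if t_rto is None:
            t_rto = 4 * R  # common simplification used by TFRC
        denom = (R * sqrt(2 * b * p / 3)
                 + t_rto * 3 * sqrt(3 * b * p / 8) * p * (1 + 32 * p ** 2))
        return s / denom

    # Example: 1460-byte segments, 100 ms RTT, 1% loss event rate.
    rate = tfrc_allowed_rate(s=1460, R=0.1, p=0.01)
    print(f"Allowed sending rate: {rate / 1000:.0f} kB/s")

Because the rate changes only gradually as p and R are updated from receiver feedback, the resulting sending rate is smoother than TCP's window-driven behaviour, which is exactly the property these applications want.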

Relevance: 20.00%

Abstract:

In lake-rich regions, gathering information about water quality is challenging because only a small proportion of the lakes can be assessed each year by conventional methods. One technique for improving the spatial and temporal representativeness of lake monitoring is remote sensing from satellites and aircraft. The experimental material included detailed optical measurements in 11 lakes, airborne and spaceborne remote sensing measurements with concurrent field sampling, automatic raft measurements and a national dataset of routine water quality measurements from over 1100 lakes. Analysis of the high-spatial-resolution airborne remote sensing data from eutrophic and mesotrophic lakes showed that one or a few discrete water quality observations from conventional monitoring can yield a clear over- or underestimation of the overall water quality in a lake. Using TM-type satellite instruments in addition to routine monitoring results substantially increases the number of lakes for which water quality information can be obtained. The preliminary results indicated that coloured dissolved organic matter (CDOM) can be estimated with TM-type satellite instruments, which could be utilised as an aid in estimating the role of lakes in global carbon budgets. Based on reflectance modelling and experimental data, the MERIS satellite instrument has optimal or near-optimal channels for the estimation of turbidity, chlorophyll a and CDOM in Finnish lakes. MERIS images with a 300 m spatial resolution can provide water quality information for different parts of large and medium-sized lakes and can fill in the gaps left by conventional monitoring. Algorithms that do not require simultaneous field data for training would increase the amount of remote sensing-based information available for lake monitoring. The MERIS Boreal Lakes processor, trained with the optical data and concentration ranges provided by this study, enabled turbidity estimation with good accuracy without the need for algorithm correction with field measurements, while chlorophyll a and CDOM estimation require further development of the processor. The accuracy of interpreting chlorophyll a via semi-empirical algorithms can be improved by classifying lakes according to their CDOM level and trophic status prior to interpretation. Optical modelling indicated that the spectral diffuse attenuation coefficient can be estimated with reasonable accuracy from the measured water quality concentrations, which provides more detailed information on light attenuation from routine monitoring measurements than is available through Secchi disk transparency. The results of this study improve the interpretation of lake water quality by remote sensing and encourage its use in lake monitoring.
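As an illustration of estimating the spectral diffuse attenuation coefficient from measured concentrations, a common bio-optical simplification sums the contributions of pure water, chlorophyll a, CDOM and suspended matter. The sketch below uses that simplification; the specific absorption and scattering coefficients are hypothetical placeholders, not values derived in the study.

    # Illustrative linear bio-optical model for the diffuse attenuation
    # coefficient Kd(lambda): pure-water term plus concentration-weighted
    # specific coefficients. All coefficient values are hypothetical.
    wavelengths = [490, 560, 665]                      # nm
    kd_water = {490: 0.016, 560: 0.062, 665: 0.43}     # 1/m, pure water
    a_star_chl = {490: 0.035, 560: 0.010, 665: 0.020}  # m2/mg, chlorophyll a
    s_star_cdom = {490: 0.60, 560: 0.25, 665: 0.08}    # per unit a_CDOM(400 nm)
    c_star_tsm = {490: 0.06, 560: 0.05, 665: 0.05}     # m2/g, suspended matter

    def kd(wl, chl, a_cdom400, tsm):
        """Kd(wl) in 1/m from chlorophyll a (mg/m3), CDOM absorption at
        400 nm (1/m) and total suspended matter (g/m3)."""
        return (kd_water[wl] + a_star_chl[wl] * chl
                + s_star_cdom[wl] * a_cdom400 + c_star_tsm[wl] * tsm)

    for wl in wavelengths:
        print(wl, "nm:", round(kd(wl, chl=8.0, a_cdom400=2.5, tsm=3.0), 2), "1/m")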