901 results for Installment schedule
Abstract:
The study seeks to find out whether the real burden of personal taxation has increased or decreased. In order to determine this, we investigate how the same real income has been taxed in different years. Whenever the taxes on the same real income in a given year are higher than in the base year, the real tax burden has increased; if they are lower, the real tax burden has decreased. The study thus seeks to estimate how changes in the tax regulations affect the real tax burden. It should be kept in mind that the progression in the central government income tax schedule ensures that a real change in income will bring about a change in the tax ratio. Inflation will also increase the real tax burden when the tax schedules are kept nominally unchanged. In the calculations of the study it is assumed that real income remains constant, so that we obtain an unbiased measure of the effects of governmental actions in real terms.

The main factors influencing the amount of income taxes an individual must pay are as follows:
- Gross income (income subject to central and local government taxes).
- Deductions from gross income and from taxes calculated according to the tax schedules.
- The central government income tax schedule (progressive income taxation).
- The rates for local taxes and for social security payments (proportional taxation).

In the study we investigate how much a certain group of taxpayers would have paid in taxes according to the actual tax regulations prevailing in different years if their income were kept constant in real terms. Other factors affecting tax liability are kept strictly unchanged (as constants). The resulting taxes, expressed in fixed prices, are then compared to the taxes levied in the base year (hypothetical taxation). The question we are addressing is thus how much tax a certain group of taxpayers with the same socioeconomic characteristics would have paid on the same real income according to the actual tax regulations prevailing in different years. This has been suggested as the main way to measure real changes in taxation, although there are several alternative measures with essentially the same aim.

Next, an aggregate indicator of changes in income tax rates is constructed. It is designed to show how much the taxation of income has increased or decreased, on average, from one year to the next. The main question remains how the aggregation over all income levels should be performed. In order to determine the average real changes in the tax scales, the difference functions (differences between the actual and hypothetical taxation functions) were aggregated using taxable income as weights. Besides the difference functions, the relative changes in real taxes can be used as indicators of change. In this case the ratio between the taxes computed according to the new and the old situation indicates whether taxation has become heavier or lighter. The relative changes in tax scales can be described in a way similar to that used in describing the cost of living, or by means of price indices. For example, we can use Laspeyres' price index formula for computing the ratio between the taxes determined by the new tax scales and those determined by the old tax scales. The formula answers the question: how much more or less will be paid in taxes according to the new tax scales than according to the old ones when the real income situation corresponds to the old situation?

In real terms the central government tax burden experienced a steady decline from its high post-war level up until the mid-1950s.
The real tax burden then drifted upwards until the mid-1970s; the real level of taxation in 1975 was twice that of 1961. In the 1980s the burden remained steady owing to inflation corrections of the tax schedules. In 1989 the tax schedule was lowered drastically, and since the mid-1990s the tax schedules have decreased the real tax burden significantly. Local tax rates have risen continuously from 10 percent in 1948 to nearly 19 percent in 2008. Deductions have lowered the real tax burden, especially in recent years. Aggregate figures indicate how the tax ratio for the same real income has changed over the years according to the prevailing tax regulations. We call the tax ratio calculated in this manner the real income tax ratio; a change in it depicts an increase or decrease in the real tax burden. The real income tax ratio declined for some years after the war. From the beginning of the 1960s to the mid-1970s it nearly doubled. Since the mid-1990s the real income tax ratio has fallen by about 35%.
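The Laspeyres-type comparison mentioned above can be written out explicitly. The display below is a minimal sketch, assuming T_0 and T_1 denote the tax functions implied by the old (base-year) and new tax scales, y_i the constant real incomes of the taxpayer groups, and w_i the taxable-income weights used in the aggregation:

\[
I_L \;=\; \frac{\sum_i w_i \, T_1(y_i)}{\sum_i w_i \, T_0(y_i)}
\]

A value I_L > 1 means that the new scales take more tax out of the same real income, i.e. a heavier real tax burden; I_L < 1 means a lighter one.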
Abstract:
Incremental semantic analysis in a programming environment based on Attribute Grammars is performed by an Incremental Attribute Evaluator (IAE). Current IAEs are either table-driven or make extensive use of graph structures to schedule reevaluation of attributes. A method of compiling an Ordered Attribute Grammar into mutually recursive procedures is proposed. These procedures form an optimal-time Incremental Attribute Evaluator for the attribute grammar that requires neither graphs nor tables.
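As a loose illustration of the compilation idea, and not the paper's actual construction, the sketch below uses a hypothetical toy grammar to show how productions of an attribute grammar can be turned into plain mutually recursive procedures that compute a synthesized value attribute, with no dependency graphs or evaluator tables consulted at run time; the incremental re-evaluation machinery itself is omitted.

```python
# Toy sketch: an attribute grammar "compiled" into mutually recursive procedures.
# Hypothetical grammar: Expr -> Term '+' Expr | Term ; Term -> NUM '*' Term | NUM
# Each nonterminal gets one procedure; the synthesized 'value' attribute is returned.

from dataclasses import dataclass
from typing import Union

@dataclass
class Num:
    value: int

@dataclass
class Mul:
    left: Num
    right: "Term"

Term = Union[Num, Mul]

@dataclass
class Add:
    left: Term
    right: "Expr"

Expr = Union[Term, Add]

def eval_expr(node: Expr) -> int:
    # Procedure generated for the productions of Expr.
    if isinstance(node, Add):
        return eval_term(node.left) + eval_expr(node.right)
    return eval_term(node)

def eval_term(node: Term) -> int:
    # Procedure generated for the productions of Term.
    if isinstance(node, Mul):
        return node.left.value * eval_term(node.right)
    return node.value

if __name__ == "__main__":
    tree = Add(Mul(Num(2), Num(3)), Num(4))   # represents 2*3 + 4
    print(eval_expr(tree))                     # -> 10
```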
Abstract:
Three different types of consistencies, viz., semiweak, weak, and strong, of a read-only transaction in a schedule s of a set T of transactions are defined, and these are compared with the existing notions of consistency of a read-only transaction in a schedule. We present a technique that enables a user to control the consistency of a read-only transaction in heterogeneous locking protocols. Since the weak consistency of a read-only transaction improves concurrency in heterogeneous locking protocols, users can help to improve concurrency in such protocols by supplying the consistency requirements of read-only transactions. A heterogeneous locking protocol P' derived from a locking protocol P that uses exclusive mode locks only and ensures serializability need not be deadlock-free. We present a sufficient condition that ensures the deadlock-freeness of P', when P is deadlock-free and all the read-only transactions in P' are two phase.
Abstract:
This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data, and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most out of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are not only interested in detecting an intruder but also locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating–dominating codes are more appropriate. This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating–dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs – these are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating–dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
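For concreteness, a max-min linear program of the kind referred to above can be stated as follows; this is a generic textbook-style formulation under assumed notation (non-negative matrices A and C), not necessarily the exact variant studied in the thesis:

\[
\text{maximise } \omega \quad \text{subject to} \quad A x \le \mathbf{1}, \quad C x \ge \omega \mathbf{1}, \quad x \ge 0,
\]

where x collects the data-flow variables, the rows of A encode the per-node energy (capacity) constraints, and the rows of C encode the utilities, e.g. per-sink data flows, whose minimum is being maximised.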
Abstract:
Nonstandard hour child care is a subject rarely studied. From an adult's perspective it is commonly associated with a concern for the child's wellbeing. The aim of this study was to view nonstandard hour child care and its everyday routines from the children's perspective. Three research questions were set. The first question dealt with the structuring of the physical environment and time in a kindergarten providing nonstandard hour child care. The second and third questions addressed children's agency and social interaction with adults and peers. The research design was qualitative, and the study was carried out as a case study. Research material was mainly obtained through observation, but interviews, photography and written documents were used as well. The material was analysed by means of content analysis. The study suggests that the physical environment and schedule of a kindergarten providing nonstandard hour child care are similar to those of kindergartens in general. The kindergarten's daily routine enabled children's active agency, especially during free play sessions, for which there was plenty of time. During free play children were able to interact with both adults and peers. Children's individual day care schedules challenged interaction between children. These special features should be considered in developing and planning nonstandard hour child care. In other words, children's agency and opportunities for social interaction should be kept in mind in organising the environment of early childhood education in kindergartens providing nonstandard hour child care.
Abstract:
We consider the problem of minimizing the total completion time on a single batch processing machine. The set of jobs to be scheduled can be partitioned into a number of families, where all jobs in the same family have the same processing time. The machine can process at most B jobs simultaneously as a batch, and the processing time of a batch is equal to the processing time of the longest job in the batch. We analyze the properties of an optimal schedule and develop a dynamic programming algorithm of polynomial time complexity when the number of job families is fixed. The research is motivated by the problem of scheduling burn-in ovens in the semiconductor industry.
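As a small illustration of the objective being minimized, rather than of the dynamic programming algorithm itself, the sketch below (a hypothetical helper, not from the paper) computes the total completion time of a given batch sequence under the stated machine model: each batch takes as long as its longest job, every job in a batch completes when the batch finishes, and a batch holds at most B jobs.

```python
from typing import List

def total_completion_time(batches: List[List[float]], capacity: int) -> float:
    """Total completion time of jobs when 'batches' are processed in the given order."""
    total = 0.0
    clock = 0.0
    for batch in batches:
        if not batch or len(batch) > capacity:
            raise ValueError("each batch must be non-empty and within capacity")
        clock += max(batch)          # batch processing time = longest job in the batch
        total += clock * len(batch)  # every job in the batch completes at 'clock'
    return total

# Example: one family with processing time 2, one with processing time 5, B = 3.
print(total_completion_time([[2, 2, 2], [5, 5]], capacity=3))  # 3*2 + 2*7 = 20
```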
Abstract:
The aim of this thesis was to study the crops currently used for biofuel production from the following aspects: 1. what the average yield per hectare should be to reach an energy balance that is zero or positive, 2. what the shares of the primary and secondary energy flows are in agriculture, transport, processing and usage, and 3. what the overall effects of biofuel crop cultivation, transport, processing and usage are. This thesis concentrated on oilseed rape biodiesel and wheat bioethanol in the European Union, comparing them with competing biofuels, such as corn- and sugarcane-based ethanol, and with the second generation biofuels. The study was carried out by comparing Life Cycle Assessment studies from the EU region and analysing their differences thoroughly. The variables were the following: energy ratio, hectare yield (l/ha), impact on greenhouse gas emissions (particularly CO2), energy consumption in growing and processing one hectare of a particular crop to biofuel, distribution of energy in processing, and effects of the secondary energy flows, such as wheat straw. Processing was found to be the most energy-consuming part in the production of biofuels, so if the raw materials remain the same, the development will happen in processing. First generation biodiesel requires esterification, which consumes approximately one third of the process energy. Around 75% of the energy consumed in manufacturing first generation wheat-based ethanol is spent on steam and electricity generation. No breakthroughs are in sight in the agricultural sector to achieve significantly higher energy ratios. It was found that even in ideal conditions the energy ratio of first generation wheat-based ethanol will remain slightly under 2. For oilseed rape-based biodiesel the energy ratios are better, and energy consumption per hectare is lower than for wheat-based ethanol; both, however, are lower than for e.g. sugarcane-based ethanol, and the hectare yield of wheat-based ethanol is also significantly lower. Biofuels are in a key position when considering the future of the world's transport sector. Uncertainties concerning biofuels are, however, several: the schedule of large-scale introduction to consumer markets, the technologies used, the raw materials and their availability, and, maybe the biggest, the real production capacity in relation to fuel consumption. First generation biofuels have not been the expected answer to environmental problems. The comparisons made show that sugarcane-based ethanol is the most prominent first generation biofuel at the moment, from both the energy and the environmental point of view. Palm oil-based biodiesel also looks promising, although it involves environmental concerns as well. From this point of view the biofuels in this study, wheat-based ethanol and oilseed rape-based biodiesel, are not very competitive options. On the other hand, crops currently used for fuel production in different countries are selected based on several factors, not only on their relative general superiority. It is challenging to make long-term forecasts for the biofuel sector, but it can be said that satisfying the world's current and near-future traffic fuel consumption with biofuels can only be regarded as impossible. This does not mean that biofuels should be rejected and their positive aspects ignored, but maybe this reality helps us to put them in perspective.
To achieve true environmental benefits through the usage of biofuels there must first be a significant drop both in traffic volumes and in overall fuel consumption. Second generation biofuels are coming, but serious questions about their availability and production capacities remain open. Therefore nothing can be taken for granted in this issue, except the need for development.
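The energy ratio and energy balance used above can be made explicit; this is a generic definition, assuming E_out is the energy content of the fuel (plus usable co-products) and E_in the total primary energy spent on cultivation, transport and processing:

\[
R = \frac{E_{\text{out}}}{E_{\text{in}}}, \qquad B = E_{\text{out}} - E_{\text{in}},
\]

so a non-negative energy balance (B ≥ 0) corresponds to an energy ratio R ≥ 1, and an energy ratio slightly under 2 means the fuel returns somewhat less than twice the energy invested in producing it.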
Abstract:
This paper presents an efficient Simulated Annealing approach with a valid solution mechanism for finding an optimum conflict-free transmission schedule for a broadcast radio network. This is known as the Broadcast Scheduling Problem (BSP) and was shown to be NP-complete in earlier studies. Because of this NP-complete nature, earlier studies used genetic algorithms, mean field annealing, neural networks, factor graphs with the sum-product algorithm, and sequential vertex coloring algorithms to obtain solutions. In our study, a valid solution mechanism is included in the simulated annealing. Because of this inclusion, we are able to achieve better results even for networks with 100 nodes and 300 links. The results obtained using our methodology are compared with those of all the earlier solution methods.
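To make the overall approach concrete, the following is a minimal simulated annealing skeleton with a validity hook; the neighbour, energy and make_valid functions are placeholders for whatever conflict-free schedule representation is chosen, so this is a generic sketch rather than the paper's actual mechanism.

```python
import math
import random
from typing import Callable, TypeVar

S = TypeVar("S")

def simulated_annealing(
    initial: S,
    energy: Callable[[S], float],        # e.g. schedule length or negative throughput
    neighbour: Callable[[S], S],         # random local move (swap two time slots, ...)
    make_valid: Callable[[S], S],        # repair hook: enforce a conflict-free schedule
    t_start: float = 10.0,
    t_end: float = 1e-3,
    cooling: float = 0.95,
    moves_per_temp: int = 100,
) -> S:
    current = make_valid(initial)
    best = current
    t = t_start
    while t > t_end:
        for _ in range(moves_per_temp):
            candidate = make_valid(neighbour(current))   # only valid schedules are kept
            delta = energy(candidate) - energy(current)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = candidate
                if energy(current) < energy(best):
                    best = current
        t *= cooling                                     # geometric cooling schedule
    return best
```

The make_valid hook is where a valid solution mechanism would repair or reject candidate schedules that violate the no-conflict constraints of the radio network.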
Abstract:
Management of large projects, especially those in which a major R&D component is involved and which require knowledge from diverse specialised and sophisticated fields, may be classified as a semi-structured problem. In these problems, there is some knowledge about the nature of the work involved, but there are also uncertainties associated with emerging technologies. In order to draw up a plan and schedule of activities for such a large and complex project, the project manager is faced with a host of complex decisions, such as when to start an activity, how long the activity is likely to continue, and so on. An Intelligent Decision Support System (IDSS) that aids the manager in decision making and in drawing up a feasible schedule of activities, while taking into consideration the constraints of resources and time, will have a considerable impact on the efficient management of the project. This report discusses the design of an IDSS that supports the project from the planning phase through the scheduling phase. The IDSS uses a new project scheduling tool, the Project Influence Graph (PIG).
Abstract:
Interactive visualization applications benefit from simplification techniques that generate good-quality coarse meshes from high-resolution meshes that represent the domain. These meshes often contain interesting substructures, called embedded structures, and it is desirable to preserve the topology of the embedded structures during simplification, in addition to preserving the topology of the domain. This paper describes a proof that link conditions, proposed earlier, are sufficient to ensure that edge contractions preserve the topology of the embedded structures and the domain. Excluding two specific configurations, the link conditions are also shown to be necessary for topology preservation. Repeated application of edge contraction on an extended complex produces a coarser representation of the domain and the embedded structures. An extension of the quadric error metric is used to schedule edge contractions, resulting in a good-quality coarse mesh that closely approximates the input domain and the embedded structures.
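The role of the error metric in scheduling contractions can be illustrated with a small priority-queue sketch; the edge_error, link_condition_ok and contract callbacks below are hypothetical placeholders (the quadric computation and the extended-complex link conditions are not reproduced), so this shows only the scheduling pattern rather than the paper's construction.

```python
import heapq
from typing import Callable, Iterable, Tuple

Edge = Tuple[int, int]

def simplify(
    edges: Iterable[Edge],
    edge_error: Callable[[Edge], float],       # e.g. a quadric-error-style cost (stub)
    link_condition_ok: Callable[[Edge], bool], # topology test for domain + embedded structures
    contract: Callable[[Edge], None],          # performs the contraction on the mesh
    target_contractions: int,
) -> int:
    """Greedily contract the cheapest admissible edges; returns how many were contracted."""
    heap = [(edge_error(e), e) for e in edges]
    heapq.heapify(heap)
    done = 0
    while heap and done < target_contractions:
        _, e = heapq.heappop(heap)
        if link_condition_ok(e):   # only topology-preserving contractions are applied
            contract(e)
            done += 1
        # a full implementation would also update the errors of edges incident to e
    return done
```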
Abstract:
In this paper, we look at the problem of scheduling expression trees with reusable registers on delayed load architectures. Reusable registers come into the picture when the compiler has a data-flow analyzer which is able to estimate the extent of use of the registers. Earlier work considered the same problem without allowing for register variables. Subsequently, Venugopal considered non-reusable registers in the tree. We further extend these efforts to consider a much more general form of the tree. We describe an approximate algorithm for the problem. We formally prove that the code schedule produced by this algorithm will, in the worst case, generate one interlock and use just one more register than that used by the optimal schedule. Spilling is minimized. The approximate algorithm is simple and has linear complexity.
Abstract:
A 1.2 V/1.5 Ah positive-limited nickel/metal hydride cell has been studied to determine its charge-discharge characteristics at different rates in conjunction with its AC impedance data. The faradaic efficiency of the cell is found to be maximum at ~70% charge input. The cell has been scaled to a 6 V/1.5 Ah battery. The cycle-life data on the battery suggest that it can sustain a prolonged charge-discharge schedule with little deterioration in its performance.
Abstract:
Background: Duration of seizure by itself is an insufficient criterion for a therapeutically adequate seizure in ECT. Therefore, measures of the seizure EEG other than its duration need to be explored as indices of seizure adequacy and predictors of treatment response. We measured the EEG seizure using a geometrical method, fractal dimension (FD), and examined whether this measure predicted remission. Methods: Data from an efficacy study on melancholic depressives (n = 40) are used for the present exploration. They received thrice or once weekly ECTs, each schedule at one of two energy levels (high or low). FD was computed for the early-, mid- and post-seizure phases of the ictal EEG. The average of the two channels was used for analysis. Results: Two-thirds of the patients (n = 25) were remitted at the end of 2 weeks. As expected, a significantly higher proportion of patients receiving thrice weekly ECT remitted than of patients receiving once weekly ECT. A smaller post-seizure FD at the first ECT was the only variable that predicted remission status after six ECTs. Within the once weekly ECT group too, smaller post-seizure FD was associated with remission status. Conclusions: Post-seizure FD is proposed as a novel measure of seizure adequacy and predictor of treatment response. Clinical implications: Seizure measures at the first ECT may guide selection of the ECT schedule to optimize ECT. Limitations: The study examined short-term antidepressant effects only. The results may not be generalized to medication-resistant depressives. (C) 1999 Elsevier Science B.V. All rights reserved.
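The fractal dimension of a waveform can be estimated in several ways, and the abstract does not specify which estimator was used; as one illustrative possibility only, the sketch below computes Katz's waveform fractal dimension of a one-dimensional signal segment.

```python
import math
from typing import Sequence

def katz_fd(signal: Sequence[float]) -> float:
    """Katz's fractal dimension of a 1-D waveform (e.g. one channel of an EEG segment)."""
    n = len(signal) - 1                      # number of steps along the curve
    if n < 1:
        raise ValueError("need at least two samples")
    # Total curve length: sum of distances between consecutive samples (unit time spacing).
    length = sum(math.hypot(1.0, signal[i + 1] - signal[i]) for i in range(n))
    # Planar extent: maximum distance from the first sample to any other sample.
    extent = max(math.hypot(i, signal[i] - signal[0]) for i in range(1, n + 1))
    return math.log10(n) / (math.log10(n) + math.log10(extent / length))

# Toy usage on a short synthetic segment:
print(katz_fd([0.0, 0.5, -0.3, 0.8, -0.1, 0.4]))
```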
Abstract:
In this paper, power management algorithms for energy harvesting sensors (EHS) that operate purely on energy harvested from the environment are proposed. To maintain energy neutrality, EHS nodes schedule their utilization of the harvested power so as to save energy into, or draw energy from, an inefficient battery during peak or low energy harvesting periods, respectively. Under this constraint, one of the key system design goals is to transmit as much data as possible given the energy harvesting profile. For implementational simplicity, it is assumed that the EHS transmits at a constant data rate with power control, when the channel is sufficiently good. By converting the data rate maximization problem into a convex optimization problem, the optimal load scheduling (power management) algorithm that maximizes the average data rate subject to energy neutrality is derived. Also, the energy storage requirements on the battery for implementing the proposed algorithm are calculated. Further, robust schemes that account for the insufficiency of the battery storage capacity or for errors in the prediction of the harvested power are proposed. The superior performance of the proposed algorithms over conventional scheduling schemes is demonstrated through computations using numerical data from solar energy harvesting databases.
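A stylised version of the underlying optimisation can be written down as follows; this is a generic sketch under assumed notation, not necessarily the exact model of the paper, with slotted time, harvested energies h_t, scheduled transmit energies p_t, and a concave rate function r:

\[
\max_{p_1,\dots,p_T \ge 0} \; \frac{1}{T}\sum_{t=1}^{T} r(p_t)
\qquad \text{subject to} \qquad
\sum_{t=1}^{\tau} p_t \;\le\; \sum_{t=1}^{\tau} h_t \quad \text{for all } \tau \le T,
\]

i.e. the energy consumed up to any time may not exceed the energy harvested up to that time (energy neutrality). With a concave r this is a convex program; a finite battery and charging inefficiency add further constraints of a similar form.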
Abstract:
The integration of different wireless networks, such as GSM and WiFi, into a two-tier hybrid wireless network is increasingly popular and economical. Efficient bandwidth management, call admission control strategies and mobility management are important issues in supporting multiple types of services with different bandwidth requirements in hybrid networks. In particular, bandwidth is a critical commodity because of the type of transactions supported by these hybrid networks, which may have varying bandwidth and time requirements. In this paper, we consider such a problem in a hybrid wireless network installed in a superstore environment and design a bandwidth management algorithm based on a priority-level classification of the incoming transactions. Our scheme uses a downlink transaction scheduling algorithm, which decides how to schedule the outgoing transactions based on their priority level while making efficient use of the available bandwidth. The transaction scheduling algorithm is used to maximize the number of transaction executions. The proposed scheme is simulated in a superstore environment with multiple rooms. The performance results show that the proposed scheme can considerably improve bandwidth utilization by reducing transaction blocking and accommodating more essential transactions at the peak time of the business.
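The priority-driven downlink scheduling idea can be illustrated with a simple greedy sketch; the field names, numbers and admission rule below are illustrative assumptions rather than the paper's algorithm.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass(order=True)
class Transaction:
    priority: int                              # smaller value = more essential transaction
    bandwidth: float = field(compare=False)    # bandwidth units needed in this slot
    name: str = field(compare=False, default="")

def schedule_slot(pending: List[Transaction], capacity: float) -> List[Transaction]:
    """Pick transactions for one downlink slot, highest priority first,
    admitting each one only if it still fits in the remaining bandwidth."""
    chosen: List[Transaction] = []
    remaining = capacity
    for tx in sorted(pending):                 # essential transactions considered first
        if tx.bandwidth <= remaining:
            chosen.append(tx)
            remaining -= tx.bandwidth
    return chosen

# Example: 10 bandwidth units available in the slot.
pending = [
    Transaction(1, 4.0, "payment"),
    Transaction(3, 5.0, "inventory sync"),
    Transaction(2, 7.0, "price update"),
]
print([t.name for t in schedule_slot(pending, capacity=10.0)])  # ['payment', 'inventory sync']
```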