849 results for pigouvian taxes on effort
Abstract:
This analysis examines the gaps in health care financing in Malawi and how foregone taxes could fill them. It begins with an assessment of the disease burden and government health expenditure, then analyses the tax revenues foregone by the government of Malawi through two main routes: illicit financial flows (IFF) out of the country, and tax incentives. We find significant financing gaps in the health sector: government health expenditure for 2013/2014 is United States Dollars (USD) 177 million, projected donor contributions for 2013/2014 are USD 207 million, and the total cost of the minimal health package is USD 535 million. The funding gap between the government budget for health and the spending required to provide the minimal package for 2013/2014 is therefore USD 358 million. On the other hand, we estimate that almost USD 400 million is lost each year through IFF and corporate utilization of tax incentives. The revenues foregone, plus current government health spending, would be sufficient to cover the minimal public health package for all Malawians and would help tackle Malawi's disease burden. Every effort must be made, including improving transparency and revising laws, to curtail IFF and moderate tax incentives.
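The abstract's figures can be checked with a short calculation (illustrative only; all amounts are in USD millions and are taken from the text above):

```python
# Health financing figures for Malawi, 2013/2014 (USD millions), as stated above.
government_spending = 177
donor_contribution = 207
minimal_package_cost = 535

# Gap between the government health budget and the cost of the minimal package.
funding_gap = minimal_package_cost - government_spending
print(funding_gap)  # 358

# Estimated annual revenue foregone through IFF and tax incentives.
revenue_foregone = 400

# Foregone revenue plus current spending would cover the minimal package.
print(revenue_foregone + government_spending >= minimal_package_cost)  # True
```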
Abstract:
In Zimbabwe, the average sweet potato yield (6 t/ha) is low compared with Asian counterparts (17 t/ha). These low yields have been blamed on weevil infestations and viral infections, which account for 60-90% of sweet potato yield losses in Africa. Meristem tip culture, a tissue culture technique initiated by the International Potato Center (CIP), has been widely used to eradicate viruses from clonally propagated crops and is noted as one of the techniques that helped China increase sweet potato yields. In an effort to adopt meristem tip culture for the production of virus-free planting material of a local sweet potato (cv. Brondal), a study was conducted to evaluate the effect of 6-benzylaminopurine (BAP), 1-naphthaleneacetic acid (NAA) and gibberellic acid (GA3), alone or in combination, on cultured Brondal meristems. The hormonal treatments were assessed on the following parameters ten weeks after meristem culture: plantlet regenerative capacity, multiple plantlet production, shoot height, average leaf number per shoot and average node number per shoot. All treatments combining BAP (1 mg/L) with GA3 (at 5, 10 or 20 mg/L) had a significantly (p<0.01) higher plantlet regenerative capacity of 33-66% than the other treatment combinations. Only the treatments 10 mg/L GA3 + 1 mg/L BAP and 20 mg/L GA3 + 1 mg/L BAP induced multiple plantlet formation, producing an average of three and two plantlets per meristem respectively. Overall, the treatment 10 mg/L GA3 + 1 mg/L BAP gave rise to significantly (p<0.01) taller shoots (20 mm) than the rest of the treatments. For average leaf number per shoot, all GA3 treatments (5, 10 or 20 mg/L) supplemented with 1 mg/L BAP gave significantly (p<0.01) more leaves (six leaves/shoot) than the rest of the treatments.
The treatments 10 mg/L GA3 + 1 mg/L BAP and 20 mg/L GA3 + 1 mg/L BAP produced the highest number of nodes per shoot, averaging three nodes per shoot. In sharp contrast, all treatments combining BAP and NAA performed poorly on every parameter tested for plant regeneration of the Brondal sweet potato variety. In conclusion, the best hormonal treatment for culturing Brondal meristems proved to be 10 mg/L GA3 + 1 mg/L BAP.
Abstract:
Part 17: Risk Analysis
Abstract:
An economy of effort is a core characteristic of highly skilled motor performance, often described as effortless or automatic. Electroencephalographic (EEG) evaluation of cortical activity in elite performers has consistently revealed a reduction in extraneous associative cortical activity and an enhancement of task-relevant cortical processes. However, this has only been demonstrated under what are essentially practice-like conditions. Recently it has been shown that cerebral cortical activity becomes less efficient when performance occurs in a stressful, complex social environment. This dissertation examines the impact of motor skill training, or practice, on the EEG cortical dynamics that underlie performance in a stressful, complex social environment. Sixteen ROTC cadets participated in head-to-head pistol shooting competitions before and after completing nine sessions of skill training over three weeks. Spectral power increased in the theta frequency band and decreased in the low-alpha frequency band after skill training. EEG coherence increased in the left frontal region and decreased in the left temporal region after the practice intervention. These findings suggest a refinement of cerebral cortical dynamics, with a reduction of task-extraneous processing in the left frontal region and an enhancement of task-related processing in the left temporal region, consistent with the skill level reached by participants. Partitioning performance into 'best' and 'worst' based on shot score revealed that deliberate practice appears to optimize the cerebral cortical activity of 'best' performances, which were accompanied by an appropriate reduction in task-specific processing reflected by increased high-alpha power, whereas 'worst' performances were characterized by an excessive reduction in task-specific processing, a loss of focus reflected by even higher high-alpha power after training when compared to 'best' performances.
Together, these studies demonstrate the power of experience afforded by practice, as a controllable factor, to promote resilience of cerebral cortical efficiency in complex environments.
Abstract:
Doctorate in Management
Abstract:
Current industry proposals for Hardware Transactional Memory (HTM) focus on best-effort solutions (BE-HTM), where hardware limits are imposed on transactions. These designs may show significant performance degradation under high-contention scenarios and under the hardware and operating system limitations that abort transactions, such as cache overflows and hardware and software exceptions. To deal with these events and to ensure forward progress, BE-HTM systems usually provide a software fallback path that executes a lock-based version of the code. In this paper, we propose a hardware implementation of an irrevocability mechanism as an alternative to the software fallback path, to gain insight into the hardware improvements that could enhance the execution of such a fallback. Our mechanism anticipates the abort that causes the transaction serialization and stalls other transactions in the system so that transactional work loss is minimized. In addition, we evaluate the main software fallback path approaches and propose the use of ticket locks, which hold precise information on the number of transactions waiting to enter the fallback. Thus, the separation of transactional and fallback execution can be achieved in a precise manner. The evaluation is carried out using the Simics/GEMS simulator and the complete range of STAMP transactional suite benchmarks. We obtain significant performance benefits of around a twofold speedup and a 50% abort reduction over the software fallback path for a number of benchmarks.
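The appeal of ticket locks for the fallback path is that the gap between tickets issued and tickets served gives a precise count of waiters. A minimal sketch of that idea (illustrative only; the paper's actual mechanism is implemented in hardware and simulator code, not Python):

```python
import threading

class TicketLock:
    """Minimal ticket lock. The difference between tickets issued and
    tickets served gives the precise number of threads waiting, which can
    be used to separate transactional and fallback execution."""

    def __init__(self):
        self._cond = threading.Condition()
        self._next_ticket = 0   # next ticket to hand out
        self._now_serving = 0   # ticket currently allowed to hold the lock

    def acquire(self):
        with self._cond:
            my_ticket = self._next_ticket
            self._next_ticket += 1
            while self._now_serving != my_ticket:
                self._cond.wait()

    def release(self):
        with self._cond:
            self._now_serving += 1
            self._cond.notify_all()

    def waiters(self):
        """Tickets issued but not yet served, excluding the current holder."""
        with self._cond:
            return max(0, self._next_ticket - self._now_serving - 1)
```

A transaction wanting to enter the fallback would take a ticket; transactions already running could consult `waiters()` to decide whether to stall, which is the precise-information property the paper exploits.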
Abstract:
Property taxes serve as a vital revenue source for local governments. The revenues derived from the property tax function as the primary funding source for a variety of critical local public service systems. Property tax appeal systems serve as quasi-administrative-judicial mechanisms intended to assure the public that property tax assessments are correct, fair, and equitable. Despite these important functions, there is a paucity of empirical research related to property tax appeal systems. This study contributes to property tax literature by identifying who participates in the property tax appeal process and examining their motivations for participation. In addition, the study sought to determine whether patterns of use and success in appeal systems affected the distribution of the tax burden. Data were collected by means of a survey distributed to single-family property owners from two Florida counties. In addition, state and county documents were analyzed to determine appeal patterns and examine the impact on assessment uniformity, over a three-year period. The survey data provided contextual evidence that single-family property owners are not as troubled by property taxes as they are by the conduct of local government officials. The analyses of the decision to appeal indicated that more expensive properties and properties excluded from initial uniformity analyses were more likely to be appealed, while properties with homestead exemptions were less likely to be appealed. The value change analyses indicated that appeals are clustered in certain geographical areas; however, these areas do not always experience a greater percentage of the value changes. Interestingly, professional representation did not increase the probability of obtaining a reduction in value. Other relationships between the variables were discovered, but often with weak predictive ability. Findings from the assessment uniformity analyses were also interesting. 
The results indicated that the appeal mechanisms in both counties improved assessment uniformity. On average, appealed properties exhibited greater horizontal and vertical inequities than non-appealed properties prior to the appeals process. After the appeal process was completed, the indicators of horizontal and vertical equity were largely improved. However, there were some indications of regressivity in the final year of the study.
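Horizontal and vertical equity in assessment ratio studies are conventionally measured with the coefficient of dispersion (COD) and the price-related differential (PRD). The following is an illustrative computation of these standard statistics, not the study's own code:

```python
def coefficient_of_dispersion(ratios):
    """Horizontal equity: average absolute deviation of assessment ratios
    from their median, as a percentage of the median (lower is better)."""
    s = sorted(ratios)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    mean_abs_dev = sum(abs(r - median) for r in ratios) / n
    return 100 * mean_abs_dev / median

def price_related_differential(assessed, sale_prices):
    """Vertical equity: mean ratio divided by the sale-price-weighted mean
    ratio. Values well above 1 suggest regressivity (over-assessment of
    cheaper properties relative to expensive ones)."""
    ratios = [a / p for a, p in zip(assessed, sale_prices)]
    mean_ratio = sum(ratios) / len(ratios)
    weighted_mean_ratio = sum(assessed) / sum(sale_prices)
    return mean_ratio / weighted_mean_ratio
```

An appeal system that improves uniformity should lower the COD of appealed properties and pull the PRD toward 1 after appeals are resolved.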
Abstract:
Our jury system is predicated upon the expectation that jurors engage in systematic processing when considering evidence and making decisions. They are instructed to interpret facts and apply the appropriate law in a fair, dispassionate manner, free of all bias, including that of emotion. However, emotions containing an element of certainty (e.g., anger and happiness, which require little cognitive effort to trace to their source) can often lead people to engage in superficial, heuristic-based processing. Uncertain emotions (e.g., hope and fear, which require people to seek out explanations for their emotional arousal), by contrast, have the potential to lead them to engage in deeper, more systematic processing. The purpose of the current research is in part to confirm past findings (Tiedens & Linton, 2001; Semmler & Brewer, 2002) that uncertain emotions (like fear) can push decision-making towards a more systematic style of processing, whereas more certain emotional states (like anger) lead to a more heuristic style of processing. Studies One, Two, and Three build upon this prior research with the goal of improving methodological rigor, using film clips to reliably induce emotions and awareness of testimonial details as the measure of processing style. The ultimate objective was to explore this effect in Study Four by inducing fear, anger, or neutral emotion in mock jurors, half of whom then followed along with a trial transcript featuring eight testimonial inconsistencies, while the other participants followed along with an error-free version of the same transcript. Overall detection rates for these inconsistencies were expected to be higher for the uncertain/fearful participants, due to their more effortful processing, than for the certain/angry participants.
These expectations were not fulfilled, with significant main effects only for the transcript version (with or without inconsistencies) on overall inconsistency detection rates. There are a number of plausible explanations for these results, so further investigation is needed.
Abstract:
We describe the Joint Effort-Topic (JET) model and the Author Joint Effort-Topic (aJET) model, which estimate the effort required for users to contribute on different topics. We propose to learn word-level effort, taking term preference over time into account, and use it to set the priors of our models. Since no gold standard can easily be built, we evaluate the models by measuring their ability to validate expected behaviours, such as correlations between user contributions and the associated effort.
Abstract:
Security defects are common in large software systems because of their size and complexity. Although efficient development processes, testing, and maintenance policies are applied to software systems, a large number of vulnerabilities can remain despite these measures. Some vulnerabilities stay in a system from one release to the next because they cannot be easily reproduced through testing. These vulnerabilities endanger the security of the systems. We propose vulnerability classification and prediction frameworks based on vulnerability reproducibility. The frameworks are effective in identifying the types and locations of vulnerabilities at an early stage and in improving the security of the software in subsequent versions (referred to as releases). We expand an existing concept of software bug classification to vulnerability classification (easily reproducible versus hard to reproduce) and develop a classification framework that differentiates between these vulnerabilities based on code fixes and textual reports. We then investigate the potential correlations between the vulnerability categories, classical software metrics, and other runtime environmental factors of reproducibility to develop a vulnerability prediction framework. The classification and prediction frameworks help developers adopt corresponding mitigation or elimination actions and develop appropriate test cases. The vulnerability prediction framework also helps security experts focus their effort on the top-ranked vulnerability-prone files. As a result, the frameworks decrease the number of attacks that exploit security vulnerabilities in the next versions of the software. To build the classification and prediction frameworks, different machine learning techniques (C4.5 Decision Tree, Random Forest, Logistic Regression, and Naive Bayes) are employed.
The effectiveness of the proposed frameworks is assessed based on collected software security defects of Mozilla Firefox.
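One of the classifiers named above, Naive Bayes, can be sketched from scratch for the reproducibility classification task. The report texts and labels below are invented for illustration; they are not the study's Mozilla Firefox data:

```python
import math
from collections import Counter, defaultdict

def train_nb(reports):
    """Train a multinomial Naive Bayes model on (text, label) pairs."""
    class_counts = Counter()
    word_counts = defaultdict(Counter)
    vocab = set()
    for text, label in reports:
        class_counts[label] += 1
        for word in text.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def classify(text, model):
    """Pick the label maximizing log prior + Laplace-smoothed likelihood."""
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total_docs)
        total_words = sum(word_counts[label].values())
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1)
                              / (total_words + len(vocab)))
        if score > best_score:
            best, best_score = label, score
    return best

# Hypothetical textual vulnerability reports, labeled by reproducibility.
reports = [
    ("crash reproduced with attached test case", "easy"),
    ("deterministic overflow reproduced on every run", "easy"),
    ("race condition appears intermittently under load", "hard"),
    ("heisenbug timing dependent cannot reproduce reliably", "hard"),
]
model = train_nb(reports)
print(classify("overflow reproduced with test case", model))  # easy
print(classify("intermittent race under heavy load", model))  # hard
```

The study's framework additionally feeds software metrics and runtime environmental factors into the prediction stage; this sketch covers only the textual-report side of the classification.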
Abstract:
As the field of family business has grown immensely over the last couple of decades, a multitude of theories from different fields has been introduced. However, there are surprisingly few attempts to provide an overview of the theories that may be of particular interest to the family business scholar. This introduction chapter therefore gives a critical overview of theoretical perspectives before taking a closer look at the use of theories in family business studies. Regarding the current state of the family business field, the authors argue for putting more effort into building theory from family business research, as well as for a stronger emphasis on 'giving back' to theories borrowed from other fields. Lastly, the chapter describes the development of the book and introduces the 13 chapters and their contributions.
Abstract:
To finance transportation infrastructure and to address the negative social and environmental externalities of road transport, several countries have recently introduced, or are considering, a distance-based tax on trucks. In competitive retail and transportation markets, such a tax can be expected to lower demand and thereby reduce the CO2 emissions of road transport. However, as we show in this paper, such a tax might also slow the transition towards e-tailing. Given that previous research indicates that a consumer switching from brick-and-mortar shopping to e-tailing reduces her CO2 emissions substantially, the direction and magnitude of the tax's net environmental effect is unclear. In this paper, we assess the net effect in a Swedish regional retail market where the tax is not yet in place. We predict the net effect on CO2 emissions to be positive, but offset by about 50% because of a slower transition to e-tailing.
Abstract:
Annual counts of migrating raptors at fixed observation points are a widespread practice, and changes in numbers counted over time, adjusted for survey effort, are commonly used as indices of trends in population size. Unmodeled year-to-year variation in detectability may introduce bias, reduce the precision of trend estimates, and reduce the power to detect trends. We conducted dependent double-observer surveys at the annual fall raptor migration count at Lucky Peak, Idaho, in 2009 and 2010 and applied Huggins closed-capture removal models and information-theoretic model selection to determine the relative importance of factors affecting detectability. The most parsimonious model included effects of observer team identity, distance, species, and day of the season. We then simulated 30 years of counts with heterogeneous individual detectability, a population decline (λ = 0.964), and unexplained random variation in the number of available birds. Imperfect detectability did not bias trend estimation and increased the time required to achieve 80% power by less than 11%. Results suggested that availability is a greater source of variance in annual counts than detectability; thus, efforts to account for availability would improve the monitoring value of migration counts. According to our models, long-term trends in observer efficiency or migratory flight distance may introduce substantial bias to trend estimates. Estimating detectability with a novel count protocol like our double-observer method is just one potential means of controlling such effects. The traditional approach of modeling the effects of covariates and adjusting the index may also be effective if ancillary data are collected consistently.
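The simulation design described above can be sketched in a few lines: a population declining at λ = 0.964 over 30 years, counted with imperfect and variable detectability, with the trend recovered by a log-linear fit. Starting abundance, noise levels, and detection probabilities below are hypothetical, not the study's values:

```python
import math
import random

random.seed(1)

LAMBDA = 0.964   # annual population growth rate (a decline)
YEARS = 30
N0 = 5000        # hypothetical starting abundance

counts = []
for t in range(YEARS):
    # Available birds: deterministic decline times unexplained random variation.
    available = N0 * LAMBDA ** t * random.lognormvariate(0, 0.1)
    # Detectability varies from year to year.
    p_detect = random.uniform(0.6, 0.8)
    # Each available bird is independently detected or missed.
    counts.append(sum(random.random() < p_detect for _ in range(int(available))))

# Ordinary least-squares slope of log(count) on year estimates log(lambda).
xs = list(range(YEARS))
ys = [math.log(c) for c in counts]
xbar, ybar = sum(xs) / YEARS, sum(ys) / YEARS
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
print(round(math.exp(slope), 3))  # estimated lambda, close to 0.964
```

Because detectability enters the counts as year-to-year noise rather than a long-term trend, the slope estimator remains roughly unbiased, which mirrors the study's finding that imperfect detectability alone did not bias trend estimation.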
Abstract:
In this work, fabrication processes for daylight guiding systems based on micromirror arrays are developed, evaluated and optimized. Two different approaches are used: first, nanoimprint lithography is used to fabricate large-area micromirrors by means of Substrate Conformal Imprint Lithography (SCIL); secondly, a new lithography technique is developed that uses a novel bi-layered photomask to fabricate large-area micromirror arrays. The experimental results show a reproducible, stable process with high yield that consumes less material, time, cost and effort.
Abstract:
Audit report on the Lake Panorama Rural Improvement Zone for the year ended June 30, 2016