948 results for statistical methods


Relevance: 60.00%

Publisher:

Abstract:

Most research on tax evasion has focused on the income tax. Sales tax evasion has been largely ignored and dismissed as immaterial. This paper explored the differences between income tax and sales tax evasion and demonstrated that sales tax enforcement deserves and requires different tools to achieve compliance. Specifically, the major enforcement problem with sales tax is not evasion: it is theft perpetrated by companies that act as collection agents for the state. Companies engage in a principal-agent relationship with the state, and many retain funds collected as an agent of the state for private use. As such, the act of sales tax theft bears more resemblance to embezzlement than to income tax evasion. It has long been assumed that the sales tax is nearly evasion free, and state revenue departments report voluntary compliance in a manner that perpetuates this myth. Current sales tax compliance enforcement methodologies are similar in form to income tax compliance enforcement methodologies and are based largely on trust. The primary focus is on delinquent filers, with a very small percentage of businesses subject to audit. As a result, there is a very large group of noncompliant businesses that file on time and fly below the radar while stealing millions of taxpayer dollars. The author used a variety of statistical methods with actual field data derived from operations of the Southern Region Criminal Investigations Unit of the Florida Department of Revenue to evaluate current and proposed sales tax compliance enforcement methodologies in a quasi-experimental, time-series research design and to set forth a typology of sales tax evaders. This study showed that current estimates of voluntary compliance in sales tax systems are seriously and significantly overstated and that current enforcement methodologies are inadequate to identify the majority of violators and enforce compliance. Sales tax evasion is modeled using the theory of planned behavior and Cressey's fraud triangle, and it is demonstrated that proactive enforcement activities, characterized by substantial contact with non-delinquent taxpayers, result in a superior ability to identify noncompliance and provide a structure through which noncompliant businesses can be rehabilitated.

Relevance: 60.00%

Publisher:

Abstract:

Mexico harbors more than 10% of the planet's endemic species. However, the integrity and biodiversity of many ecosystems are experiencing rapid transformation under the influence of a wide array of human and natural disturbances. In order to disentangle the effects of human and natural disturbance regimes at different spatial and temporal scales, we selected six terrestrial (temperate montane forests, montane cloud forests, tropical rain forests, tropical semi-deciduous forests, tropical dry forests, and deserts) and four aquatic (coral reefs, mangrove forests, kelp forests, and saline lakes) ecosystems. We used semiquantitative statistical methods to assess (1) the most important agents of disturbance affecting the ecosystems, (2) the vulnerability of each ecosystem to anthropogenic and natural disturbance, and (3) the differences in ecosystem disturbance regimes and their resilience. Our analysis indicates significant variation in ecological responses, recovery capacity, and resilience among ecosystems. The constant and widespread presence of human impacts on both terrestrial and aquatic ecosystems is reflected either in reduced area coverage for most systems or in reduced productivity and biodiversity, particularly in the case of fragile ecosystems (e.g., rain forests, coral reefs). In all cases, the interaction between historical human impacts and episodic high-intensity natural disturbance (e.g., hurricanes, fires) has triggered a reduction in species diversity and induced significant changes in habitat distribution or species dominance. The lack of monitoring programs assessing before/after effects of major disturbances in Mexico is one of the major limitations to quantifying the commonalities and differences of disturbance effects on ecosystem properties.

Relevance: 60.00%

Publisher:

Abstract:

This study examined two competing hypotheses, the double jeopardy hypothesis and the buffering effect hypothesis, on whether parental divorce affects adopted children and non-adopted children similarly or differently. The double jeopardy hypothesis suggests that when adopted children experience their parents' divorce, they fare worse because they carry two risk factors, adoption status and parental divorce, while their non-adopted counterparts carry only the risk factor of their parents' divorce. The buffering effect hypothesis suggests that adopted children's previous experience of parental loss helps them better cope with the later loss of their parents' divorce, so their adoption status is a protective factor rather than a risk factor. Secondary analyses of a nationwide data set were executed using different statistical methods, such as ANOVA and chi-square tests, on different outcome variables. The results indicated no evidence supporting the double jeopardy hypothesis: adopted children from divorced families did not perform significantly worse than non-adopted children from divorced families on any outcome variable. The results also indicated only weak evidence supporting the buffering effect hypothesis. The general conclusion, based on the results from most of the outcome variables, is that adopted children from divorced families do not perform differently from biological children from divorced families.

Relevance: 60.00%

Publisher:

Abstract:

In the wake of the "9-11" terrorist attacks, the U.S. Government has turned to information technology (IT) to address a lack of information sharing among law enforcement agencies. This research determined if and how information-sharing technology helps law enforcement by examining the differences in the perceived value of IT between law enforcement officers who have access to automated regional information sharing and those who do not. It also examined the effect of potential intervening variables, such as user characteristics, training, and experience, on the officers' evaluation of IT. The sample was limited to 588 officers from two sheriff's offices; one of them (the study group) uses information-sharing technology, the other (the comparison group) does not. Triangulated methodologies included surveys, interviews, direct observation, and a review of agency records. Data analysis involved the following statistical methods: descriptive statistics, chi-square, factor analysis, principal component analysis, Cronbach's alpha, Mann-Whitney tests, analysis of variance (ANOVA), and Scheffé post hoc analysis. Results indicated a significant difference between groups: the study group perceived information-sharing technology as a greater factor in solving crime and in increasing officer productivity. The study group was also more satisfied with the data available to it. As to the number of arrests made, information-sharing technology did not make a difference. Analysis of the potential intervening variables revealed several remarkable results. The presence of a strong performance management imperative (in the comparison sheriff's office) appeared to be a factor in case clearances and arrests, technology notwithstanding. As to the influence of user characteristics, level of education did not influence a user's satisfaction with technology, but user-satisfaction scores differed significantly by years of experience as a law enforcement officer and by amount of computer training, suggesting a significant but weak relationship. This study therefore finds that information-sharing technology assists law enforcement officers in doing their jobs. It also suggests that other variables, such as computer training, experience, and management climate, should be accounted for when assessing the impact of information technology.
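
As a small illustration of one of the tests listed above (a sketch, not taken from the study itself), the snippet below runs a Mann-Whitney comparison of user-satisfaction scores between two groups using SciPy; the score arrays are invented placeholders.

```python
# Hedged sketch: nonparametric comparison of satisfaction scores between the
# study and comparison groups. The score arrays below are invented placeholders.
from scipy import stats

scores_study = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]          # hypothetical Likert scores
scores_comparison = [3, 3, 4, 2, 3, 4, 3, 2, 3, 4]

stat, p_value = stats.mannwhitneyu(scores_study, scores_comparison,
                                   alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.4f}")
```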

Relevance: 60.00%

Publisher:

Abstract:

The purpose of this study is to identify research trends in Merger and Acquisition (M&A) waves in the restaurant industry and to propose future research directions by thoroughly reviewing the existing M&A-related literature. M&A has been used extensively as a strategic management tool for fast growth in the restaurant industry. However, there is very limited literature that focuses on M&A in the restaurant industry. In particular, no known study has examined M&A waves and their determinants. A good understanding of the determinants of M&A waves will help practitioners identify important factors that should be considered before making M&A decisions and predict the optimal timing for successful M&A transactions. This study examined the literature on six U.S. M&A waves and their determinants and summarized the main explanatory factors examined, statistical methods, and theoretical frameworks. Inclusion of macroeconomic factors unique to the restaurant industry and the use of factor analysis are suggested for future research.

Relevance: 60.00%

Publisher:

Abstract:

The multiple linear regression model plays a key role in statistical inference and has extensive applications in business, environmental, physical, and social sciences. Multicollinearity has been a considerable problem in multiple regression analysis: when the regressor variables are multicollinear, it becomes difficult to make precise statistical inferences about the regression coefficients. Several statistical methods can be used to address this; those discussed in this thesis are the ridge regression, Liu, two-parameter biased, and LASSO estimators. First, an analytical comparison on the basis of risk was made among the ridge, Liu, and LASSO estimators under an orthonormal regression model. I found that LASSO dominates the least squares, ridge, and Liu estimators over a significant portion of the parameter space for large dimensions. Second, a simulation study was conducted to compare the performance of the ridge, Liu, and two-parameter biased estimators by the mean squared error criterion. I found that the two-parameter biased estimator performs better than its corresponding ridge regression estimator. Overall, the Liu estimator performs better than both the ridge and two-parameter biased estimators.
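
As an illustration of this kind of estimator comparison (a sketch, not the thesis's simulation design), the snippet below contrasts the coefficient mean squared error of OLS, ridge, and LASSO under an equicorrelated, multicollinear design; the sample size, correlation level, penalties, and true coefficients are arbitrary assumptions. The Liu and two-parameter biased estimators are omitted because they are not available off the shelf in scikit-learn.

```python
# Hedged sketch: compare estimator MSE for the coefficient vector under
# multicollinearity. All settings (n, rho, alphas, beta) are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso

rng = np.random.default_rng(0)
n, p, rho = 200, 5, 0.95                  # sample size, predictors, collinearity
beta = np.array([1.0, 0.5, 0.0, 0.0, 2.0])
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)   # equicorrelated design

def coef_mse(model, reps=500):
    errs = []
    for _ in range(reps):
        X = rng.multivariate_normal(np.zeros(p), cov, size=n)
        y = X @ beta + rng.normal(scale=1.0, size=n)
        b_hat = model.fit(X, y).coef_
        errs.append(np.sum((b_hat - beta) ** 2))
    return np.mean(errs)

for name, m in [("OLS", LinearRegression()),
                ("Ridge(alpha=1)", Ridge(alpha=1.0)),
                ("LASSO(alpha=0.05)", Lasso(alpha=0.05))]:
    print(f"{name:>18}: MSE(beta_hat) = {coef_mse(m):.3f}")
```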

Relevance: 60.00%

Publisher:

Abstract:

This research aimed to analyse the effect of different territorial divisions on the random fluctuation of socio-economic indicators related to social determinants of health. It is an ecological study combining statistical methods for individual-level and aggregate data analysis, using five databases derived from the 2010 Brazilian demographic census database (overall results of the sample by weighting area). These data were grouped into the following levels: households; weighting areas; cities; Immediate Urban Associated Regions; and Intermediate Urban Associated Regions. A theoretical model related to social determinants of health was used, with Household with death as the dependent variable and the following independent variables: Black race; Income; Non-attendance at childcare or school; Illiteracy; and Low schooling. The data were analysed with individual-level Poisson regression, multilevel Poisson regression, and multiple linear regression, in light of the theoretical framework of the area. A greater proportion of households with deaths was identified among those with at least one resident who was black, lower-income, illiterate, not attending (or never having attended) school or day-care, or less educated. The adjusted model showed that the highest adjusted prevalence ratio was related to Income, with a value of 1.33 for households with at least one resident whose average personal income was below R$ 655.00 (Brazilian currency). The multilevel analysis demonstrated a context effect when the variables were subjected to area effects: the random effects were significant for all models, with higher coefficients in the territorial divisions of smaller dimensions (weighting areas, 0.035; cities, 0.024). The ecological analyses showed that the variables Income and Low schooling had explanatory potential for the outcome in all models, with Income having greater power to determine household deaths, especially in the models for the Immediate Urban Associated Regions (standardized coefficient of -0.616) and the Intermediate Urban Associated Regions (standardized coefficient of -0.618). It was concluded that there was a context effect on the random fluctuation of the socio-economic indicators related to social determinants of health. This effect was explained by the characteristics of the territorial divisions and of the individuals who live or work there. Context effects were better identified in the areas with smaller dimensions, which are more favourable for explaining phenomena related to social determinants of health, especially in studies of societies marked by social inequalities. Composition effects were better identified in the Regions of Urban Articulation, shaped through mechanisms similar to the phenomenon under study.
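
For illustration only (not the study's models or data), the sketch below fits a household-level Poisson regression with statsmodels on a synthetic data frame; the variable names echo those described above, but the codings, effect sizes, and sample are invented.

```python
# Hedged sketch: a household-level Poisson model of the outcome described above.
# The data frame is synthetic; the variable names echo the study, but all
# codings and coefficients are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "black_resident": rng.integers(0, 2, n),
    "low_income": rng.integers(0, 2, n),
    "low_schooling": rng.integers(0, 2, n),
})
rate = np.exp(-3.0 + 0.3 * df["low_income"] + 0.15 * df["low_schooling"])
df["household_death"] = rng.poisson(rate)

fit = smf.poisson(
    "household_death ~ black_resident + low_income + low_schooling", data=df
).fit()
print(np.exp(fit.params))        # exponentiated coefficients ≈ prevalence ratios
```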

Relevance: 60.00%

Publisher:

Abstract:

During the 2007-2008 austral spring season, the ANDRILL (Antarctic Drilling Project) Southern McMurdo Sound Project recovered a 1138-m-long core representing the last 20 m.y. of glacial history. An extensive downhole logging program was successfully carried out. Due to drill hole conditions, logs were collected in several passes from the total depth at 1138.54 m below seafloor (mbsf) up to 230 mbsf. After data correction, several statistical methods, such as factor analysis, cluster analysis, box-and-whisker diagrams, and cross-plots, were applied. The aim of these analyses was to use detailed interpretation of the downhole logs to obtain a description of the lithologies and their specific physical properties that is independent of the core descriptions. The sediments were grouped into three main facies, diamictite, mudstone and/or siltstone, and sandstone, and the physical properties of each were determined. Notable findings include the high natural radioactivity values in sandstone and the high and low magnetic susceptibility values in mudstone and/or siltstone and in sandstone, respectively. A modified lithology cluster column was produced on the basis of the downhole logs and statistical analyses. The uranium content in the downhole logs could be used to detect hiatuses and thus place the estimated hiatuses more accurately. Using analyses from the current literature (geochemistry, clasts, and clay minerals) in combination with the downhole logs (cluster analysis), the depths 225 mbsf, 650 mbsf, 775 mbsf, and 900 mbsf were identified as boundaries of change in sediment composition, provenance, and/or environmental conditions. The main use of log interpretation is the exact definition of lithological boundaries and the modification of the paleoenvironmental interpretation.
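
As a generic example of this style of workflow (not the project's actual processing chain), the sketch below standardizes a synthetic matrix of downhole-log measurements, reduces it with factor analysis, and groups depth intervals with k-means; the variables, factor count, and cluster count are assumptions.

```python
# Hedged sketch: group depth intervals from downhole logs into lithology-like
# clusters. The log matrix here is synthetic; real variable names, the number
# of factors, and the cluster count are all assumptions for illustration.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Placeholder for measurements such as natural gamma, magnetic susceptibility,
# resistivity, and sonic velocity sampled regularly downhole.
logs = rng.normal(size=(2000, 4))

X = StandardScaler().fit_transform(logs)
scores = FactorAnalysis(n_components=2, random_state=0).fit_transform(X)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

for c in range(3):
    print(f"cluster {c}: mean log response {X[clusters == c].mean(axis=0).round(2)}")
```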

Relevance: 60.00%

Publisher:

Abstract:

This study subdivides the Weddell Sea, Antarctica, into seafloor regions using multivariate statistical methods. These regions serve as categories for comparing, contrasting, and quantifying biogeochemical processes and biodiversity between ocean regions, both geographically and for regions under development within the scope of global change. The division obtained is characterized by its dominant components and interpreted in terms of the prevailing environmental conditions. The analysis uses 28 environmental variables for the sea surface, 25 variables for the seabed, and 9 variables for the analysis between surface and bottom variables. The data were collected during the years 1983-2013, and some data were interpolated. The statistical errors of several interpolation methods (e.g., IDW, Indicator, Ordinary, and Co-Kriging) with varying settings were compared to identify the most reasonable method. The multivariate procedures used are regionalized classification via k-means cluster analysis, canonical-correlation analysis, and multidimensional scaling. Canonical-correlation analysis identifies the influencing factors in the different parts of the study area. Several methods for identifying the optimum number of clusters were tested. For the seabed, 8 and 12 clusters were identified as reasonable numbers for clustering the Weddell Sea; for the sea surface, 8 and 13; and for the top/bottom analysis, 8 and 3, respectively. Additionally, the results for 20 clusters are presented for the three alternatives, offering the first small-scale environmental regionalization of the Weddell Sea. In particular, the 12-cluster results identify marine-influenced regions that can be clearly separated from those determined by the geological catchment area and from those dominated by river discharge.
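
A minimal sketch of the clustering step, assuming a standardized matrix of environmental variables per grid cell (synthetic here): the silhouette score is used to pick a cluster count and k-means assigns the regions. This is only one of several criteria that could be used to choose the number of clusters.

```python
# Hedged sketch: choose a cluster count for gridded environmental variables by
# silhouette score, then label grid cells with k-means. `env` is a placeholder:
# rows = grid cells, columns = standardized environmental variables.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
env = rng.normal(size=(500, 25))          # stand-in for the real variable matrix

scores = {}
for k in range(2, 15):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(env)
    scores[k] = silhouette_score(env, labels)

best_k = max(scores, key=scores.get)
regions = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(env)
print(f"silhouette-preferred k = {best_k}")
```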

Relevance: 60.00%

Publisher:

Abstract:

Based on the quantitative study of diatoms and radiolarians, summer sea-surface temperatures (SSST) and sea ice distribution were estimated from 122 sediment core localities in the Atlantic, Indian, and Pacific sectors of the Southern Ocean to reconstruct the last glacial environment at the EPILOG (19.5-16.0 ka, or 23,000-19,000 cal. yr B.P.) time slice. The statistical methods applied include the Imbrie and Kipp Method, the Modern Analog Technique, and the Generalized Additive Model. Summer SSTs reveal greater surface-water cooling than reconstructed by CLIMAP (Geol. Soc. Am. Map Chart Ser. MC-36 (1981) 1), reaching a maximum (4-5 °C) in the present Subantarctic Zone of the Atlantic and Indian sectors. The reconstruction of maximum winter sea ice (WSI) extent is in accordance with CLIMAP, showing an expansion of the WSI field by around 100% compared to the present. Although only limited information is available, the data clearly show that CLIMAP strongly overestimated the glacial summer sea ice extent. As a result of the northward expansion of Antarctic cold waters by 5-10° in latitude and a relatively small displacement of the Subtropical Front, thermal gradients were steepened during the last glacial in the northern zone of the Southern Ocean. Such a reconstruction may, however, be inapposite for the Pacific sector: the few data available indicate reduced cooling in the southern Pacific and suggest a non-uniform cooling of the glacial Southern Ocean.
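
As an illustration of the Modern Analog Technique in its simplest form (a sketch, not the reconstruction actually used), the snippet below matches each fossil assemblage to its nearest modern core-top analogs by chord distance and averages their summer SSTs; all assemblage and temperature data are synthetic placeholders.

```python
# Hedged sketch: a simplified Modern Analog Technique. For each fossil sample,
# find the modern core-top assemblages with the smallest chord distance
# (Euclidean distance on square-root-transformed relative abundances) and take
# an inverse-distance-weighted mean of their modern summer SSTs.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
modern_counts = rng.dirichlet(np.ones(30), size=400)   # synthetic modern assemblages
modern_sst = rng.uniform(-1.5, 18.0, size=400)         # synthetic modern SSSTs (°C)
fossil_counts = rng.dirichlet(np.ones(30), size=122)   # synthetic down-core samples

def mat_sst(n_analogs=5):
    nn = NearestNeighbors(n_neighbors=n_analogs).fit(np.sqrt(modern_counts))
    dist, idx = nn.kneighbors(np.sqrt(fossil_counts))
    weights = 1.0 / np.maximum(dist, 1e-6)              # inverse-distance weights
    return (weights * modern_sst[idx]).sum(axis=1) / weights.sum(axis=1)

print(mat_sst()[:10].round(2))                          # estimated SSSTs (°C)
```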

Relevance: 60.00%

Publisher:

Abstract:

This dissertation investigates, based on Post-Keynesian theory and its concept of a monetary economy of production, the exchange rate behavior of the Brazilian Real in the presence of the Brazilian Central Bank's interventions through the so-called swap transactions over 2002-2015. Initially, the work analyzes the essential properties of an open monetary economy of production; thereafter, it presents the basic propositions of the Post-Keynesian view on exchange rate determination, highlighting the properties of foreign exchange markets and the peculiarities of Brazil's position in the international monetary and financial system. The research thereby accounts for the various segments of the Brazilian foreign exchange market. To accomplish its purpose, we first review the Post-Keynesian literature on the topic. Then, we undertake empirical examinations of exchange rate determination using two statistical methods. On the one hand, to measure the volatility of the exchange rate, we estimate Autoregressive Conditional Heteroskedasticity (ARCH) and Generalized Autoregressive Conditional Heteroskedasticity (GARCH) models. On the other hand, to measure the variance of the exchange rate in relation to real and financial variables and the swaps, we estimate a Vector Autoregression (VAR) model. Both experiments are performed for the nominal and real effective exchange rates. The results show that the swaps respond to exchange rate movements, trying to offset its volatility. This reveals that the exchange rate is, at least to a certain degree, sensitive to the swap transactions conducted by the Central Bank. In addition, the real effective exchange rate responds more to the swap auctions than the nominal rate does.
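
For illustration (not the dissertation's estimation), the sketch below fits a GARCH(1,1) model to simulated exchange rate returns with the arch package and a small VAR linking returns to a swap-stock series with statsmodels; the series, scaling, and lag selection are stand-ins.

```python
# Hedged sketch: a GARCH(1,1) model of exchange rate returns and a small VAR
# linking returns to the swap stock. The series are simulated stand-ins; the
# lag selection and scaling are illustrative, not the dissertation's setup.
import numpy as np
import pandas as pd
from arch import arch_model
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 1000
fx_ret = pd.Series(rng.normal(0, 1.0, n), name="fx_ret")     # % returns proxy
swaps = pd.Series(rng.normal(0, 1.0, n), name="swaps")       # swap stock proxy

garch = arch_model(fx_ret, vol="GARCH", p=1, q=1).fit(disp="off")
print(garch.params)                                          # omega, alpha, beta

var_res = VAR(pd.concat([fx_ret, swaps], axis=1)).fit(maxlags=8, ic="aic")
print(var_res.summary())
```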

Relevance: 60.00%

Publisher:

Abstract:

We would like to thank all NHS Consultant Colleagues at Aberdeen Royal Infirmary for their help with the prompt recruitment of these patients (Dr M Metcalfe, Dr AD Stewart, Dr A Hannah, Dr A Noman, Dr P Broadhurst, Dr D Hogg, Dr D Garg), and Dr Gordon Prescott for his help and advice with the statistical methods.

Relevance: 60.00%

Publisher:

Abstract:

In 2004, the National Institutes of Health made available the Patient-Reported Outcomes Measurement Information System (PROMIS®), which consists of innovative item banks for health assessment. It is based on classical, reliable Patient-Reported Outcomes (PROs) and includes advanced statistical methods, such as Item Response Theory and Computerized Adaptive Testing. One of the PROMIS® domain frameworks is Physical Function, whose item bank needs to be translated and culturally adapted so it can be used in Portuguese-speaking countries. This work aimed to translate and culturally adapt the PROMIS® Physical Function item bank into Portuguese. The FACIT (Functional Assessment of Chronic Illness Therapy) translation methodology, which consists of eight stages of translation and cultural adaptation, was used. Fifty subjects over the age of 18 years participated in the pre-test (seventh stage). The questionnaire was answered by the participants (self-reported questionnaires) using a think-aloud protocol and cognitive and retrospective interviews. In the FACIT methodology, adaptations can be made from the beginning of the translation and cultural adaptation process, ensuring the semantic, conceptual, cultural, and operational equivalences of the Physical Function domain. During the pre-test, 24% of the subjects had difficulties understanding the items, and 22% suggested changes to improve understanding. The terms and concepts of the items were fully understood (100%) for 87% of the items. Only four items had less than 80% understanding; for this reason, it was necessary to change them so that they corresponded to the original items and were understood by the subjects after retesting. The process of translation and cultural adaptation of the PROMIS® Physical Function item bank into Portuguese was successful. This version of the assessment tool must have its psychometric properties validated before being made available for clinical use.

Relevance: 60.00%

Publisher:

Abstract:

This work explores the use of statistical methods in describing and estimating camera poses, as well as the information feedback loop between camera pose and object detection. Surging development in robotics and computer vision has pushed the need for algorithms that infer, understand, and utilize information about the position and orientation of the sensor platforms when observing and/or interacting with their environment.

The first contribution of this thesis is the development of a set of statistical tools for representing and estimating the uncertainty in object poses. A distribution for representing the joint uncertainty over multiple object positions and orientations is described, called the mirrored normal-Bingham distribution. This distribution generalizes both the normal distribution in Euclidean space, and the Bingham distribution on the unit hypersphere. It is shown to inherit many of the convenient properties of these special cases: it is the maximum-entropy distribution with fixed second moment, and there is a generalized Laplace approximation whose result is the mirrored normal-Bingham distribution. This distribution and approximation method are demonstrated by deriving the analytical approximation to the wrapped-normal distribution. Further, it is shown how these tools can be used to represent the uncertainty in the result of a bundle adjustment problem.

Another application of these methods is illustrated as part of a novel camera pose estimation algorithm based on object detections. The autocalibration task is formulated as a bundle adjustment problem using prior distributions over the 3D points to enforce the objects' structure and their relationship with the scene geometry. This framework is very flexible and enables the use of off-the-shelf computational tools to solve specialized autocalibration problems. Its performance is evaluated using a pedestrian detector to provide head and foot location observations, and it proves much faster and potentially more accurate than existing methods.

Finally, the information feedback loop between object detection and camera pose estimation is closed by utilizing camera pose information to improve object detection in scenarios with significant perspective warping. Methods are presented that allow the inverse perspective mapping traditionally applied to images to be applied instead to features computed from those images. For the special case of HOG-like features, which are used by many modern object detection systems, these methods are shown to provide substantial performance benefits over unadapted detectors while achieving real-time frame rates, orders of magnitude faster than comparable image warping methods.
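
As a point of reference (not the dissertation's method), the sketch below shows the classic image-space inverse perspective mapping with OpenCV; the thesis instead applies the analogous warp to features such as HOG cells, and the source quadrilateral and output size here are arbitrary assumptions.

```python
# Hedged sketch: classic image-space inverse perspective mapping (a bird's-eye
# ground-plane view) with OpenCV. The dissertation applies the analogous warp
# to computed features rather than to pixels; the source quadrilateral and
# output size below are arbitrary assumptions.
import cv2
import numpy as np

img = np.zeros((720, 1280, 3), dtype=np.uint8)         # placeholder camera frame
src = np.float32([[420, 360], [860, 360],              # far-left, far-right
                  [1180, 720], [100, 720]])            # near-right, near-left
dst = np.float32([[0, 0], [400, 0], [400, 600], [0, 600]])

H = cv2.getPerspectiveTransform(src, dst)              # ground-plane homography
birds_eye = cv2.warpPerspective(img, H, (400, 600))    # dsize = (width, height)
print(birds_eye.shape)                                 # (600, 400, 3)
```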

The statistical tools and algorithms presented here are especially promising for mobile cameras, providing the ability to autocalibrate and adapt to the camera pose in real time. In addition, these methods have wide-ranging potential applications in diverse areas of computer vision, robotics, and imaging.

Relevance: 60.00%

Publisher:

Abstract:

The dissertation consists of three chapters related to the low-price guarantee marketing strategy and energy efficiency analysis. A low-price guarantee is a marketing strategy in which firms promise to charge consumers the lowest price among their competitors. Chapter 1 addresses the research question "Does a Low-Price Guarantee Induce Lower Prices?" by looking into the retail gasoline industry in Quebec, where a major branded firm introduced a low-price guarantee in 1996. Chapter 2 conducts a consumer welfare analysis of low-price guarantees to derive policy implications and offers a new explanation of firms' incentives to adopt a low-price guarantee. Chapter 3 develops energy performance indicators (EPIs) to measure the energy efficiency of manufacturing plants in the pulp, paper, and paperboard industry.

Chapter 1 revisits the traditional view that a low-price guarantee results in higher prices by facilitating collusion. Using accurate market definitions and station-level data from the retail gasoline industry in Quebec, I conduct a descriptive analysis based on stations and price zones to compare price and sales movements before and after the guarantee was adopted. I find that, contrary to the traditional view, the stores that offered the guarantee significantly decreased their prices and increased their sales. I also build a difference-in-differences model to quantify the decrease in the posted price of the stores that offered the guarantee at 0.7 cents per liter. While this change is significant, I do not find the response in competitors' prices to be significant. The sales of the stores that offered the guarantee increased significantly while the competitors' sales decreased significantly; however, the significance vanishes if I use station-clustered standard errors. Comparing my observations with the predictions of different theories of low-price guarantees, I conclude that the empirical evidence supports the view that the low-price guarantee is a simple commitment device and induces lower prices.
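
For illustration (not the chapter's actual specification or data), the sketch below estimates a two-group, two-period difference-in-differences regression for posted prices with station-clustered standard errors in statsmodels; the panel is simulated, with the treatment effect set to mirror the 0.7 cents per liter reported above.

```python
# Hedged sketch: difference-in-differences for posted prices with standard
# errors clustered by station. The panel is simulated; column names are
# placeholders, and the -0.7 effect mirrors the magnitude reported above.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
stations = np.arange(200)
panel = pd.DataFrame({
    "station_id": np.repeat(stations, 2),
    "post": np.tile([0, 1], 200),
    "guarantee": np.repeat(rng.integers(0, 2, 200), 2),
})
panel["price_cents"] = (100 + 2 * panel["post"]
                        - 0.7 * panel["guarantee"] * panel["post"]
                        + rng.normal(0, 1, len(panel)))

did = smf.ols("price_cents ~ guarantee * post", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["station_id"]}
)
print(did.params["guarantee:post"])                     # DiD estimate of the effect
```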

Chapter 2 conducts a consumer welfare analysis of low-price guarantees to address antitrust concerns and potential regulation by the government, and explains firms' potential incentives to adopt a low-price guarantee. Using station-level data from the retail gasoline industry in Quebec, I estimate consumers' demand for gasoline with a structural model of spatial competition that incorporates the low-price guarantee as a commitment device, which allows firms to pre-commit to charging the lowest price among their competitors. The counterfactual analysis under a Bertrand competition setting shows that the stores that offered the guarantee attracted many more consumers and decreased their posted price by 0.6 cents per liter. Although the matching stores suffered a decrease in profits from gasoline sales, they are incentivized to adopt the low-price guarantee to attract more consumers to the store, likely increasing profits at attached convenience stores. Firms have strong incentives to adopt a low-price guarantee on the product about which their consumers are most price-sensitive, while earning a profit from the products that are not covered by the guarantee. I estimate that consumers earn about 0.3% more surplus when the low-price guarantee is in place, which suggests that the authorities need not be concerned about, or regulate, low-price guarantees. In Appendix B, I also propose an empirical model to examine how low-price guarantees change consumer search behavior and whether consumer search plays an important role in estimating consumer surplus accurately.

Chapter 3, joint with Gale Boyd, describes work with the pulp, paper, and paperboard (PP&PB) industry to provide a plant-level indicator of energy efficiency for facilities that produce various types of paper products in the United States. Organizations that implement strategic energy management programs undertake a set of activities that, if carried out properly, have the potential to deliver sustained energy savings. Energy performance benchmarking is a key activity of strategic energy management and one way to enable companies to set energy efficiency targets for manufacturing facilities. The opportunity to assess plant energy performance through a comparison with similar plants in its industry is a highly desirable and strategic method of benchmarking for industrial energy managers. However, access to energy performance data for conducting industry benchmarking is usually unavailable to most industrial energy managers. The U.S. Environmental Protection Agency (EPA), through its ENERGY STAR program, seeks to overcome this barrier through the development of manufacturing sector-based plant energy performance indicators (EPIs) that encourage U.S. industries to use energy more efficiently. In the development of the EPI tools, consideration is given to the role that performance-based indicators play in motivating change; the steps necessary for indicator development, from interacting with an industry to secure adequate data to the actual application and use of an indicator when complete. How indicators are employed in EPA's efforts to encourage industries to voluntarily improve their use of energy is discussed as well. The chapter describes the data and statistical methods used to construct the EPI for plants within selected segments of the pulp, paper, and paperboard industry: specifically, pulp mills and integrated paper & paperboard mills. The individual equations are presented, as are the instructions for using those equations as implemented in an associated Microsoft Excel-based spreadsheet tool.
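
As a loose illustration of regression-based benchmarking (not the published EPI equations, which use their own functional forms and data), the sketch below predicts plant energy use from a few assumed drivers and scores each plant by its residual relative to the industry spread; all plant data and variable names are simulated placeholders.

```python
# Hedged sketch: a generic regression-residual benchmark, not the actual EPI.
# Predict plant energy use from observable drivers, then score each plant by
# how its residual compares with the industry spread (lower use scores well).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(0)
n = 150
plants = pd.DataFrame({
    "tons_pulp": rng.uniform(1e4, 5e5, n),
    "tons_paper": rng.uniform(1e4, 5e5, n),
    "capacity_util": rng.uniform(0.6, 1.0, n),
})
plants["energy_mmbtu"] = (12 * plants["tons_pulp"] + 8 * plants["tons_paper"]) \
    * rng.lognormal(0, 0.2, n)

fit = smf.ols("np.log(energy_mmbtu) ~ np.log(tons_pulp) + np.log(tons_paper)"
              " + capacity_util", data=plants).fit()

resid = fit.resid
# Score = estimated probability that a comparable plant uses more energy.
plants["epi_score"] = 1 - stats.norm.cdf(resid, loc=0, scale=resid.std())
print(plants["epi_score"].describe())
```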