8 results for methodologies

in Digital Commons at Florida International University


Relevance:

20.00%

Publisher:

Abstract:

Most research on tax evasion has focused on the income tax. Sales tax evasion has been largely ignored and dismissed as immaterial. This paper explored the differences between income tax and sales tax evasion and demonstrated that sales tax enforcement deserves and requires different tools to achieve compliance. Specifically, the major enforcement problem with the sales tax is not evasion: it is theft perpetrated by companies that act as collection agents for the state. Companies engage in a principal-agent relationship with the state, and many retain funds collected as an agent of the state for private use. As such, the act of sales tax theft bears more resemblance to embezzlement than to income tax evasion. It has long been assumed that the sales tax is nearly evasion-free, and state revenue departments report voluntary compliance in a manner that perpetuates this myth. Current sales tax compliance enforcement methodologies are similar in form to income tax compliance enforcement methodologies and are based largely on trust. The primary focus is on delinquent filers, with a very small percentage of businesses subject to audit. As a result, there is a very large group of noncompliant businesses that file on time and fly below the radar while stealing millions of taxpayer dollars.

The author utilized a variety of statistical methods with actual field data derived from operations of the Southern Region Criminal Investigations Unit of the Florida Department of Revenue to evaluate current and proposed sales tax compliance enforcement methodologies in a quasi-experimental, time-series research design and to set forth a typology of sales tax evaders. This study showed that current estimates of voluntary compliance in sales tax systems are seriously and significantly overstated and that current enforcement methodologies are inadequate to identify the majority of violators and enforce compliance. Sales tax evasion is modeled using the theory of planned behavior and Cressey's fraud triangle, and it is demonstrated that proactive enforcement activities, characterized by substantial contact with non-delinquent taxpayers, result in a superior ability to identify noncompliance and provide a structure through which noncompliant businesses can be rehabilitated.
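
The quasi-experimental, time-series design evaluated here can be illustrated with a minimal interrupted time-series regression. The sketch below is hypothetical: the variable names, synthetic data, and single level-shift term are stand-ins, not the author's actual statistical models.

```python
# Minimal sketch of an interrupted time-series regression, assuming a
# monthly series of reported sales tax remittances and a known month at
# which proactive enforcement began. All values are synthetic; the
# dissertation's actual models are richer.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(48)                       # 4 years of monthly data
intervention = (months >= 24).astype(float)  # proactive enforcement starts at month 24
remittances = 100 + 0.5 * months + 8 * intervention + rng.normal(0, 3, 48)

# Design matrix: intercept, time trend, post-intervention level shift
X = np.column_stack([np.ones_like(months, dtype=float), months, intervention])
coef, *_ = np.linalg.lstsq(X, remittances, rcond=None)
print(f"estimated level shift after enforcement: {coef[2]:.2f}")
```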

Relevance:

20.00%

Publisher:

Abstract:

The proliferation of legalized gaming has significantly changed the nature of the hospitality industry. While several aspects of gaming have flourished, none has become more popular, profitable, or technologically advanced than the slot machine. Although slot machines generate more than half of all casino gambling activity and earnings, little has been written about the technology integral to these devices. The author describes the workings of computer-controlled slot machines and exposes some of the popular operating myths.
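
The mechanism usually described for computer-controlled slot machines is virtual reel mapping: a continuously cycling pseudorandom number generator selects a virtual stop the instant the player commits, and that stop maps to a physical reel symbol. A minimal sketch, with an entirely hypothetical reel layout:

```python
# Minimal sketch of virtual reel mapping in a computer-controlled slot
# machine. The virtual reel layout below is hypothetical; real machines
# use much larger, regulator-approved tables and certified RNGs.
import random

# Each symbol appears multiple times on the virtual reel; rare symbols
# get fewer virtual stops, which is what sets the odds.
VIRTUAL_REEL = (
    ["blank"] * 60 + ["cherry"] * 30 + ["bar"] * 20 + ["jackpot"] * 2
)

def spin_one_reel() -> str:
    # The RNG cycles constantly; the value sampled at the moment the
    # player presses the button determines the outcome, not the reels.
    stop = random.randrange(len(VIRTUAL_REEL))
    return VIRTUAL_REEL[stop]

print([spin_one_reel() for _ in range(3)])
```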

Relevance:

20.00%

Publisher:

Abstract:

Routine monitoring of environmental pollution demands simplicity and speed without sacrificing sensitivity or accuracy. The development and application of sensitive, fast, and easy-to-implement analytical methodologies for detecting emerging and traditional waterborne and airborne contaminants in South Florida is presented. A novel method was developed for the quantification of the herbicide glyphosate, based on lyophilization followed by derivatization and simultaneous detection by fluorescence and mass spectrometry. Samples were analyzed from water canals that will hydrate estuarine wetlands of Biscayne National Park, detecting inputs of glyphosate from both aquatic usage and agricultural runoff from farms. A second study describes a set of fast, automated LC-MS/MS protocols for the analysis of dioctyl sulfosuccinate (DOSS) and 2-butoxyethanol, two components of Corexit®. Around 1.8 million gallons of these dispersant formulations were used in the response efforts for the Gulf of Mexico oil spill in 2010. The methods presented here allow the trace-level detection of these compounds in seawater, crude oil, and commercial dispersant formulations. In addition, two methodologies were developed for the analysis of well-known pollutants, namely polycyclic aromatic hydrocarbons (PAHs) and airborne particulate matter (APM). PAHs are ubiquitous environmental contaminants, and some are potent carcinogens. Traditional GC-MS analysis is labor-intensive and consumes large amounts of toxic solvents. My study provides an alternative automated SPE-LC-APPI-MS/MS analysis with minimal sample preparation and lower solvent consumption. The system can inject, extract, clean, separate, and detect 28 PAHs and 15 families of alkylated PAHs in 28 minutes. The methodology was tested with environmental samples from Miami. Airborne particulate matter is a mixture of particles of chemical and biological origin. Assessment of its elemental composition is critical for the protection of sensitive ecosystems and public health. The APM collected from Port Everglades between 2005 and 2010 was analyzed by ICP-MS after acid digestion of the filters. The most abundant elements were Fe and Al, followed by Cu, V, and Zn. Enrichment factors show that hazardous elements (Cd, Pb, As, Co, Ni, and Cr) are introduced by anthropogenic activities. The data suggest that the major sources of APM were an electricity plant, road dust, industrial emissions, and marine vessels.
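
The enrichment factors mentioned above are conventionally computed by normalizing each element to a crustal reference element such as Al, with values well above about 10 pointing to anthropogenic sources. A minimal sketch, using approximate literature crustal abundances and invented sample concentrations rather than the study's data:

```python
# Sketch of a crustal enrichment factor (EF) calculation for airborne
# particulate matter, normalized to Al as the crustal reference:
# EF(X) = (X/Al)_sample / (X/Al)_crust; EF >> 10 suggests anthropogenic
# sources. Crustal abundances (mg/kg) are approximate literature values,
# and the sample concentrations are hypothetical.
CRUST_MG_PER_KG = {"Al": 80000.0, "Fe": 35000.0, "Zn": 70.0,
                   "Pb": 17.0, "Cd": 0.1, "Ni": 47.0}

sample_ng_per_m3 = {"Al": 500.0, "Fe": 400.0, "Zn": 30.0,
                    "Pb": 5.0, "Cd": 0.2, "Ni": 4.0}  # invented APM data

def enrichment_factor(element: str) -> float:
    sample_ratio = sample_ng_per_m3[element] / sample_ng_per_m3["Al"]
    crust_ratio = CRUST_MG_PER_KG[element] / CRUST_MG_PER_KG["Al"]
    return sample_ratio / crust_ratio

for el in ("Fe", "Zn", "Pb", "Cd", "Ni"):
    print(f"EF({el}) = {enrichment_factor(el):.1f}")
```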

Relevance:

20.00%

Publisher:

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate one or more lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common. Managed lane demand is usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the utilization of effective assignment modeling processes.

Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring, and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support an effective utilization of DTA to model managed lane operations.

Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.

With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in different stages of the modeling and calibration of managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as a proper definition of performance measures, results in a calibrated and stable model, which closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.
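
The iterative calibration of interacting demand and assignment components could be organized as a loop like the one sketched below. This is a hedged illustration: the toy assignment matrix and proportional adjustment stand in for the study's static demand estimator and DTA engine.

```python
# Hedged sketch of an iterative demand/DTA calibration loop, with toy
# stand-ins for the study's components. Here run_dta is a fake linear
# assignment so the sketch runs end to end; in practice it would call
# a mesoscopic or microscopic traffic simulator.
import numpy as np

ASSIGNMENT = np.array([[0.7, 0.2],    # fraction of each OD pair using link 1
                       [0.3, 0.8]])   # ... and link 2 (toy route choice)

def run_dta(od: np.ndarray) -> np.ndarray:
    return ASSIGNMENT @ od            # toy stand-in for dynamic assignment

def calibrate(seed_od, observed, max_iters=50, tol=0.01):
    od = np.asarray(seed_od, dtype=float)
    for _ in range(max_iters):
        simulated = run_dta(od)
        ratio = observed / np.maximum(simulated, 1e-9)
        if np.abs(ratio - 1.0).max() < tol:   # all links within 1% of counts
            break
        # Proportional adjustment: push demand toward matching link counts
        od *= (ASSIGNMENT.T @ ratio) / ASSIGNMENT.sum(axis=0)
    return od

observed_counts = np.array([900.0, 1100.0])   # hypothetical link counts
print(calibrate([500.0, 500.0], observed_counts))
```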

Relevance:

20.00%

Publisher:

Abstract:

The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances, the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with partial turbulence simulation.

In this dissertation, the test procedure and scaling requirements for tests in partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory, the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data.

For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only the peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings such as roof pavers are not well addressed in many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of existing information in codes and standards, such as the ASCE 7-10 pressure coefficients for components and cladding.
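
The quasi-steady treatment of the missing low-frequency turbulence can be illustrated with a simple Monte Carlo combination of the two processes. This sketch uses invented numbers, and the dissertation derives the joint probability analytically rather than by simulation:

```python
# Hedged Monte Carlo sketch of combining a high-frequency pressure
# process (as measured in the tunnel) with the missing low-frequency
# turbulence treated quasi-steadily: pressures scale with the square of
# the instantaneous speed ratio. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# High-frequency pressure coefficients, as if sampled at a suction tap
# in the tunnel (hypothetical mean and fluctuation).
cp_hf = rng.normal(-0.8, 0.25, n)

# Low-frequency gust ratio U_LF / U_mean for the unsimulated band.
lf_intensity = 0.12   # assumed missing low-frequency turbulence intensity
speed_ratio = 1.0 + rng.normal(0.0, lf_intensity, n)

cp_full = cp_hf * speed_ratio**2   # quasi-steady combination

# Full-scale equivalent peak suction estimate (e.g., 0.1% quantile)
print(f"peak suction Cp ≈ {np.quantile(cp_full, 0.001):.2f}")
```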

Relevance:

20.00%

Publisher:

Abstract:

The performance of building envelopes and roofing systems depends significantly on accurate knowledge of wind loads and the response of envelope components under realistic wind conditions. Wind tunnel testing is a well-established practice for determining wind loads on structures. For small structures, much larger model scales are needed than for large structures in order to maintain modeling accuracy and minimize Reynolds number effects. In these circumstances, the ability to obtain a large enough turbulence integral scale is usually compromised by the limited dimensions of the wind tunnel, meaning that it is not possible to simulate the low-frequency end of the turbulence spectrum. Such flows are called flows with partial turbulence simulation.

In this dissertation, the test procedure and scaling requirements for tests in partial turbulence simulation are discussed. A theoretical method is proposed for including the effects of low-frequency turbulence in the post-test analysis. In this theory, the turbulence spectrum is divided into two distinct statistical processes: one at high frequencies, which can be simulated in the wind tunnel, and one at low frequencies, which can be treated in a quasi-steady manner. The joint probability of load resulting from the two processes is derived, from which full-scale equivalent peak pressure coefficients can be obtained. The efficacy of the method is demonstrated by comparing predictions derived from tests on large-scale models of the Silsoe Cube and Texas Tech University buildings in the Wall of Wind facility at Florida International University with the available full-scale data.

For multi-layer building envelopes such as rain-screen walls, roof pavers, and vented energy-efficient walls, not only the peak wind loads but also their spatial gradients are important. Wind-permeable roof claddings such as roof pavers are not well addressed in many existing building codes and standards. Large-scale experiments were carried out to investigate the wind loading on concrete pavers, including wind blow-off tests and pressure measurements. Simplified guidelines were developed for the design of loose-laid roof pavers against wind uplift. The guidelines are formatted so that use can be made of existing information in codes and standards, such as the ASCE 7-10 pressure coefficients for components and cladding.
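
A design check for loose-laid pavers ultimately balances net uplift pressure against the paver's weight per unit area. The back-of-envelope sketch below uses illustrative values; the dissertation's guidelines and the ASCE 7-10 pressure coefficients should be consulted for real designs:

```python
# Hedged sketch of a loose-laid paver uplift check: the blow-off wind
# speed is reached when net uplift pressure equals the paver's weight
# per unit area. The net pressure coefficient below is illustrative and
# depends in practice on pressure equalization across the paver.
import math

RHO_AIR = 1.225            # kg/m^3
G = 9.81                   # m/s^2

paver_mass = 36.0          # kg (hypothetical 600 x 600 x 40 mm concrete paver)
paver_area = 0.36          # m^2
cn_net = 0.9               # illustrative net uplift coefficient after equalization

weight_pressure = paver_mass * G / paver_area   # Pa resisting uplift
v_crit = math.sqrt(2.0 * weight_pressure / (RHO_AIR * cn_net))
print(f"resisting pressure ≈ {weight_pressure:.0f} Pa, "
      f"blow-off speed ≈ {v_crit:.0f} m/s")
```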

Relevance:

20.00%

Publisher:

Abstract:

Managed lane strategies are innovative road operation schemes for addressing congestion problems. These strategies operate one or more lanes adjacent to a freeway that provide congestion-free trips to eligible users, such as transit vehicles or toll-payers. To ensure the successful implementation of managed lanes, the demand on these lanes needs to be accurately estimated. Among the different approaches for predicting this demand, the four-step demand forecasting process is the most common. Managed lane demand is usually estimated at the assignment step. Therefore, the key to reliably estimating the demand is the utilization of effective assignment modeling processes.

Managed lanes are particularly effective when the road is functioning at near-capacity. Therefore, capturing variations in demand and in network attributes and performance is crucial for their modeling, monitoring, and operation. As a result, traditional modeling approaches, such as those used in the static traffic assignment of demand forecasting models, fail to correctly predict managed lane demand and the associated system performance. The present study demonstrates the power of the more advanced modeling approach of dynamic traffic assignment (DTA), as well as the shortcomings of conventional approaches, when used to model managed lanes in congested environments. In addition, the study develops processes to support an effective utilization of DTA to model managed lane operations.

Static and dynamic traffic assignments consist of demand, network, and route choice model components that need to be calibrated. These components interact with each other, and an iterative method for calibrating them is needed. In this study, an effective standalone framework that combines static demand estimation and dynamic traffic assignment has been developed to replicate real-world traffic conditions.

With advances in traffic surveillance technologies, collecting, archiving, and analyzing traffic data is becoming more accessible and affordable. The present study shows how data from multiple sources can be integrated, validated, and best used in different stages of the modeling and calibration of managed lanes. Extensive and careful processing of demand, traffic, and toll data, as well as a proper definition of performance measures, results in a calibrated and stable model, which closely replicates real-world congestion patterns and can reasonably respond to perturbations in network and demand properties.
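
Calibration acceptance for models like the one described above is often judged with the GEH statistic, which compares modeled and observed link volumes, with GEH < 5 on most links a common rule of thumb. A minimal sketch with hypothetical counts (the study's actual performance measures may differ):

```python
# Hedged sketch of the GEH statistic, a standard acceptance measure for
# traffic model calibration: GEH = sqrt(2*(m - c)^2 / (m + c)), with m
# the modeled and c the counted hourly volume. Counts are hypothetical.
import math

def geh(modeled: float, counted: float) -> float:
    return math.sqrt(2.0 * (modeled - counted) ** 2 / (modeled + counted))

links = [(950.0, 1000.0), (1800.0, 1650.0), (420.0, 400.0)]  # (modeled, counted)
scores = [geh(m, c) for m, c in links]
share_ok = sum(s < 5 for s in scores) / len(scores)
print([f"{s:.1f}" for s in scores], "pass" if share_ok >= 0.85 else "fail")
```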

Relevance:

20.00%

Publisher:

Abstract:

Routine monitoring of environmental pollution demands simplicity and speed without sacrificing sensitivity or accuracy. The development and application of sensitive, fast, and easy-to-implement analytical methodologies for detecting emerging and traditional waterborne and airborne contaminants in South Florida is presented. A novel method was developed for the quantification of the herbicide glyphosate, based on lyophilization followed by derivatization and simultaneous detection by fluorescence and mass spectrometry. Samples were analyzed from water canals that will hydrate estuarine wetlands of Biscayne National Park, detecting inputs of glyphosate from both aquatic usage and agricultural runoff from farms. A second study describes a set of fast, automated LC-MS/MS protocols for the analysis of dioctyl sulfosuccinate (DOSS) and 2-butoxyethanol, two components of Corexit®. Around 1.8 million gallons of these dispersant formulations were used in the response efforts for the Gulf of Mexico oil spill in 2010. The methods presented here allow the trace-level detection of these compounds in seawater, crude oil, and commercial dispersant formulations. In addition, two methodologies were developed for the analysis of well-known pollutants, namely polycyclic aromatic hydrocarbons (PAHs) and airborne particulate matter (APM). PAHs are ubiquitous environmental contaminants, and some are potent carcinogens. Traditional GC-MS analysis is labor-intensive and consumes large amounts of toxic solvents. My study provides an alternative automated SPE-LC-APPI-MS/MS analysis with minimal sample preparation and lower solvent consumption. The system can inject, extract, clean, separate, and detect 28 PAHs and 15 families of alkylated PAHs in 28 minutes. The methodology was tested with environmental samples from Miami. Airborne particulate matter is a mixture of particles of chemical and biological origin. Assessment of its elemental composition is critical for the protection of sensitive ecosystems and public health. The APM collected from Port Everglades between 2005 and 2010 was analyzed by ICP-MS after acid digestion of the filters. The most abundant elements were Fe and Al, followed by Cu, V, and Zn. Enrichment factors show that hazardous elements (Cd, Pb, As, Co, Ni, and Cr) are introduced by anthropogenic activities. The data suggest that the major sources of APM were an electricity plant, road dust, industrial emissions, and marine vessels.
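
Trace-level LC-MS/MS quantitation, such as the DOSS protocol described above, typically relies on an internal-standard calibration curve. The sketch below uses synthetic responses, not the study's data:

```python
# Hedged sketch of internal-standard calibration as commonly used in
# trace-level LC-MS/MS quantitation: the analyte/internal-standard
# response ratio is regressed against concentration, and unknowns are
# read off the fitted line. All values below are synthetic.
import numpy as np

conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0])   # ng/mL standards
analyte_area = np.array([210.0, 1050.0, 2080.0, 10400.0, 20750.0])
istd_area = np.full(5, 5000.0)                   # isotope-labeled internal standard

ratio = analyte_area / istd_area
slope, intercept = np.polyfit(conc, ratio, 1)    # linear calibration

unknown_ratio = 3150.0 / 4950.0                  # areas from an unknown sample
unknown_conc = (unknown_ratio - intercept) / slope
print(f"unknown ≈ {unknown_conc:.1f} ng/mL")
```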