934 results for on-time-delivery
Sales tax enforcement: An empirical analysis of compliance enforcement methodologies and pathologies
Abstract:
Most research on tax evasion has focused on the income tax. Sales tax evasion has been largely ignored and dismissed as immaterial. This paper explored the differences between income tax and sales tax evasion and demonstrated that sales tax enforcement deserves, and requires, a different set of tools to achieve compliance. Specifically, the major enforcement problem with the sales tax is not evasion: it is theft perpetrated by companies that act as collection agents for the state. Companies engage in a principal-agent relationship with the state, and many retain funds collected as an agent of the state for private use. As such, the act of sales tax theft bears more resemblance to embezzlement than to income tax evasion. It has long been assumed that the sales tax is nearly evasion free, and state revenue departments report voluntary compliance in a manner that perpetuates this myth. Current sales tax compliance enforcement methodologies are similar in form to income tax compliance enforcement methodologies and are based largely on trust. The primary focus is on delinquent filers, with a very small percentage of businesses subject to audit. As a result, there is a very large group of noncompliant businesses that file on time and fly below the radar while stealing millions of taxpayer dollars. The author used a variety of statistical methods, with actual field data derived from the operations of the Southern Region Criminal Investigations Unit of the Florida Department of Revenue, to evaluate current and proposed sales tax compliance enforcement methodologies in a quasi-experimental, time series research design and to set forth a typology of sales tax evaders. This study showed that current estimates of voluntary compliance in sales tax systems are seriously and significantly overstated and that current enforcement methodologies are inadequate to identify the majority of violators and enforce compliance. Sales tax evasion is modeled using the theory of planned behavior and Cressey's fraud triangle, and it is demonstrated that proactive enforcement activities, characterized by substantial contact with non-delinquent taxpayers, result in a superior ability to identify noncompliance and provide a structure through which noncompliant businesses can be rehabilitated.
Abstract:
An integrated surface-subsurface hydrological model of Everglades National Park (ENP) was developed using MIKE SHE and MIKE 11 modeling software. The model has a resolution of 400 meters, covers approximately 1050 square miles of ENP, includes 110 miles of drainage canals with a variety of hydraulic structures, and processes hydrological information such as evapotranspiration, precipitation, groundwater levels, canal discharges and levels, and operational schedules. Calibration was based on time series and probability-of-exceedance curves for water levels and discharges in the years 1987 through 1997. Model verification was then completed for the period 1998 through 2005. A parameter sensitivity and uncertainty analysis showed that the model was most sensitive to the hydraulic conductivity of the regional Surficial Aquifer System, the Manning's roughness coefficient, and the leakage coefficient, which defines the canal-subsurface interaction. The model offers an enhanced predictive capability, compared to other models currently available, to simulate the flow regime in ENP and to forecast the impacts of changes in topography, water flows, and operation schedules.
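As a rough illustration of the exceedance-based calibration criterion mentioned above, the following sketch computes a probability-of-exceedance curve for a water-level series using the standard Weibull plotting position; the stage values are made up, and the function name is ours, not part of MIKE SHE/MIKE 11.

```python
# Minimal sketch: probability-of-exceedance curve for a water-level
# time series, of the kind used to compare simulated and observed stages.
import numpy as np

def exceedance_curve(levels: np.ndarray):
    """Return (levels sorted high to low, P(level >= value)) using the
    Weibull plotting position m / (n + 1)."""
    sorted_desc = np.sort(levels)[::-1]         # highest level first
    ranks = np.arange(1, sorted_desc.size + 1)  # 1..n
    prob = ranks / (sorted_desc.size + 1)       # exceedance probability
    return sorted_desc, prob

# Illustrative daily stage data in metres (not ENP observations):
observed = np.array([0.42, 0.55, 0.31, 0.60, 0.48, 0.37, 0.52])
levels, p_exceed = exceedance_curve(observed)
```

Plotting simulated against observed curves on the same axes then shows whether the model reproduces the full distribution of stages, not just their timing.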
Abstract:
Computer networks produce tremendous amounts of event-based data that can be collected and managed to support an increasing number of new classes of pervasive applications. Examples of such applications are network monitoring and crisis management. Although the problem of distributed event-based management has been addressed in non-pervasive settings such as the Internet, the domain of pervasive networks has its own characteristics that make those results inapplicable. Many of these applications are based on time-series data that take the form of time-ordered series of events. Such applications must also handle large volumes of unexpected events, often modified on-the-fly or containing conflicting information, and deal with rapidly changing contexts while producing results with low latency. Correlating events across contextual dimensions holds the key to expanding the capabilities and improving the performance of these applications. This dissertation addresses this critical challenge. It establishes an effective scheme for complex-event semantic correlation. The scheme examines epistemic uncertainty in computer networks by fusing event synchronization concepts with belief theory. Because of the distributed nature of event detection, time delays are considered: events are no longer instantaneous, but have a duration associated with them. Existing algorithms for synchronizing time are split into two classes, one of which is asserted to provide a faster means for converging time and hence to be better suited for pervasive network management. Besides the temporal dimension, the scheme considers imprecision and uncertainty when an event is detected. A belief value is therefore associated with the semantics and the detection of composite events. This belief value is generated by a consensus among participating entities in a computer network. The scheme taps into the in-network processing capabilities of pervasive computer networks and can withstand missing or conflicting information gathered from multiple participating entities. Thus, this dissertation advances knowledge in the field of network management by facilitating the full utilization of the characteristics offered by pervasive, distributed and wireless technologies in contemporary and future computer networks.
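To make the belief-fusion idea concrete, here is a minimal sketch of one standard way to combine two nodes' belief masses about a detected event: Dempster's rule of combination over the two-hypothesis frame {E, notE}. The node names and mass values are assumptions for illustration, not the dissertation's consensus scheme.

```python
# Sketch: Dempster's rule over focal sets given as frozensets.
# Assumes the two sources are not in total conflict (norm > 0).
def dempster_combine(m1: dict, m2: dict) -> dict:
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:   # compatible evidence reinforces the intersection
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:       # contradictory evidence counts as conflict
                conflict += wa * wb
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

E, NE = frozenset({"E"}), frozenset({"notE"})
THETA = E | NE                         # ignorance: "either could be true"
node1 = {E: 0.6, THETA: 0.4}           # hypothetical detector reports
node2 = {E: 0.7, NE: 0.1, THETA: 0.2}
fused = dempster_combine(node1, node2) # belief in E rises to ~0.87
```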
Abstract:
The DTRF2014 is a realization of the fundamental Earth-fixed coordinate system, the International Terrestrial Reference System (ITRS). It has been computed by the Deutsches Geodätisches Forschungsinstitut der Technischen Universität München (DGFI-TUM). The DTRF2014 consists of station positions and velocities of 1712 globally distributed geodetic observing stations of the observation techniques VLBI, SLR, GNSS and DORIS. Additionally, for the first time, non-tidal atmospheric and hydrological loading is considered in the solution. The DTRF2014 was released in August 2016 and incorporates observation data of the four techniques up to 2014. The observation data were processed and submitted by the corresponding technique services: IGS (International GNSS Service, http://igscb.jpl.nasa.gov), IVS (International VLBI Service, http://ivscc.gsfc.nasa.gov), ILRS (International Laser Ranging Service, http://ilrs.gsfc.nasa.gov), IDS (International DORIS Service, http://ids-doris.org). The DTRF2014 is an independent ITRS realization. It is computed on the basis of the same input data as the realizations JTRF2014 (JPL, Pasadena) and ITRF2014 (IGN, Paris). The three realizations of the ITRS differ conceptually. While DTRF2014 and ITRF2014 are based on station positions at a reference epoch and velocities, the JTRF2014 is based on time series of station positions. DTRF2014 and ITRF2014 also result from different combination strategies: the ITRF2014 is based on the combination of solutions, while the DTRF2014 is computed by the combination of normal equations. The DTRF2014 comprises 3D coordinates and coordinate changes of 1347 GNSS, 113 VLBI, 99 SLR and 153 DORIS stations. The reference epoch is 1 January 2005, 0h UTC. The Earth Orientation Parameters (EOP) - that is, the coordinates of the terrestrial and the celestial pole, UT1-UTC and the Length of Day (LOD) - were estimated simultaneously with the station coordinates. The EOP time series cover the period from 1979.7 to 2015.0. The station names are the official IERS identifiers: CDP numbers or 4-character IDs and DOMES numbers (http://itrf.ensg.ign.fr/doc_ITRF/iers_sta_list.txt). The DTRF2014 solution is available in one comprehensive SINEX file and four technique-specific SINEX files, see below. A detailed description of the solution is given on the website of DGFI-TUM (http://www.dgfi.tum.de/en/science-data-products/dtrf2014/). More information can be made available on request.
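A position-plus-velocity realization such as DTRF2014 implies a simple linear coordinate model, x(t) = x0 + v (t - t0), with t0 = 2005.0. The sketch below propagates a station position from the reference epoch; the coordinates and velocity are hypothetical, plate-motion-scale numbers, not actual DTRF2014 values.

```python
# Sketch: linear propagation of a station position from the reference
# epoch, as implied by a position-and-velocity ITRS realization.
import numpy as np

def propagate(x0: np.ndarray, v: np.ndarray, t: float, t0: float = 2005.0):
    """x0: geocentric XYZ in metres; v: metres/year; t, t0: decimal years."""
    return x0 + v * (t - t0)

x0 = np.array([4075539.8, 931735.3, 4801629.4])  # hypothetical station, m
v  = np.array([-0.0158, 0.0170, 0.0102])         # ~cm/yr, plate motion
x_2015 = propagate(x0, v, 2015.0)                # position at epoch 2015.0
```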
Abstract:
The large volumes of data generated by automation and process supervision in industry pose two problems: a heavy demand for disk storage and the difficulty of streaming the data over a telecommunications link. Lossy data compression algorithms emerged in the 1990s to address these problems, and industries consequently adopted them in supervision systems to compress data in real time. These algorithms were designed to eliminate redundant and undesired information in a simple, efficient way. However, their parameters must be set for each process variable, which becomes impracticable in systems that monitor thousands of variables. In this context, this paper proposes the Adaptive Swinging Door Trending algorithm, an adaptation of Swinging Door Trending whose main parameters are adjusted dynamically by analyzing signal trends in real time. A comparative performance analysis of lossy compression algorithms applied to time series of process variables and dynamometer cards is also presented; the comparison covers piecewise-linear and transform-based algorithms.
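For reference, the sketch below implements the classic fixed-parameter Swinging Door Trending compressor; the proposed adaptive variant would tune the deviation tolerance `dev` online from the signal trend, which is not shown here. The interface is our own illustration, not the paper's code.

```python
# Sketch: classic Swinging Door Trending with a fixed deviation `dev`.
# A point is archived only when the two "doors" (slope bounds from the
# pivots y0 +/- dev at the last archived point) swing past parallel.
def sdt_compress(times, values, dev):
    kept = [(times[0], values[0])]
    t0, y0 = times[0], values[0]
    up, low = float("-inf"), float("inf")
    prev = (times[0], values[0])
    for t, y in zip(times[1:], values[1:]):
        up = max(up, (y - (y0 + dev)) / (t - t0))    # upper door opens
        low = min(low, (y - (y0 - dev)) / (t - t0))  # lower door opens
        if up > low:                # doors crossed: archive last point
            kept.append(prev)
            t0, y0 = prev
            up = (y - (y0 + dev)) / (t - t0)
            low = (y - (y0 - dev)) / (t - t0)
        prev = (t, y)
    kept.append(prev)               # always keep the final point
    return kept

# A linear ramp compresses to its two endpoints:
print(sdt_compress([0, 1, 2, 3], [0.0, 1.0, 2.0, 3.0], dev=0.5))
```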
Abstract:
The thermodynamic performance of a refrigeration system can be improved by reducing the compression work for a given heat removal rate. This study examines the effect of dispersing small concentrations of Al2O3 nanoparticles (50 nm) in a mineral oil lubricant on the viscosity, thermal conductivity and lubrication characteristics, as well as on the overall performance (based on the Second Law of Thermodynamics), of a refrigerating system using R134a or R600a as refrigerant. The study looked at the influence of three variables: i) refrigerant charge (100, 110, 120 and 130 g), ii) rotational speed of the condenser blower (800 and 1100 RPM) and iii) nanoparticle concentration (0.1 and 0.5 g/l) on system performance, using the Taguchi method in a matrix of L8 trials with the criterion "smaller irreversibility is better". Pulldown and cycling tests were carried out according to NBR 12866 and NBR 12869, respectively, to evaluate the operational parameters: on-time ratio, cycles per hour, suction and discharge pressures, oil sump temperature, evaporation and condensation temperatures, energy consumption at the set-point, total energy consumption and compressor power. To evaluate the nanolubricant characteristics, accelerated tests were performed on an HFRR bench. In each 60-minute test with a nanolubricant at a given concentration (0, 0.1 and 0.5 g/l), with three replications, a sphere (diameter 6.00 ± 0.05 mm, Ra 0.05 ± 0.005 µm, AISI 52100 steel, E = 210 GPa, HRC 62 ± 4) slid on a flat plate (cast iron FC200, Ra < 0.5 ± 0.005 µm) in reciprocating motion with an amplitude of 1 mm, a frequency of 20 Hz and a normal load of 1.96 N. The friction coefficient signals were recorded by sensors coupled to the HFRR system. A trend little discussed in the literature was observed: a reduction in nanolubricant viscosity at low nanoparticle concentrations. The dominant trend in the literature was confirmed: thermal conductivity increased with increasing nanoparticle mass fraction in the base fluid. The nanolubricant's thermal conductivity also grew significantly with increasing temperature. The condenser fan rotational speed is the most influential parameter (46.192%) for refrigerator performance, followed by the R600a charge (38.606%). The Al2O3 nanoparticle concentration in the lubricant has a minor influence on system performance, at 12.44%. The power consumption results indicate that adding nanoparticles to the lubricant (0.1 g/L), together with R600a, reduces the refrigerator's consumption by 22% with respect to R134a and POE lubricant. The Al2O3 nanoparticle addition alone results in a consumption reduction of about 5%.
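The "smaller is better" criterion named above has a standard Taguchi signal-to-noise form, SN = -10 log10(mean(y^2)); the sketch below evaluates it for one hypothetical L8 trial (the replicate values are invented, not the study's measurements).

```python
# Sketch: Taguchi "smaller-is-better" signal-to-noise ratio.
import math

def sn_smaller_is_better(replicates):
    """SN = -10 log10(mean(y^2)); a larger SN means less irreversibility."""
    mean_sq = sum(y * y for y in replicates) / len(replicates)
    return -10.0 * math.log10(mean_sq)

# Hypothetical irreversibility values (W) for one trial, three replicates:
print(round(sn_smaller_is_better([14.2, 13.8, 14.5]), 1))  # about -23.0 dB
```

Ranking the factor effects on this SN across the eight trials yields contribution percentages like those reported for fan speed, charge and nanoparticle concentration.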
Abstract:
This research studies the body language of the Ciranda players (Ciranda is a traditional Brazilian circle dance), more specifically that of Lia from Itamaracá. Our interest is to observe how this body dances, communicates, and writes itself in time and space, establishing relations that complement it and keep it under construction. Thus, in a circular formation, with an energy transmitted through the touch of hands and the power of song, in a circle that can be seen from many places and from different angles, holding the particularities of its subjects (players, dancers, observers), we propose to ask: who are these playing bodies and how do they build the circles of Ciranda? Throughout the research we were guided by the phenomenological method and, from it, used the concepts of the lived and sensible world. Our interest in this manifestation also lies in the body that dances and inserts itself into artistic expression, signifying and opening itself to knowledge through experience. We therefore assume a conception of the body grounded in Merleau-Ponty's phenomenological approach, in opposition to the body as a fragmented being, as posited by Cartesian theory. In this perspective, we understand the body in its relations with the cultural, social, economic and artistic issues that constitute it; in other words, relations that helped us better understand the body as it is. The main objective of this research is thus to present reflections on the playing body, mainly that of Lia from Itamaracá, and on how this body dances, communicates, and writes itself in time and space, establishing relations that complement it and keep it under construction. This leads us to ask, for example: what mobilizes these subjects in this dance? We understand that elements such as the ever-changing venue, the costumes, the musicality, and the changing movement of the players' bodies in each new circle activate a permanent reconfiguration of the Ciranda dance today. Finally, this investigation is justified by the great dimension that Ciranda has achieved in Brazil, especially in the Northeast, and by the scarcity of references and records of this manifestation in academic circles. The research shows that, owing to this spread, the nuances of the playing bodies have become ever more diversified, and that the lack of lived experience on Itamaracá island (PE), its place of origin, has distanced the dance from its original, communal character, as it increasingly becomes a dance of other stages and squares.
Abstract:
Projects, as an organizing principle, can provide exciting contexts for innovative work. Thus far, project management discourse has tended to privilege the vital need to deliver projects ‘on time, on budget, and to specification’. In common with the call for papers for this workshop, we suggest that perhaps the “instrumental rationality” underpinning this language for characterising project activity may create more problems than it solves. In this paper we suggest that such questions (and language) frame project contexts in a partial way. We argue that such concerns stem from a particular worldview or ontology, which we identify as a ‘being’ ontology. Here we contrast ‘being’ and ‘becoming’ project ontologies, to explore the questions, methods and interventions that each foregrounds. In an attempt to move this dialogue further than simply another contrast of modern and postmodernist accounts of project organising, we go on to consider some possible ethical concomitants of valuing being and becoming ontologies in project contexts.
Abstract:
Brain injury due to lack of oxygen or impaired blood flow around the time of birth may cause long-term neurological dysfunction or, in severe cases, death. Treatment needs to be initiated as soon as possible and tailored to the nature of the injury to achieve the best outcomes. The electroencephalogram (EEG) currently provides the best insight into neurological activity. However, its interpretation presents a formidable challenge for neurophysiologists. Moreover, such expertise is not widely available, particularly around the clock, in a typical busy Neonatal Intensive Care Unit (NICU). Therefore, an automated computerized system for detecting and grading the severity of brain injuries could be of great help to medical staff in diagnosing and then initiating on-time treatment. In this study, automated systems for detecting neonatal seizures and grading the severity of Hypoxic-Ischemic Encephalopathy (HIE) using EEG and heart rate (HR) signals are presented. It is well known that the EEG and HR signals carry a great deal of contextual and temporal information when examined at longer time scales. Systems developed in the past exploited this information either at a very early stage, before any intelligent block, or at a very late stage, where the presence of such information is much reduced. This work has focused in particular on the development of a system that can incorporate the contextual information at the middle (classifier) level. This is achieved by using dynamic classifiers that are able to process sequences of feature vectors rather than only one feature vector at a time.
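One common way to let a classifier exploit sequences of feature vectors, in the spirit described above, is a hidden-Markov-style forward filter that carries temporal context across per-epoch outputs. The two-state sketch below (non-seizure/seizure) uses assumed transition and prior probabilities; it illustrates the idea only and is not the thesis's trained model.

```python
# Sketch: forward filtering of per-epoch class likelihoods so that
# context from earlier epochs smooths isolated misclassifications.
import numpy as np

A = np.array([[0.95, 0.05],    # P(next state | current state):
              [0.10, 0.90]])   # rows = from-state, cols = to-state
pi = np.array([0.99, 0.01])    # assumed prior: seizure epochs are rare

def forward_filter(likelihoods: np.ndarray) -> np.ndarray:
    """likelihoods: (T, 2) per-epoch likelihoods from a static classifier.
    Returns (T, 2) filtered posteriors over (non-seizure, seizure)."""
    post = pi * likelihoods[0]
    post /= post.sum()
    out = [post]
    for lk in likelihoods[1:]:
        post = (A.T @ post) * lk   # predict with the transition model,
        post /= post.sum()         # then update with the new evidence
        out.append(post)
    return np.array(out)
```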
Abstract:
Thirty-four sediment and mudline temperatures were collected from six drill holes on ODP Leg 110 near the toe of the Barbados accretionary complex. When combined with thermal conductivity measurements, these data delineate the complicated thermal structure at the edge of this convergent margin. Surface heat-flow values from Leg 110 (calculated from geothermal gradients forced through the bottom-water temperature at the mudline) of 92 to 192 mW/m**2 are 80% to 300% higher than values predicted by standard heat flow vs. age models for oceanic crust, but are compatible with earlier surface measurements made at the same latitude. Measured heat flow tends to decrease downhole at four sites, suggesting the presence of heat sources within the sediments. These results are consistent with the flow of warm fluid through the complex along sub-horizontal, high-permeability conduits, including thrust faults, the major decollement zone, and sandy intervals. Simple calculations suggest that this flow is transient, occurring on time scales of tens to tens of thousands of years. High heat flow in the vicinity of 15°30'N and not elsewhere along the deformation front suggests that the Leg 110 drill sites may be situated over a fluid discharge zone, with dewatering more active here than elsewhere along the accretionary complex.
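The quoted surface heat-flow values follow from Fourier's law for conductive heat transport, q = k dT/dz, with the gradient forced through the bottom-water temperature at the mudline. A worked sketch with illustrative numbers (not Leg 110 data):

```python
# Sketch: conductive heat flow from a geothermal gradient.
k = 1.0           # thermal conductivity, W/(m K), typical marine sediment
dT = 3.5          # temperature rise over the depth interval, K
dz = 30.0         # interval thickness, m
q = k * dT / dz   # heat flow, W/m^2
print(q * 1000)   # ~117 mW/m^2, within the 92-192 mW/m^2 range reported
```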
Abstract:
‘De Vries-like’ smectic liquid crystals exhibit a low layer contraction of approximately 1% on the transition from the SmA to the SmC phase. These materials have received considerable attention as potential solutions to problems affecting liquid crystal displays using surface-stabilized ferroelectric liquid crystals (SSFLC). In SSFLCs, a layer contraction of 7-10% is normally observed during the SmA to SmC phase transition. A study by the Lemieux group has shown that liquid crystals with nanosegregating carbosilane segments exhibit enhanced ‘de Vries-like’ properties through the formation of smectic layers, and that these properties are enhanced by lengthening the nanosegregating carbosilane end-groups from monocarbosilane to tricarbosilane. This observed enhancement is assumed to be due to an increase in the cross-section of the free volume in the hydrocarbon sub-layer. To test this hypothesis, it is assumed that dimers with a tricarbosilane linking group have smaller cross-sections on time average. In this thesis, the hypothesis is tested through the characterization of new liquid crystalline monomers (QL39-n) and dimers (QL40-n) with 2-phenylpyrimidine cores and tricarbosilane end-groups and spacers, respectively. The thesis describes the synthesis of two homologous series of liquid crystals and their characterization using a variety of techniques, including polarized optical microscopy, differential scanning calorimetry and X-ray diffraction. The results show that the monomers QL39-n form only a tilted SmC phase, whereas the dimers QL40-n form an orthogonal SmA phase. These results are discussed in the context of our hypothesis.
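The layer contraction figures above come from comparing X-ray layer spacings in the two phases, contraction = 100 (d_A - d_C)/d_A; a minimal sketch with illustrative d-spacings (not measured QL39-n/QL40-n values):

```python
# Sketch: percent layer contraction at the SmA -> SmC transition
# from X-ray-derived layer spacings (in angstroms).
def layer_contraction(d_smA: float, d_smC: float) -> float:
    return 100.0 * (d_smA - d_smC) / d_smA

print(round(layer_contraction(35.0, 34.7), 1))  # ~0.9%: 'de Vries-like'
print(round(layer_contraction(35.0, 32.2), 1))  # ~8.0%: conventional SmC
```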
Abstract:
Stroke is a leading cause of death and permanent disability worldwide, affecting millions of individuals. Traditional clinical scores for assessment of stroke-related impairments are inherently subjective and limited by inter-rater and intra-rater reliability, as well as floor and ceiling effects. In contrast, robotic technologies provide objective, highly repeatable tools for quantification of neurological impairments following stroke. KINARM is an exoskeleton robotic device that provides objective, reliable tools for assessment of sensorimotor, proprioceptive and cognitive brain function by means of a battery of behavioral tasks. As such, KINARM is particularly useful for assessment of neurological impairments following stroke. This thesis introduces a computational framework for assessment of neurological impairments using the data provided by KINARM. This is done by achieving two main objectives. The first is to investigate how robotic measurements can be used to estimate current and future abilities to perform daily activities for subjects with stroke. We are able to predict clinical scores related to activities of daily living, at present and future time points, using a set of robotic biomarkers. The findings of this analysis provide a proof of principle that robotic evaluation can be an effective tool for clinical decision support and target-based rehabilitation therapy. The second main objective of this thesis is to address the emerging problem of long assessment times, which can potentially lead to fatigue when assessing subjects with stroke. To address this issue, we examine two time-reduction strategies. The first strategy focuses on task selection, whereby KINARM tasks are arranged in a hierarchical structure so that an earlier task in the assessment procedure can be used to decide whether or not subsequent tasks should be performed. The second strategy focuses on time reduction within the two longest individual KINARM tasks. Both reduction strategies are shown to provide significant time savings, ranging from 30% to 90% using task selection and 50% using individual task reductions, thereby establishing a framework for reduction of assessment time on a broader set of KINARM tasks. All in all, the findings of this thesis establish an improved platform for diagnosis and prognosis of stroke using robot-based biomarkers.
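The task-selection strategy can be sketched as an early-exit loop over tasks ordered by diagnostic yield: a decisive result on an earlier task ends the session. The interface, probabilities and threshold below are hypothetical illustrations, not the thesis's actual decision rule.

```python
# Sketch: hierarchical task selection with early exit.
from typing import Callable, Dict, List, Tuple

def assess(subject,
           tasks: List[Tuple[str, Callable]],
           threshold: float = 0.95) -> Dict[str, float]:
    """tasks: (name, run) pairs ordered by diagnostic yield; each run
    returns P(impaired) for this subject on that task. Stops once a
    result is decisive in either direction, skipping later tasks."""
    results = {}
    for name, run in tasks:
        p = run(subject)
        results[name] = p
        if p >= threshold or p <= 1.0 - threshold:
            break   # decisive: subsequent tasks need not be performed
    return results
```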
Abstract:
This paper seeks to review the critical role of land in delivering sustainable development, focusing on the supply of affordable homes. It first presents a historical overview of debates on land reform, including nationalisation of development land and betterment, before reviewing the impact of land costs on housing delivery, using London as a case study. It then considers alternative policy approaches to ensuring the most effective use of land resources and development capacity, and sets out a programme embracing planning reform, public land acquisition, disposal and taxation.