997 results for Time Distortion


Relevance: 20.00%

Abstract:

This work, titled "The Implementation of ABC (Activity-Based Costing) in a Company Providing Health Services", was completed as a requirement for a degree in Accounting and Administration. Its main objective is the implementation of the ABC method in a small or medium-sized health-services enterprise as a management support tool. To introduce management accounting in a company, one must choose a costing method/system that reflects the reality of the company, and ABC is well suited to determining results without distortion. ABC assigns costs through cause-and-effect relationships: activities generate costs, and cost objects consume activities. It is applicable to service providers as well as industrial companies, although it was initially designed for industrial companies, that is, for large companies, owing to the substantial financial and human resources, and the time, required for its implementation. However, the matrix model presented by Roztocki et al. (1999) allows the method to be applied in SMEs with limited financial resources and time, using an Excel spreadsheet. This is the model proposed here, and it could be implemented in the clinic. The model was tested in a case study conducted at a private clinic. The tests revealed some difficulties and limitations; the greatest were the identification of activities and cost drivers, owing to the complexity of the sector. The implementation was completed successfully, providing detailed cost information for the products and services delivered throughout the clinic.
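The matrix approach reduces ABC to two allocation tables that fit naturally in a spreadsheet: one distributing resource costs to activities, one distributing activity costs to cost objects. As a rough illustration, the sketch below reproduces that two-stage computation; the resources, activities, services and percentages are all invented and do not come from the clinic case study.

```python
import numpy as np

# Hypothetical resource costs (e.g. salaries, rent, supplies).
resource_costs = np.array([10000.0, 6000.0, 4000.0])

# Expense-activity matrix: rows = resources, columns = activities
# (e.g. consultations, exams, administration); each row sums to 1.
EAD = np.array([
    [0.6, 0.3, 0.1],
    [0.5, 0.2, 0.3],
    [0.2, 0.7, 0.1],
])

# First stage: resource costs flow to activities.
activity_costs = resource_costs @ EAD

# Activity-product matrix: rows = activities, columns = services
# (e.g. consultations vs. lab exams); each row sums to 1.
APD = np.array([
    [0.7, 0.3],
    [0.2, 0.8],
    [0.5, 0.5],
])

# Second stage: activity costs flow to the services (cost objects).
service_costs = activity_costs @ APD
print(service_costs)        # cost assigned to each service
print(service_costs.sum())  # equals the total resource cost
```

Because each allocation row sums to one, total cost is conserved through both stages, which is the built-in consistency check of the spreadsheet model.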

Relevance: 20.00%

Abstract:

Cancer is a major health issue that absorbs the attention of a large part of biomedical research. Intercalating agents bind to DNA molecules and can inhibit their synthesis and transcription; thus, they are increasingly used as drugs to fight cancer. In this work, we show how atomic force microscopy in liquid can characterize, through time-lapse imaging, the dynamical influence of intercalating agents on the supercoiling of DNA, improving our understanding of these drugs' effects.

Relevance: 20.00%

Abstract:

The biology of nymphs and adults of the neotropical pentatomid Dichelops melacanthus (Dallas), feeding on natural foods, immature pods of soybean, Glycine max (L.) Merrill, and immature seeds of corn, Zea mays L., and on an artificial dry diet, was studied in the laboratory. Nymph development time was shorter on the natural foods (ca. 21-22 days) than on the artificial diet (28 days), and more nymphs reached adulthood on the food plants (55% on soybean and 73% on corn) than on the artificial diet (40%). Fresh body weight at adult emergence was similar on the two natural foods, and higher than that of females from nymphs raised on the artificial diet; for males, weights were similar on all foods. Mean (female and male) survivorship up to day 20 decreased from 55% on soybean to 40% on corn, down to 0% on the artificial diet. Total longevity for females was higher on soybean, while for males it was similar on all foods. About three times more females oviposited on soybean than on corn, but fecundity per female was similar on both foods. On the artificial diet, only one out of 30 females oviposited. Fresh body weight of adults increased significantly during the first week of adult life, and at the end of the 3rd week, weight gain was similar on all foods.

Relevance: 20.00%

Abstract:

The paper develops a method to solve higher-dimensional stochastic control problems in continuous time. A finite-difference-type approximation scheme is used on a coarse grid of low-discrepancy points, while the value function at intermediate points is obtained by regression. The stability properties of the method are discussed, and applications are given to test problems of up to 10 dimensions. Accurate solutions to these problems can be obtained on a personal computer.
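The regression step can be illustrated in a few lines: values computed on a coarse set of scattered points are extended to intermediate points by least-squares fitting over a polynomial basis. The sketch below shows only that interpolation idea, not the paper's full scheme; uniform random points stand in for a low-discrepancy grid, and the "value function" is itself a polynomial so the quadratic fit recovers it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))       # stand-in for low-discrepancy points
v = X[:, 0] ** 2 - 0.5 * X[:, 1] + 1.0     # "value function" sampled on the grid

def features(P):
    # quadratic polynomial basis in two variables
    x, y = P[:, 0], P[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

# least-squares regression of the sampled values on the basis
coef, *_ = np.linalg.lstsq(features(X), v, rcond=None)

# value at an intermediate point that is not on the coarse grid
estimate = (features(np.array([[0.5, 0.5]])) @ coef)[0]   # true value: 1.0
```

In the actual method the samples come from a finite-difference step at each grid point, and the regression supplies the continuation values the scheme needs between grid points.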

Relevance: 20.00%

Abstract:

When dealing with the design of service networks, such as health and EMS services, banking or distributed ticket-selling services, the location of service centers has a strong influence on the congestion at each of them and, consequently, on the quality of service. In this paper, several models are presented to consider service congestion. The first model addresses the location of the least number of single-server centers such that all the population is served within a standard distance, and nobody stands in line for a time longer than a given time limit, or with more than a predetermined number of other clients. We then formulate several maximal-coverage models, with one or more servers per service center. A new heuristic is developed to solve the models and tested on a 30-node network.
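A greedy covering heuristic for the first model might look like the toy sketch below. It is not the heuristic of the paper: the 30-node coordinates, coverage radius `D` and per-center client cap `K` are invented, with the cap standing in for the queue-length constraint of a single-server center.

```python
import numpy as np

rng = np.random.default_rng(1)
nodes = rng.uniform(0, 10, size=(30, 2))   # 30-node network (made-up coordinates)
D, K = 4.0, 8                              # standard distance, clients per center

dist = np.linalg.norm(nodes[:, None] - nodes[None, :], axis=2)
uncovered = set(range(len(nodes)))
centers = []

while uncovered:
    # open the site whose D-neighbourhood covers the most uncovered nodes
    best = max(range(len(nodes)),
               key=lambda j: sum(1 for i in uncovered if dist[i, j] <= D))
    # assign at most K clients to the new center (congestion cap)
    served = [i for i in uncovered if dist[best, i] <= D][:K]
    centers.append(best)
    uncovered -= set(served)

print(len(centers), "centers opened")
```

Every node covers itself (distance zero), so each iteration serves at least one node and the loop terminates with full coverage.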

Relevance: 20.00%

Abstract:

This paper tests the internal consistency of time trade-off utilities. We find significant violations of consistency in the direction predicted by loss aversion. The violations disappear for longer gauge durations. We show that loss aversion can also explain why, for short gauge durations, time trade-off utilities exceed standard-gamble utilities. Our results suggest that time trade-off measurements that use relatively short gauge durations, like the widely used EuroQol algorithm (Dolan 1997), are affected by loss aversion and lead to utilities that are too high.
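Under the linear QALY model, a respondent who is indifferent between t years in a health state and x years in full health reveals the utility U = x / t. The hypothetical numbers below show the inconsistency pattern the paper reports: the same state elicited with a short and a long gauge duration yields different utilities, with the short-gauge value the higher one.

```python
# Time trade-off (TTO): indifference between t years in a health state
# and x years in full health gives U = x / t under the linear QALY model.
# All numbers below are invented for illustration.

def tto_utility(x, t):
    return x / t

# The same health state elicited with two gauge durations; if the
# measurements were internally consistent, the two utilities would agree.
u_short = tto_utility(x=8.0, t=10.0)    # short gauge: U = 0.80
u_long = tto_utility(x=22.0, t=30.0)    # long gauge:  U ~ 0.73

# The paper's pattern: short-gauge TTO utilities come out higher,
# as predicted when loss aversion inflates the traded-off years.
assert u_short > u_long
```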

Relevance: 20.00%

Abstract:

We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean-squared distortion of a vector quantizer designed from $n$ i.i.d. data points using any design algorithm is at least $\Omega(n^{-1/2})$ away from the optimal distortion for some distribution on a bounded subset of ${\cal R}^d$. Together with existing upper bounds, this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically on the order of $n^{-1/2}$. We also derive a new upper bound for the performance of the empirically optimal quantizer.
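An empirically designed quantizer in this sense can be produced with Lloyd's algorithm (k-means) on the training sample; its distortion on fresh data from the same source, minus the optimal distortion, is the redundancy the bounds concern. The sketch below uses an illustrative uniform source and made-up sizes, not anything from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def lloyd(data, k, iters=50):
    """Design a k-point codebook from training data (k-means)."""
    codebook = data[rng.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)              # nearest-codepoint assignment
        for j in range(k):
            pts = data[labels == j]
            if len(pts):
                codebook[j] = pts.mean(axis=0)  # centroid update
    return codebook

def distortion(data, codebook):
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.min(axis=1).mean()                # mean squared distortion

train = rng.uniform(0, 1, size=(500, 2))        # n i.i.d. training points
fresh = rng.uniform(0, 1, size=(2000, 2))       # fresh data from the source
codebook = lloyd(train, k=8)
print(distortion(fresh, codebook))              # empirical quantizer's distortion
```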

Relevance: 20.00%

Abstract:

Time-lapse geophysical monitoring and inversion are valuable tools in hydrogeology for monitoring changes in the subsurface due to natural and forced (tracer) dynamics. However, the resulting models may suffer from insufficient resolution, which leads to underestimated variability and poor mass recovery. Structural joint inversion using cross-gradient constraints can provide higher-resolution models compared with individual inversions, and we present the first application to time-lapse data. The results from a synthetic and a field vadose-zone water tracer injection experiment show that joint 3-D time-lapse inversion of crosshole electrical resistance tomography (ERT) and ground-penetrating radar (GPR) traveltime data significantly improves the imaged characteristics of the point-injected plume, such as lateral spreading and center of mass, as well as the overall consistency between models. The joint inversion method appears to work well for cases when one hydrological state variable (in this case moisture content) controls the time-lapse response of both geophysical methods. Citation: Doetsch, J., N. Linde, and A. Binley (2010), Structural joint inversion of time-lapse crosshole ERT and GPR traveltime data, Geophys. Res. Lett., 37, L24404, doi: 10.1029/2010GL045482.
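The cross-gradient function penalizes the cross product of the two model gradients, which vanishes wherever the models share structure (parallel gradients). A minimal numerical check of that property, on synthetic 2-D grids that are structurally identical by construction (these grids are made up, not the paper's models):

```python
import numpy as np

# Synthetic 32x32 models: a smooth anomaly and an affine rescaling of it,
# so the two models are structurally identical by construction.
y, x = np.mgrid[0:32, 0:32].astype(float)
m1 = np.exp(-((x - 16.0) ** 2 + (y - 16.0) ** 2) / 50.0)
m2 = 3.0 * m1 + 1.0

g1y, g1x = np.gradient(m1)        # np.gradient returns (d/dy, d/dx)
g2y, g2x = np.gradient(m2)
t = g1x * g2y - g1y * g2x         # z-component of grad(m1) x grad(m2)

# Parallel gradients everywhere -> the cross-gradient is numerically zero;
# the joint inversion drives this quantity toward zero between ERT and GPR models.
print(np.abs(t).max())
```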

Relevance: 20.00%

Abstract:

This paper presents a test of the predictive validity of various classes of QALY models (i.e., linear, power and exponential models). We first estimated TTO utilities for 43 EQ-5D chronic health states, and next these states were embedded in health profiles. The chronic TTO utilities were then used to predict the responses to TTO questions with health profiles. We find that the power QALY model clearly outperforms the linear and exponential QALY models. The optimal power coefficient is 0.65. Our results suggest that TTO-based QALY calculations may be biased. This bias can be avoided by using a power QALY model.
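Under a power QALY model, duration is weighted by W(t) = t^r before being multiplied by utility, so with r = 0.65 early life-years weigh more than later ones. A small sketch with an invented two-episode health profile, comparing the linear and power valuations (the utilities and durations are made up; only the coefficient 0.65 comes from the abstract):

```python
R = 0.65   # power coefficient estimated in the paper

def qaly_linear(profile):
    # profile: list of (utility, years) episodes
    return sum(u * t for u, t in profile)

def qaly_power(profile):
    # duration weighting W(t) = t**R applied to cumulative time
    total, elapsed = 0.0, 0.0
    for u, t in profile:
        total += u * ((elapsed + t) ** R - elapsed ** R)
        elapsed += t
    return total

profile = [(0.8, 5.0), (0.6, 5.0)]   # 5 years at 0.8, then 5 years at 0.6
print(qaly_linear(profile), qaly_power(profile))
```

The power valuation is lower than the linear one for the same profile, which is the direction of the bias the authors attribute to linear TTO-based QALY calculations.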

Relevance: 20.00%

Abstract:

We analyze the impact of a minimum price variation (tick) and time priority on the dynamics of quotes and the trading costs when competition for the order flow is dynamic. We find that convergence to competitive outcomes can take time and that the speed of convergence is influenced by the tick size, the priority rule and the characteristics of the order arrival process. We show also that a zero minimum price variation is never optimal when competition for the order flow is dynamic. We compare the trading outcomes with and without time priority. Time priority is shown to guarantee that uncompetitive spreads cannot be sustained over time. However, it can sometimes result in higher trading costs. Empirical implications are proposed. In particular, we relate the size of the trading costs to the frequency of new offers and the dynamics of the inside spread to the state of the book.
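The role of the tick as the minimum quote improvement can be caricatured in a few lines: dealers undercut the best ask one tick at a time, so a coarser tick grid means fewer undercutting rounds before the spread can shrink no further. This toy loop is only a caricature of the paper's dynamic competition model; prices are measured in integer ticks, and the parameters are invented.

```python
def converge(ask, bid):
    """Alternate one-tick undercutting until the spread equals one tick."""
    rounds = 0
    while ask - bid > 1:          # undercutting is still profitable
        ask -= 1                  # a rival improves the best ask by one tick
        rounds += 1
    return ask - bid, rounds      # final spread (in ticks) and rounds needed

# An initial spread of 10 ticks takes 9 rounds to reach the one-tick floor;
# halving the number of ticks in the initial spread halves the rounds,
# illustrating how a larger tick speeds convergence but widens the floor.
spread, rounds = converge(ask=10, bid=0)
print(spread, rounds)
```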

Relevance: 20.00%

Abstract:

The development time of the immature forms of Sabethes aurescens Lutz, 1905, from perforated bamboo in the southern Brazilian rain forest was studied under laboratory conditions. Mean development periods were 5±2.23, 10±5.20, 14±8.26, 36±13.90 and 9±2.43 days, respectively, for the four larval instars and the pupae. The 4th instar lasted longer in females than in males. Implications of the long development time of the immature forms of Sa. aurescens are discussed.

Relevance: 20.00%

Abstract:

In this paper, generalizing results in Alòs, León and Vives (2007b), we show that jumps in the volatility under a jump-diffusion stochastic volatility model have no effect on the short-time behaviour of the at-the-money implied volatility skew, although the corresponding Hull and White formula depends on the jumps. Towards this end, we use Malliavin calculus techniques for Lévy processes based on Løkka (2004), Petrou (2006), and Solé, Utzet and Vives (2007).

Relevance: 20.00%

Abstract:

We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments of the theory of the prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
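The flavour of such a randomized forecaster can be shown with the simplest possible expert class, the two constant predictors, mixed by exponential weighting. The paper's procedure uses a richer expert class and a different aggregation, so the following is only an illustration with made-up parameters.

```python
import math
import random

random.seed(0)
ETA = 0.5                         # learning rate (arbitrary choice)
weights = [1.0, 1.0]              # expert 0 always predicts 0; expert 1, 1

def predict():
    # draw the forecast at random from the exponentially weighted mixture
    p1 = weights[1] / (weights[0] + weights[1])
    return 1 if random.random() < p1 else 0

mistakes = 0
sequence = [1] * 200              # a (degenerate) stationary binary source
for bit in sequence:
    mistakes += int(predict() != bit)
    for e in (0, 1):              # penalize the expert that was wrong
        if e != bit:
            weights[e] *= math.exp(-ETA)

# the empirical mistake rate approaches the Bayes rate (0 for this source)
print(mistakes / len(sequence))
```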