960 results for Accelerated failure time model
Abstract:
In this study, discrete-time one-factor models of the term structure of interest rates and their application to the pricing of interest rate contingent claims are examined theoretically and empirically. The first chapter provides a discussion of the issues involved in the pricing of interest rate contingent claims and a description of the Ho and Lee (1986), Maloney and Byrne (1989), and Black, Derman, and Toy (1990) discrete-time models. In the second chapter, a general discrete-time model of the term structure from which the Ho and Lee, Maloney and Byrne, and Black, Derman, and Toy models can all be obtained is presented. The general model also provides for the specification of an additional model, the ExtendedMB model. The third chapter illustrates the application of the discrete-time models to the pricing of a variety of interest rate contingent claims. In the final chapter, the performance of the Ho and Lee, Black, Derman, and Toy, and ExtendedMB models in the pricing of Eurodollar futures options is investigated empirically. The results indicate that the Black, Derman, and Toy and ExtendedMB models outperform the Ho and Lee model. Little difference in the performance of the Black, Derman, and Toy and ExtendedMB models is detected.
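The abstract does not give the lattice mechanics, so as a hedged illustration only, the sketch below prices an interest rate contingent claim on a Ho-and-Lee-style additive binomial short-rate lattice; the parameters and payoff are hypothetical placeholders, not the thesis's calibration.

```python
# Minimal sketch (not the thesis's code): a Ho-and-Lee-style additive binomial
# short-rate lattice and backward-induction pricing of a contingent claim.

def build_short_rate_lattice(r0, drift, vol_step, n_steps):
    """Node (t, j): t time steps elapsed, j up-moves; rates move additively."""
    return [[r0 + drift * t + vol_step * (2 * j - t) for j in range(t + 1)]
            for t in range(n_steps + 1)]

def price_claim(lattice, payoff, dt=1.0, q=0.5):
    """Backward induction under risk-neutral probability q, discounting at each node."""
    values = [payoff(r) for r in lattice[-1]]
    for t in range(len(lattice) - 2, -1, -1):
        values = [(q * values[j + 1] + (1 - q) * values[j]) /
                  (1 + lattice[t][j] * dt) for j in range(t + 1)]
    return values[0]

# Illustrative claim: pays max(0.06 - r, 0) at the final step (placeholder values).
lattice = build_short_rate_lattice(r0=0.05, drift=0.001, vol_step=0.01, n_steps=4)
print(price_claim(lattice, payoff=lambda r: max(0.06 - r, 0.0)))
```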
Abstract:
Survival models deal with the modelling of time-to-event data. In certain situations, a share of the population will never experience the event of interest. In this context, cure fraction models emerged. Among the models that incorporate a cured fraction, one of the best known is the promotion time model. In the present study we discuss hypothesis testing in the promotion time model with a Weibull distribution for the failure times of susceptible individuals. Hypothesis testing in this model may be performed based on likelihood ratio, gradient, score or Wald statistics. The critical values are obtained from asymptotic approximations, which may result in size distortions in finite samples. This study proposes bootstrap corrections to the aforementioned tests and a bootstrap Bartlett correction to the likelihood ratio statistic in the Weibull promotion time model. Using Monte Carlo simulations, we compare the finite-sample performance of the proposed corrections with that of the usual tests. The numerical evidence favors the proposed corrected tests. An empirical application is presented at the end of the work.
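For reference, a sketch of the model under discussion in its standard form (notation assumed here, not taken from the study):

```latex
% Promotion time cure model with Weibull failure times (standard form):
\[
  S_{\mathrm{pop}}(t) = \exp\{-\theta F(t)\},
  \qquad F(t) = 1 - \exp\{-(t/\lambda)^{\alpha}\}, \qquad t > 0,
\]
% so the cured fraction is \exp(-\theta). The likelihood ratio statistic, for example,
\[
  LR = 2\,\bigl\{\ell(\hat{\vartheta}) - \ell(\tilde{\vartheta})\bigr\}
\]
% is referred either to its asymptotic chi-squared distribution or, as proposed in the
% study, to critical values estimated by parametric bootstrap resampling under the null.
```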
Abstract:
Costs related to the quality of fiscal practice in Portuguese firms: a comparative case study. Every organization should be concerned with analyzing its quality costs, since that analysis, besides identifying aspects to improve, is a fundamental tool for the organization's own management. This analysis of quality costs should also cover firms' activities related to their fiscal practice. However, the literature contains no reference to the relationship between these two themes: quality costs and business taxation. This research therefore analyzes the relationship between the principles of quality costs and business taxation in Portugal. The case study methodology, more specifically the comparative case study, was chosen because it was understood, and shown, to be the methodology best suited to the complexity of the subject under analysis. Besides relating quality costs to business taxation, the study presents and applies a methodology for implementing the Prevention – Appraisal – Failure (PAF) model in companies' fiscal practice, with the aim of reducing the quality costs of that practice and reaching the economic level of quality, together with an efficiency index that makes it possible, at any moment, to determine the level of efficiency achieved and how to improve it. Overall, the study shows that Portuguese companies in general do not apply the principles of quality costs to their tax department or fiscal practice, whether that activity is performed internally or outsourced.
Abstract:
Doctor of Philosophy in Mathematics
Abstract:
Intracochlear trauma from surgical insertion of bulky electrode arrays and inadequate pitch perception are areas of concern with current hand-assembled commercial cochlear implants. Parylene thin-film arrays with higher electrode densities and lower profiles are a potential solution, but lack rigidity and hence depend on manually fabricated, permanently attached polyethylene terephthalate (PET) tubing-based bulky backing devices. As a solution, we investigated a new backing device with two sub-systems. The first sub-system is a thin poly(lactic acid) (PLA) stiffener that will be embedded in the parylene array. The second sub-system is an attaching and detaching mechanism, utilizing a poly(N-vinylpyrrolidone)-block-poly(d,l-lactide) (PVP-b-PDLLA) copolymer-based biodegradable and water-soluble adhesive, that will help to retract the PET insertion tool after implantation. As a proof-of-concept of sub-system one, a microfabrication process for patterning PLA stiffeners embedded in parylene has been developed. Conventional hot embossing, mechanical micromachining, and standard cleanroom processes were integrated for patterning fully released and discrete stiffeners coated with parylene. The released embedded stiffeners were thermoformed to demonstrate that imparting perimodiolar shapes to stiffener-embedded arrays will be possible. The developed process, when integrated with the array fabrication process, will allow fabrication of stiffener-embedded arrays in a single process. As a proof-of-concept of sub-system two, the feasibility of the attaching and detaching mechanism was demonstrated by adhering 1x and 1.5x scale PET tube-based insertion tools to PLA stiffeners embedded in parylene using the copolymer adhesive. The attached devices survived qualitative adhesion tests, thermoforming, and flexing. The viability of the detaching mechanism was tested by aging the assemblies in vitro in phosphate buffer solution. The average detachment times, 2.6 minutes and 10 minutes for the 1x and 1.5x scale devices respectively, were found to be clinically relevant with respect to reported array insertion times during surgical implantation. Eventually, the stiffener-embedded arrays would not need to be permanently attached to current insertion tools, which are left behind after implantation and congest the cochlear scala tympani chamber. Finally, a simulation-based approach for accelerated failure analysis of PLA stiffeners and characterization of the PVP-b-PDLLA copolymer adhesive has been explored. The residual functional life of embedded PLA stiffeners exposed to body fluid, and thereby subjected to degradation and erosion, has been estimated by simulating PLA stiffeners with different parylene coating failure types and different PLA types for a given parylene coating failure type. To characterize the PVP-b-PDLLA copolymer adhesive, several formulations were simulated and compared based on the insertion tool detachment times predicted from the dissolution, degradation, and erosion behavior of the simulated adhesive formulations. Results indicate that the simulation-based approaches could be used to reduce the total number of time-consuming and expensive in vitro tests that must be conducted.
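The abstract does not state the degradation model used in the simulations; purely as a hedged illustration, one common way to estimate the residual functional life of a hydrolytically degrading PLA part is first-order decay of molecular weight down to a critical threshold. All values below are hypothetical placeholders.

```python
import math

# Illustrative sketch only (not the study's model): assume the exposed PLA stiffener
# degrades with first-order decay of number-average molecular weight,
#   Mn(t) = Mn0 * exp(-k * t),
# and loses mechanical function once Mn falls below a critical value.
# Mn0, Mn_critical and k_per_day are hypothetical placeholders.

def residual_life_days(Mn0=120e3, Mn_critical=40e3, k_per_day=0.01):
    """Time for Mn to fall from Mn0 to Mn_critical under first-order decay."""
    return math.log(Mn0 / Mn_critical) / k_per_day

print(f"Estimated residual functional life: {residual_life_days():.0f} days")
```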
Abstract:
The main purpose of this work was to study the germination of Ternstroemia brasiliensis seeds under both laboratory and field conditions in order to contribute to understanding the regeneration ecology of the species. The seeds are dispersed with relatively high moisture content and exhibit recalcitrant storage behaviour because of their sensitivity to dehydration and to dry storage. Germinability is relatively high and is not affected by light or by the presence of the aril. The absence of dormancy and the low sensitivity to far-red light can enable the seeds to germinate promptly under the Restinga forest canopy, without forming a soil seed bank. Constant temperatures of 25 ºC and 30 ºC were considered optimal for germination of T. brasiliensis seeds. Temperature germination parameters can be affected by light conditions. The thermal-time model can be a suitable tool for investigating the temperature dependence of seed germination in T. brasiliensis. The germination characteristics of T. brasiliensis are typical of non-pioneer species and help to explain the distribution of the species. Germination of T. brasiliensis seeds in the Restinga environment may not be limited by light and temperature; however, soil moisture content can affect seed germination.
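The abstract only names the thermal-time model; its usual sub-optimal-temperature form is stated here as background, with standard symbols rather than the authors' fitted parameters:

```latex
% Sub-optimal-temperature thermal-time model (standard form, background only):
\[
  \theta_T(g) = (T - T_b)\, t_g ,
\]
% where T is the incubation temperature, T_b the base temperature below which germination
% does not progress, and t_g the time to germination of fraction g of the seeds; the
% germination rate therefore increases linearly with temperature between T_b and the
% optimum (about 25-30 degrees Celsius for T. brasiliensis, according to the abstract).
```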
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
We study in detail the so-called beta-modified Weibull distribution, motivated by the wide use of the Weibull distribution in practice, and also by the fact that the generalization provides a continuous crossover towards cases with different shapes. The new distribution is important since it contains as special sub-models some widely known distributions, such as the generalized modified Weibull, beta Weibull, exponentiated Weibull, beta exponential, modified Weibull and Weibull distributions, among several others. It also provides more flexibility to analyse complex real data. Various mathematical properties of this distribution are derived, including its moments and moment generating function. We examine the asymptotic distributions of the extreme values. Explicit expressions are also derived for the chf, mean deviations, Bonferroni and Lorenz curves, reliability and entropies. The estimation of parameters is approached by two methods: moments and maximum likelihood. We compare by simulation the performances of the estimates from these methods. We obtain the expected information matrix. Two applications are presented to illustrate the proposed distribution.
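As background, the standard beta-G construction from which the beta-modified Weibull arises is sketched below; the symbols are assumed for illustration rather than copied from the paper.

```latex
% Beta-G construction of the beta-modified Weibull (standard form; symbols assumed):
% starting from the modified Weibull cdf
\[
  G(x) = 1 - \exp\{-\alpha x^{\gamma} e^{\lambda x}\}, \qquad x > 0,
\]
% the beta-modified Weibull cdf and pdf are obtained from the beta generator
\[
  F(x) = I_{G(x)}(a, b)
       = \frac{1}{B(a,b)} \int_0^{G(x)} \omega^{a-1} (1-\omega)^{b-1}\, d\omega ,
  \qquad
  f(x) = \frac{g(x)}{B(a,b)}\, G(x)^{a-1} \{1 - G(x)\}^{b-1} .
\]
% Setting a = b = 1 recovers the modified Weibull, and \lambda = 0 gives the beta Weibull,
% consistent with the list of sub-models mentioned above.
```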
Abstract:
The effect of hydration (priming) treatment on dormancy release in annual ryegrass seeds from two populations was investigated. Hydration duration, number, and timing with respect to after-ripening were compared in an experiment involving 15 treatment regimens for 12 wk. Seeds were hydrated at 100% relative humidity for 0, 2, or 10 d at Weeks 1, 6, or 12 of after-ripening. Dormancy status was assessed after each hydration treatment by measuring seed germination at 12-hourly alternating 25/15 °C (light/dark) periods, using seeds taken directly from the hydration treatment and seeds subjected to 4 d of post-priming desiccation. Seeds exposed to one or more hydration events during the 12 wk were less dormant than seeds that remained dry throughout after-ripening. The longer hydration of 10 d promoted greater dormancy loss than either a 2-d hydration or no hydration. For the seed lot that was most dormant at the start of the experiment, two or three rather than one hydration event, or a hydration event earlier rather than later during after-ripening, promoted greater dormancy release. These effects were not significant for the less-dormant seed lot. For both seed lots, the effect of a single hydration for 2 d at Week 1 or 6 of after-ripening was not manifested until the test at Week 12 of the experiment, suggesting that the hydration events alter the rate of dormancy release during subsequent after-ripening. A hydrothermal priming time model, usually used for modeling the effect of priming on the germination rate of nondormant seeds, was successfully applied to the dormancy release resulting from the hydration treatments.
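The hydrothermal priming time model mentioned above is conventionally written as follows; this is the standard background form with generic symbols, not the authors' fitted parameters:

```latex
% Conventional hydrothermal priming time model (standard symbols, background only):
\[
  \theta_{HTP} = (\psi - \psi_{\min})\,(T - T_{\min})\, t_p ,
\]
% where \psi and T are the water potential and temperature during priming, \psi_{\min} and
% T_{\min} are the thresholds below which no priming benefit accumulates, and t_p is the
% priming duration; accumulated \theta_{HTP} is then used to predict the advancement
% produced by the hydration treatments (here, the extent of dormancy release).
```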
Abstract:
The present study evaluated the benefits of phonological processing skills training for children with persistent reading difficulties. Children aged between 9 and 14 years, identified as having a specific reading disability, participated in the study. In a series of three experiments, pedagogical issues related to the length of training time, the model of intervention, and the severity of readers' phonological processing skill deficits prior to intervention were explored. The results indicated that improvement in poor readers' phonological processing skills led to a dramatic improvement in their reading accuracy and reading comprehension performance. Increasing the length of training time significantly improved transfer effects to the reading process. Children with particularly severe phonological processing skill deficits benefited from an extended training period, and both individual and group intervention models for phonological processing training proved successful. Implications for speech and language therapists are discussed.
Abstract:
Different routes for the administration of bone marrow-derived cells (BMDC) have been proposed to treat the progression of chronic renal failure (CRF). We investigated whether (1) the use of bovine pericardium (BP) as a scaffold for cell therapy would retard the progression of CRF and (2) the efficacy of cell therapy differs across distinct degrees of CRF. We used 2/3 and 5/6 models of renal mass reduction to simulate different stages of chronicity. Treatments consisted of BP seeded with either mesenchymal or mononuclear cells implanted in the parenchyma of the remnant kidney. Renal function and proteinuria were measured at days 45 and 90 after cell implantation. BMDC treatment reduced glomerulosclerosis, interstitial fibrosis and lymphocytic infiltration. Immunohistochemistry showed decreased macrophage accumulation, proliferative activity and expression of fibronectin and alpha-smooth muscle actin. Our results demonstrate that: (1) the biomaterial combined with BMDC did retard the progression of experimental CRF; (2) cellular therapy stabilized serum creatinine (sCr) and improved creatinine clearance and the 1/sCr slope when administered during the less severe stages of CRF; (3) treatment with the combined therapy decreased glomerulosclerosis, fibrosis and the expression of fibrogenic molecules; and (4) biomaterials seeded with BMDC can be an alternative route of cellular therapy.
Abstract:
Objectives: This study compared the reliability and fracture patterns of zirconia cores veneered with pressable porcelain submitted to either axial or off-axis sliding contact fatigue. Methods: Forty-two Y-TZP plates (12 mm x 12 mm x 0.5 mm) veneered with pressable porcelain (12 mm x 12 mm x 1.2 mm) and adhesively luted to water-aged composite resin blocks (12 mm x 12 mm x 4 mm) were stored in water for at least 7 days prior to testing. Profiles for step-stress fatigue (ratio 3:2:1) were determined from single-load-to-fracture tests (n = 3). Fatigue loading was delivered to the specimens either axially (n = 18) or at a 30-degree off-axis angulation (n = 18) to simulate posterior tooth cusp inclination, creating a 0.7 mm slide. Single-load and fatigue tests utilized a 6.25 mm diameter WC indenter. Specimens were inspected by polarized-light microscopy and SEM. Use-level probability Weibull curves were plotted with two-sided 90% confidence bounds (CB), and reliability for missions of 50,000 cycles at 200 N (90% CB) was calculated. Results: The calculated Weibull beta was 3.34 and 2.47 for the axial and off-axis groups, respectively, indicating that fatigue accelerated failure in both loading modes. The reliability data for a mission of 50,000 cycles at a 200 N load with 90% CB indicate no difference between loading groups. Deep penetrating cone cracks reaching the core-veneer interface were observed in both groups. Partial cones due to the sliding component were observed along with the cone cracking in the off-axis group. No Y-TZP core fractures were observed. Conclusions: Reliability was not significantly different between axially and off-axis mouth-motion fatigued porcelain pressed over Y-TZP cores, but incorporation of sliding resulted in more aggressive damage to the veneer.
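As a hedged illustration of the mission reliability figure being reported: with a two-parameter Weibull life model, reliability for a mission of t cycles is R(t) = exp[-(t/eta)^beta]. The abstract gives the beta estimates but not the characteristic life, so eta below is a hypothetical placeholder, and using the step-stress beta directly at use level is a simplification.

```python
import math

# Illustrative only: mission reliability from a two-parameter Weibull life model.
# eta (characteristic life in cycles at the use-level load) is a hypothetical placeholder;
# beta values are those quoted in the abstract for the off-axis and axial groups.

def weibull_reliability(mission_cycles, eta, beta):
    return math.exp(-((mission_cycles / eta) ** beta))

mission = 50_000                       # cycles at 200 N, as in the reported mission
for beta in (2.47, 3.34):              # off-axis and axial estimates, respectively
    print(beta, weibull_reliability(mission, eta=150_000, beta=beta))
```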
Abstract:
This paper presents a predictive optimal matrix converter controller for a flywheel energy storage system used as a Dynamic Voltage Restorer (DVR). The flywheel energy storage device is based on a steel seamless tube mounted as a vertical-axis flywheel to store kinetic energy. The motor/generator is a Permanent Magnet Synchronous Machine driven by the AC-AC Matrix Converter. The matrix converter control method uses a discrete-time model of the converter system to predict the expected values of the input and output currents for all 27 possible vectors generated by the matrix converter. An optimal controller minimizes control errors using a weighted cost functional. The flywheel and its control were tested as a DVR to mitigate voltage sags and swells. Simulation results show that the DVR is able to compensate the critical load voltage without delays, voltage undershoots or overshoots, overcoming the input/output coupling of matrix converters.
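The control strategy described (evaluating all 27 switching vectors against a discrete-time model and a weighted cost functional) follows the usual finite-control-set predictive pattern. The sketch below is schematic only: the prediction model, vector set and weights are placeholders, not the paper's converter equations.

```python
# Schematic finite-control-set predictive control loop (placeholder model): at each
# sampling instant, every admissible switching vector is simulated one step ahead and
# the vector minimizing a weighted quadratic cost on the current errors is selected.

def predict_one_step(state, vector, model):
    """Placeholder discrete-time prediction x[k+1] = f(x[k], vector)."""
    return model(state, vector)

def choose_vector(state, references, vectors, model, weights):
    best_vector, best_cost = None, float("inf")
    for v in vectors:                              # candidate switching vectors
        pred = predict_one_step(state, v, model)
        cost = sum(w * (p - r) ** 2 for w, p, r in zip(weights, pred, references))
        if cost < best_cost:
            best_vector, best_cost = v, cost
    return best_vector

# Toy usage with a 2-state placeholder model and a stand-in vector set (illustrative only).
toy_model = lambda state, v: [state[0] + 0.1 * v[0], state[1] + 0.1 * v[1]]
vectors = [(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)]
print(choose_vector(state=[0.0, 0.0], references=[1.0, -1.0],
                    vectors=vectors, model=toy_model, weights=[1.0, 1.0]))
```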
Abstract:
Nowadays, many real-time operating systems discretize time based on a system time unit. To take this behavior into account, real-time scheduling algorithms must adopt a discrete-time model in which both the timing requirements of tasks and their time allocations have to be integer multiples of the system time unit. That is, tasks cannot be executed for less than one time unit, which implies that they always have to achieve a minimum amount of work before they can be preempted. Assuming such a discrete-time model, the authors of Zhu et al. (Proceedings of the 24th IEEE international real-time systems symposium (RTSS 2003), 2003; J Parallel Distrib Comput 71(10):1411–1425, 2011) proposed an efficient "boundary fair" algorithm (named BF) and proved its optimality for the scheduling of periodic tasks while achieving full system utilization. However, BF cannot handle sporadic tasks due to their inherently irregular and unpredictable job release patterns. In this paper, we propose an optimal boundary-fair scheduling algorithm for sporadic tasks (named BF²), which follows the same principle as BF by making scheduling decisions only at job arrival times and (expected) task deadlines. This new algorithm was implemented in Linux, and we show through experiments conducted on a multicore machine that BF² outperforms the state-of-the-art discrete-time optimal scheduler (PD²), benefiting from much lower scheduling overhead. Furthermore, these experimental results show that BF² is barely dependent on the length of the system time unit, while PD², the only other existing solution for the scheduling of sporadic tasks in discrete-time systems, sees its number of preemptions, migrations and the time spent making scheduling decisions increase linearly as the time resolution of the system is improved.
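As background on the boundary-fair idea only (this sketch is not the BF² algorithm): between consecutive boundaries, work is allocated in whole time units so that each task stays close to its fluid, utilization-proportional schedule. A minimal single-processor illustration of one such per-boundary allocation step, with assumed implicit deadlines, is shown below.

```python
import math

# Hedged illustration of the boundary-fair intuition (single processor, implicit
# deadlines; NOT the BF2 algorithm): each task first receives the integer part of its
# outstanding fluid work up to the next boundary, and the remaining slots in the
# interval go to the tasks with the largest fractional lag.

def allocate_interval(utilizations, done, b_start, b_end):
    length = b_end - b_start
    fluid = [u * b_end for u in utilizations]              # fluid schedule at next boundary
    mandatory = [max(0, math.floor(f - d)) for f, d in zip(fluid, done)]
    spare = length - sum(mandatory)
    order = sorted(range(len(utilizations)),
                   key=lambda i: fluid[i] - done[i] - mandatory[i], reverse=True)
    extra = [0] * len(utilizations)
    for i in order[:max(0, spare)]:
        extra[i] = 1
    return [m + e for m, e in zip(mandatory, extra)]

# Example: three tasks with utilizations summing to 1, boundary interval [0, 5).
print(allocate_interval([0.5, 0.3, 0.2], done=[0, 0, 0], b_start=0, b_end=5))
```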
Abstract:
In recent years, new air pollution measurement devices based on low-cost sensors have been introduced. The simpler operation of these systems makes it possible to obtain data with high temporal and spatial resolution, opening new opportunities for different air pollution monitoring methodologies. Although their analytical capabilities remain far from those of the reference methods, the use of these sensors has been suggested and encouraged by the European Union for the indicative measurements provided for in Directive 2008/50/EC, with a maximum expanded uncertainty of 25%. The work developed within the Project course consisted of selecting, characterizing and using in real measurements an air quality sensor, integrated into a prototype developed for that purpose, with the aim of estimating the measurement uncertainty associated with this device by applying the European Union methodology for demonstrating the equivalence of air quality measurement methods. The literature review showed that carbon monoxide is currently the air quality parameter that can be measured most accurately with sensors, namely with the Alphasense electrochemical sensor, model COB4, widely used in development projects in this environmental monitoring context. The sensor was integrated into a measurement system designed to operate with an autonomous power supply and internal data acquisition, while being as small and low-cost as possible. A system based on the Arduino Uno board was used, with data logging to an SD memory card, batteries and a solar panel, allowing, in addition to recording the sensor voltages, the acquisition of temperature, relative humidity and atmospheric pressure values, at an overall cost of around 300 euros. In a first phase, a set of laboratory tests was carried out to determine several performance characteristics of two identical sensors: response time, the sensor model equation, repeatability, short- and long-term drift, temperature interference and hysteresis. The results showed very linear sensor behaviour, with a response time below one minute and a sensor model equation dependent on temperature variation. The laboratory expanded uncertainty estimate was below 10% for both sensors. After two field CO measurement campaigns in which the values were very low, a fifteen-day campaign was carried out in an underground car park, which yielded sufficiently high concentrations and allowed the sensor results to be compared with the reference method over the whole measurement range (0 to 12 µmol·mol-1). The concentration values obtained by the two sensors showed excellent correlation with the reference method (r² ≥ 0.998), and the field expanded uncertainty estimates were lower than the laboratory ones, meeting the data quality objective of a maximum expanded uncertainty of 25% defined for indicative measurements.
The results observed during this work confirm the good performance that this type of sensor can offer for air pollution measurements of a more indicative character.
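The equivalence-demonstration methodology referred to above compares candidate and reference measurements by regression and combines residual scatter with the regression bias at a limit value. The sketch below is a simplified, hedged reading of that calculation, not the exact guidance formulas, and uses made-up data rather than the study's campaign results.

```python
import numpy as np

# Simplified sketch of an equivalence-style uncertainty estimate (illustrative reading
# of the EU guidance, not its exact formulation): regress sensor values y on reference
# values x, combine the residual variance with the bias of the fit at a limit value,
# and express the expanded uncertainty (coverage factor k = 2) relative to that value.

def relative_expanded_uncertainty(x_ref, y_sensor, limit_value):
    x, y = np.asarray(x_ref, float), np.asarray(y_sensor, float)
    b, a = np.polyfit(x, y, 1)                    # y ≈ a + b·x
    residuals = y - (a + b * x)
    s2 = residuals.var(ddof=2)                    # residual variance about the fit
    bias = a + (b - 1.0) * limit_value            # bias of the sensor at the limit value
    u_combined = np.sqrt(s2 + bias ** 2)
    return 2.0 * u_combined / limit_value         # relative expanded uncertainty, k = 2

# Made-up example data in µmol·mol-1; limit_value is an illustrative placeholder.
x = [0.5, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0]
y = [0.6, 2.1, 3.9, 6.2, 7.9, 10.3, 11.8]
print(f"{relative_expanded_uncertainty(x, y, limit_value=8.6):.1%}")
```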