987 results for Chain Ladder Method


Relevance: 100.00%

Abstract:

Background Minimal residual disease is an important independent prognostic factor in childhood acute lymphoblastic leukemia. The classical detection methods such as multiparameter flow cytometry and real-time quantitative polymerase chain reaction analysis are expensive, time-consuming and complex, and require considerable technical expertise. Design and Methods We analyzed 229 consecutive children with acute lymphoblastic leukemia treated according to the GBTLI-99 protocol at three different Brazilian centers. Minimal residual disease was analyzed in bone marrow samples at diagnosis and on days 14 and 28 by conventional homo/heteroduplex polymerase chain reaction using a simplified approach with consensus primers for IG and TCR gene rearrangements. Results At least one marker was detected by polymerase chain reaction in 96.4% of the patients. By combining the minimal residual disease results obtained on days 14 and 28, three different prognostic groups were identified: minimal residual disease negative on days 14 and 28, positive on day 14/negative on day 28, and positive on both. Five-year event-free survival rates were 85%, 75.6%, and 27.8%, respectively (p<0.0001). The same pattern of stratification held true for the group of intensively treated children. When analyzed in other subgroups of patients such as those at standard and high risk at diagnosis, those with positive B-derived CD10, patients positive for the TEL/AML1 transcript, and patients in morphological remission on a day 28 marrow, the event-free survival rate was found to be significantly lower in patients with positive minimal residual disease on day 28. Multivariate analysis demonstrated that the detection of minimal residual disease on day 28 is the most significant prognostic factor. Conclusions This simplified strategy for detection of minimal residual disease was feasible, reproducible, cheaper and simpler when compared with other methods, and allowed powerful discrimination between children with acute lymphoblastic leukemia with a good and poor outcome.

Relevance: 100.00%

Abstract:

Stenotrophomonas maltophilia is a multidrug-resistant nosocomial pathogen that is difficult to identify unequivocally using current methods. Accordingly, because the presence of this microorganism in a patient may directly determine the antimicrobial treatment, conventional polymerase chain reaction (PCR) and real-time PCR assays targeting 23S rRNA were developed for the specific identification of S. maltophilia. The PCR protocol showed high specificity when tested against other species of Stenotrophomonas, non-fermentative Gram-negative bacilli and 100 clinical isolates of S. maltophilia previously identified using the Vitek system.

Relevance: 100.00%

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent – essential zeros – or because it is below detection limit – rounded zeros. Because the second kind of zeros is usually understood as “a trace too small to measure”, it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts – and thus the metric properties – should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is “natural” in the sense that it recovers the “true” composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
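As a concrete illustration of the multiplicative replacement strategy summarised above, here is a minimal Python sketch; the function name, the choice of delta and the use of a unit closure constant are illustrative and not taken from the cited papers.

```python
import numpy as np

def multiplicative_replacement(x, delta, kappa=1.0):
    """Multiplicative replacement of rounded zeros in a composition.

    x     : 1-D composition summing to `kappa`, possibly containing zeros.
    delta : scalar or array of small imputation values (e.g. a fraction
            of the detection limit) for the zero parts.
    Zero parts become delta; non-zero parts are rescaled multiplicatively
    so that the result still sums to `kappa`, which leaves the ratios
    between non-zero parts (and hence subcompositions without zeros)
    unchanged.
    """
    x = np.asarray(x, dtype=float)
    delta = np.broadcast_to(np.asarray(delta, dtype=float), x.shape)
    zero = x == 0
    scale = 1.0 - delta[zero].sum() / kappa   # shrink factor for non-zero parts
    return np.where(zero, delta, x * scale)
```

For example, multiplicative_replacement([0.6, 0.0, 0.4], delta=0.01) returns [0.594, 0.01, 0.396], which still sums to 1 and preserves the 0.6/0.4 ratio between the non-zero parts.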

Relevance: 100.00%

Abstract:

We describe a simple method for detection of Plasmodium vivax and Plasmodium falciparum infection in anophelines using a triplex TaqMan real-time polymerase chain reaction (PCR) assay (18S rRNA). We tested the assay on Anopheles darlingi and Anopheles stephensi colony mosquitoes fed with Plasmodium-infected blood meals and in duplicate on field-collected An. darlingi. We compared the real-time PCR results of colony-infected and field-collected An. darlingi, separately, to a conventional PCR method. We determined that a cytochrome b-PCR method was only 3.33% as sensitive and 93.38% as specific as our real-time PCR assay with field-collected samples. We demonstrate that this assay is sensitive, specific and reproducible.

Relevance: 100.00%

Abstract:

Interest in working capital management increased among practitioners and researchers because the financial crisis of 2008 caused a deterioration of the general financial situation. The importance of managing working capital effectively increased dramatically during the financial crisis. On one hand, companies highlighted the importance of working capital management as part of short-term financial management to overcome funding difficulties. On the other hand, academia has highlighted the need to analyze working capital management from a wider perspective, namely the value chain perspective. Previously, academic articles mostly discussed working capital management from a company-centered perspective. The objective of this thesis was to put working capital management in a wider and more academic perspective and to present case studies of the value chains of industries, as instrumental theoretical contributions and as practical contributions complementary to the theoretical contributions and conclusions. The principal assumption of this thesis is that self-financing of value chains can be established through effective working capital management. Thus, the thesis introduces the financial value chain analysis method, which is employed in the empirical studies. The effectiveness of working capital management of the value chains is studied through the cycle time of working capital. The financial value chain analysis method employed in this study is designed for considering value chain level phenomena. This method provides a holistic picture of the value chain through financial figures. It extends the value chain analysis to the industry level. Working capital management is studied by the cash conversion cycle, which measures the length (in days) of time a company has funds tied up in working capital, starting from the payment of purchases to the supplier and ending when remittance of sales is received from the customers. The working capital management practices employed in the automotive, pulp and paper, and information and communication technology (ICT) industries have been studied in this research project. Additionally, the Finnish pharmaceutical industry is studied to obtain a deeper understanding of the working capital management of the value chain. The results indicate that the cycle time of working capital is constant in the value chain context over time. The cash conversion cycles of the automotive, pulp and paper, and ICT industries are on average 70, 60 and 40 days, respectively. The difference is mainly a consequence of the different cycle times of inventories. The financial crisis of 2008 affected the working capital management of the industries similarly. Both the cycle time of accounts receivable and that of accounts payable increased between 2008 and 2009. The results suggest that the companies of the automotive, pulp and paper and ICT value chains were not able to self-finance. The results do not indicate an improvement of the value chains' position in regard to working capital management either. The findings suggest that companies operating in the Finnish pharmaceutical industry are interested in developing their own working capital management, but collaboration with the value chain partners is not considered interesting. Competition no longer occurs between individual companies, but between value chains. Therefore the financial value chain analysis method introduced in this thesis has the potential to support value chains in improving their competitiveness.
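The cash conversion cycle referred to above is conventionally computed as DIO + DSO - DPO; the sketch below follows that textbook definition (the thesis may define the components slightly differently, and the function name and the figures in the usage example are purely illustrative).

```python
def cash_conversion_cycle(inventory, receivables, payables, cogs, sales, days=365):
    """Cash conversion cycle (in days) under the standard textbook definition.

    DIO = inventory / cost of goods sold * days        (days inventory outstanding)
    DSO = accounts receivable / sales * days            (days sales outstanding)
    DPO = accounts payable / cost of goods sold * days  (days payables outstanding)
    CCC = DIO + DSO - DPO: time between paying suppliers and collecting from customers.
    """
    dio = inventory / cogs * days
    dso = receivables / sales * days
    dpo = payables / cogs * days
    return dio + dso - dpo


# Illustrative figures only: 100 of inventory, 80 of receivables, 60 of payables,
# annual cost of goods sold of 500 and annual sales of 700.
ccc = cash_conversion_cycle(100, 80, 60, 500, 700)   # about 71 days
```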

Relevance: 100.00%

Abstract:

Erosion potential and the effects of tillage can be evaluated from quantitative descriptions of soil surface roughness. The present study therefore aimed to fill the need for a reliable, low-cost and convenient method to measure that parameter. Based on the interpretation of micro-topographic shadows, this new procedure is primarily designed for use in the field after tillage. The principle underlying shadow analysis is the direct relationship between soil surface roughness and the shadows cast by soil structures under fixed sunlight conditions. The results obtained with this method were compared to the statistical indexes used to interpret field readings recorded by a pin meter. The tests were conducted on 4-m² sandy loam and sandy clay loam plots divided into 1-m² subplots tilled with three different tools: chisel, tiller and roller. The highly significant correlation between the statistical indexes and shadow analysis results obtained in the laboratory as well as in the field for all the soil-tool combinations proved that both variability (CV) and dispersion (SD) are accommodated by the new method. This procedure simplifies the interpretation of soil surface roughness and shortens the time involved in field operations by a factor ranging from 12 to 20.

Relevance: 100.00%

Abstract:

Master's in Actuarial Science (Mestrado em Ciências Actuariais)

Relevance: 100.00%

Abstract:

Abstract Traditionally, the common reserving methods used by non-life actuaries are based on the assumption that future claims will behave in the same way as they did in the past. There are two main sources of variability in the claims development process: the variability of the speed with which the claims are settled and the variability of the severity of claims from different accident years. Large changes in these processes will generate distortions in the estimation of the claims reserves. The main objective of this thesis is to provide an indicator which, firstly, identifies and quantifies these two influences and, secondly, determines which model is adequate for a specific situation. Two stochastic models were analysed and the predictive distributions of the future claims were obtained. The main advantage of the stochastic models is that they provide measures of variability of the reserve estimates. The first model (PDM) combines the conjugate Dirichlet-Multinomial family with the Poisson distribution. The second model (NBDM) improves on the first by combining two conjugate families: Poisson-Gamma (for the distribution of the ultimate amounts) and Dirichlet-Multinomial (for the distribution of the incremental claims payments). It was found that the second model makes it possible to express the variability of the speed of the reporting process and of the development of the claims severity as a function of two parameters of the above-mentioned distributions: the shape parameter of the Gamma distribution and the Dirichlet parameter. Depending on the relation between them, we can decide on the adequacy of the claims reserve estimation method. The parameters have been estimated by the Method of Moments and by Maximum Likelihood. The results were tested using simulated data and then real data originating from three lines of business: Property/Casualty, General Liability, and Accident Insurance. These data include different developments and specificities. The outcome of the thesis shows that when the Dirichlet parameter is greater than the shape parameter of the Gamma, the model exhibits positive correlation between past and future claims payments, which suggests the Chain-Ladder method as appropriate for the claims reserve estimation. In terms of claims reserves, if the cumulated payments are high, the positive correlation will imply high expectations for the future payments, resulting in high claims reserve estimates. The negative correlation appears when the Dirichlet parameter is lower than the shape parameter of the Gamma, meaning low expected future payments for the same high observed cumulated payments. This corresponds to the situation in which claims are reported rapidly and few claims remain expected subsequently. The extreme case appears when all claims are reported at the same time, leading to expectations for the future payments of zero or equal to the aggregated amount of the ultimate paid claims. For this latter case, the Chain-Ladder method is not recommended.
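For reference, the two conjugate families that the PDM and NBDM models combine have the standard forms below (generic symbols, with the Gamma in its rate parameterisation); the exact parameterisation used in the thesis may differ.

```latex
% Poisson--Gamma (ultimate amount), single observation N:
\lambda \sim \mathrm{Gamma}(\alpha,\beta), \quad
N \mid \lambda \sim \mathrm{Poisson}(\lambda)
\;\Longrightarrow\;
\lambda \mid N \sim \mathrm{Gamma}(\alpha + N,\; \beta + 1)

% Dirichlet--Multinomial (allocation of payments across development periods):
\mathbf{p} \sim \mathrm{Dirichlet}(a_1,\dots,a_J), \quad
(n_1,\dots,n_J) \mid \mathbf{p} \sim \mathrm{Multinomial}(n,\mathbf{p})
\;\Longrightarrow\;
\mathbf{p} \mid \mathbf{n} \sim \mathrm{Dirichlet}(a_1+n_1,\dots,a_J+n_J)
```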

Relevance: 90.00%

Abstract:

Purpose: The diagnosis of prostate cancer in men with persistently increased prostate specific antigen after a negative prostate biopsy has become a great challenge for urologists and pathologists. We analyzed the diagnostic value of 6 genes in the tissue of patients with prostate cancer. Materials and Methods: The study comprised 50 patients with localized disease who underwent radical prostatectomy. Gene selection was based on a previous microarray analysis. Among 4,147 genes with different expression between 2 pools of patients, 6 genes (PSMA, TMEFF2, GREB1, TH1L, IgH3 and PGC) were selected. These genes were tested for diagnostic value using the quantitative reverse transcription polymerase chain reaction method. Initially malignant tissue samples from 33 patients were analyzed, and in the second part of the study we analyzed benign tissue samples from the other 17 patients with prostate cancer. The control group comprised tissue samples from patients with benign prostatic hyperplasia. Results: Analysis of malignant prostatic tissue demonstrated that prostate specific membrane antigen was overexpressed (mean 9 times) and pepsinogen C was underexpressed (mean 1.3 × 10⁻⁴ times) in all cases compared to benign prostatic hyperplasia. The other 4 tested genes showed a variable expression pattern that did not allow differentiation between benign and malignant cases. When we tested these results in the benign prostate tissues from patients with cancer, pepsinogen C maintained the expression pattern. In terms of prostate specific membrane antigen, despite overexpression in most cases (mean 12 times), 2 cases (12%) presented with underexpression. Conclusions: Pepsinogen C tissue expression may constitute a powerful adjunctive method to prostate biopsy in the diagnosis of prostate cancer cases.

Relevance: 90.00%

Abstract:

Single-stranded DNA (ssDNA) is a prerequisite for electrochemical sensor-based detection of parasite DNA and other diagnostic applications. To achieve this detection, an asymmetric polymerase chain reaction method was optimised. This method facilitates amplification of ssDNA from the human lymphatic filarial parasite Wuchereria bancrofti. This procedure produced ssDNA fragments of 188 bp in a single step when primer pairs (forward and reverse) were used at a 100:1 molar ratio in the presence of double-stranded template DNA. The ssDNA thus produced was suitable for immobilisation as a probe onto the surface of an indium tin oxide electrode and hybridisation in a system for sequence-specific electrochemical detection of W. bancrofti. The hybridisation of the ssDNA probe and target ssDNA led to considerable decreases in both the anodic and the cathodic currents of the system's redox couple compared with the unhybridised DNA and could be detected via cyclic voltammetry. This method is reproducible and avoids many of the difficulties encountered by conventional methods of filarial parasite DNA detection; thus, it has potential in xenomonitoring.

Relevance: 90.00%

Abstract:

Objective: To characterize the microbial etiology of chronic suppurative otitis media, comparing the methods of classical bacteriological culture and polymerase chain reaction. Design/Setting/Patients: Bacteriological analysis, by classical culture and by the molecular polymerase chain reaction, of 35 otitis effusion samples from patients with cleft lip and palate attending the Hospital for Rehabilitation of Craniofacial Anomalies of the University of Sao Paulo, Bauru, Brazil. Interventions: Collection of clinical samples of otitis effusion through the external auditory tube. Main Outcome Measure: Otolaryngologic diagnosis of chronic suppurative otitis media. Results: Positive cultures were obtained from 83% of patients. Thirty-one bacterial lineages were isolated; in order of decreasing frequency, the most common were Pseudomonas aeruginosa (54.9%), Staphylococcus aureus (25.9%), and Enterococcus faecalis (19.2%). No anaerobes were isolated by culture. The polymerase chain reaction was positive for one or more of the bacteria investigated in 97.1% of samples. Anaerobe lineages such as Fusobacterium nucleatum, Bacteroides fragilis, and Peptostreptococcus anaerobius were detected by the polymerase chain reaction method. Conclusions: Patients with cleft lip and palate and chronic suppurative otitis media presented a high frequency of bacterial infection in the middle ear. Classical bacteriological culture did not detect strict anaerobes, whose presence was identified by the polymerase chain reaction method.

Relevance: 90.00%

Abstract:

Between 2008 and 2012, commercial Swiss layer and layer breeder flocks experiencing problems in laying performance were sampled and tested for infection with Duck adenovirus A (DAdV-A; previously known as Egg drop syndrome 1976 virus). Organ samples from birds sent for necropsy as well as blood samples from living animals originating from the same flocks were analyzed. To detect virus-specific DNA, a newly developed quantitative real-time polymerase chain reaction method was applied, and the presence of antibodies against DAdV-A was tested using a commercially available enzyme-linked immunosorbent assay. In 5 out of 7 investigated flocks, viral DNA was detected in tissues. In addition, antibodies against DAdV-A were detected in all of the flocks.

Relevance: 90.00%

Abstract:

Part of the work of an insurance company is to hold claims reserves, known as technical reserves, in order to mitigate the risk inherent in its activities and to comply with legal obligations. There are several methods for estimating claims reserves, both deterministic and stochastic. One of the most widely used is the deterministic Chain Ladder method, which is simple to apply. However, deterministic methods produce only point estimates, which is why stochastic methods have become increasingly popular: they are capable of producing interval estimates, measuring the variability inherent in the technical reserves. In this study the deterministic methods (Grossing Up, Link Ratio and Chain Ladder) and the stochastic methods (Thomas Mack and Bootstrap associated with an overdispersed Poisson model) are applied to estimate the claims reserves arising from automobile material damage claims occurred up to December 2012. The data used in this research are based on a real database provided by AXA Portugal. A comparison of the results obtained by the different methods is presented.
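For context on the deterministic Chain Ladder method named in the abstract, the sketch below runs it on a cumulative run-off triangle; the volume-weighted development factors are the usual textbook choice, and the function name and toy figures are illustrative, not taken from the AXA Portugal data.

```python
import numpy as np

def chain_ladder(triangle):
    """Deterministic Chain Ladder on a cumulative run-off triangle.

    triangle: (n x n) array of cumulative claims with np.nan in the
    unobserved lower-right part (origin years in rows, development
    periods in columns).  Returns the completed triangle and the
    estimated reserve per origin year.
    """
    tri = np.asarray(triangle, dtype=float).copy()
    n = tri.shape[0]

    # Latest observed cumulative amount per origin year (the diagonal)
    latest = np.array([tri[i, n - 1 - i] for i in range(n)])

    # Volume-weighted development factors f_j = sum_i C[i, j+1] / sum_i C[i, j]
    factors = []
    for j in range(n - 1):
        obs = ~np.isnan(tri[:, j + 1])
        factors.append(tri[obs, j + 1].sum() / tri[obs, j].sum())

    # Project the unobserved cells column by column
    for j in range(n - 1):
        fill = np.isnan(tri[:, j + 1])
        tri[fill, j + 1] = tri[fill, j] * factors[j]

    reserves = tri[:, -1] - latest   # estimated ultimate minus paid to date
    return tri, reserves


# Tiny illustrative triangle (hypothetical figures)
nan = np.nan
completed, reserves = chain_ladder([[100, 150, 160],
                                    [110, 165, nan],
                                    [120, nan, nan]])
```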

Relevance: 80.00%

Abstract:

Objective: To assess the onset (%) of the patella stabilizer muscles during maximal isometric contraction (MIC) exercises in individuals with and without signs of patellofemoral pain syndrome (PFPS), in open (OKC) and closed (CKC) kinetic chain exercises. Method: Assessments were carried out on 22 women: ten with no complaints of anterior knee pain and 12 with PFPS signs, during MIC in OKC and CKC with the knee flexed at 90 degrees. The onset of the electromyographic activity of the vastus medialis obliquus (VMO), vastus lateralis obliquus (VLO) and vastus lateralis longus (VLL) was identified by means of an algorithm in the Myosystem Br 1 software. The statistical analysis used the Chi-Square test and Student's t-test, both with a significance level of 5%. Results: The VMO and VLO muscles presented a greater onset compared to the VLL during OKC exercises for both groups, and for the PFPS group without CKC. No differences were observed between the groups. Conclusion: CKC and OKC exercises seem to benefit the synchronism of the patella stabilizer musculature and can be recommended in physiotherapeutic treatment programs.