863 results for Class analysis
Abstract:
The HIV/AIDS epidemic in the United States is constantly changing and evolving, from patient zero to an estimated 650,000 to 900,000 Americans infected today. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This dissertation examines many facets of HIV, from the beginning, when there was no treatment, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, it examines where we were, where we are, and where treatment of HIV/AIDS is headed.
Chapter Two describes the datasets used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and spans 1984 to the present. The second database is the Multicenter AIDS Cohort Study (MACS) public dataset, which covers 1984 through October 1992. Comparisons are made between the two datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival, and that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian; thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
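The non-Gaussianity point above can be illustrated with a small simulation. The lognormal model, parameters, and sample sizes below are hypothetical, chosen only to sketch how a skewness statistic flags a non-Gaussian marker distribution:

```python
import random
import statistics

def skewness(xs):
    """Sample skewness: near 0 for Gaussian data, positive for right-skewed data."""
    m = statistics.fmean(xs)
    s = statistics.pstdev(xs)
    n = len(xs)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

random.seed(0)
# Hypothetical CD4 counts drawn from a lognormal model -- a common
# right-skewed choice for cell-count data (illustrative only).
cd4 = [random.lognormvariate(6.0, 0.5) for _ in range(5000)]
gaussian = [random.gauss(500, 150) for _ in range(5000)]

print(round(skewness(cd4), 2))       # clearly positive: non-Gaussian
print(round(skewness(gaussian), 2))  # near zero
```

A test like this is why distributional assumptions matter: applying a Gaussian-based procedure to the skewed sample would misstate tail probabilities.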
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era characterized by a new class of drugs and new technology changed the way we treat HIV-infected patients. Viral load assays made it possible to quantify the levels of HIV RNA in the blood, giving a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover, and that if viral levels could be kept low enough, the immune system could eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART, with presence of an AIDS-defining illness as the primary endpoint. A high level of clinical failure, or progression to an endpoint, was found.
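Survival analyses of this kind typically rest on the Kaplan-Meier estimator. As a minimal sketch of that estimator (the follow-up times and censoring pattern below are invented for illustration, not the thesis data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times  : follow-up time for each patient
    events : 1 if the endpoint (e.g. an AIDS-defining illness) was observed,
             0 if the patient was censored
    Returns a list of (time, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # count events among all subjects sharing this time point
        j, d = i, 0
        while j < len(data) and data[j][0] == t:
            d += data[j][1]
            j += 1
        if d:
            s *= 1 - d / n_at_risk          # multiplicative survival update
            curve.append((t, s))
        n_at_risk -= j - i                   # remove events and censorings
        i = j
    return curve

# Invented follow-up data in months (8 patients, 4 events, 4 censored).
times  = [6, 6, 7, 10, 13, 16, 22, 23]
events = [1, 1, 0,  1,  0,  1,  0,  0]
print(kaplan_meier(times, events))
```

Censored patients leave the risk set without forcing a drop in the curve, which is what makes the method suitable for cohorts with staggered follow-up.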
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which examines the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, failed to control viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high, and for the first time in the epidemic there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated: the direct lifetime cost of treating each HIV-infected patient with HAART is estimated at between $353,000 and $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved is only $101,000, comparable to the incremental cost per year of life saved by coronary artery bypass surgery.
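The cost-effectiveness figure quoted above is simple arithmetic once a survival gain is fixed. In this sketch the 3.5-year gain is an assumed value chosen only to show how a roughly $101,000 cost per life-year could arise; the abstract does not state the life-years gained:

```python
def cost_per_life_year(lifetime_cost, years_gained):
    """Incremental cost per year of life saved."""
    return lifetime_cost / years_gained

# Direct lifetime HAART cost range from the analysis above.
low_cost, high_cost = 353_000, 598_000

# ASSUMPTION: a 3.5-year survival gain, used purely for illustration.
print(round(cost_per_life_year(low_cost, 3.5)))  # on the order of $100,000
```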
Policymakers need to be aware that although HAART can delay disease progression, it is not a cure, and HIV is not over. The results presented here suggest that the decreases in morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
This thesis is an investigation into the nature of data analysis and computer software systems which support this activity.
The first chapter develops the notion of data analysis as an experimental science which has two major components: data-gathering and theory-building. The basic role of language in determining the meaningfulness of theory is stressed, and the informativeness of a language and data base pair is studied. The static and dynamic aspects of data analysis are then considered from this conceptual vantage point. The second chapter surveys the available types of computer systems which may be useful for data analysis. Particular attention is paid to the questions raised in the first chapter about the language restrictions imposed by the computer system and its dynamic properties.
The third chapter discusses the REL data analysis system, which was designed to satisfy the needs of the data analyzer in an operational relational data system. The major limitation on the use of such systems is the amount of access to data stored on a relatively slow secondary memory. This problem of the paging of data is investigated and two classes of data structure representations are found, each of which has desirable paging characteristics for certain types of queries. One representation is used by most of the generalized data base management systems in existence today, but the other is clearly preferred in the data analysis environment, as conceptualized in Chapter I.
This data representation has strong implications for a fundamental process of data analysis -- the quantification of variables. Since quantification is one of the few means of summarizing and abstracting, data analysis systems are under strong pressure to facilitate the process. Two implementations of quantification are studied: one analogous to the form of the lower predicate calculus and another more closely attuned to the data representation. A comparison of these indicates that the use of the "label class" method results in orders-of-magnitude improvement over the lower predicate calculus technique.
Abstract:
Familial hypercholesterolemia (FH) is a common autosomal codominant disease with a frequency of 1:500 individuals in its heterozygous form. The genetic basis of FH is most commonly mutations within the LDLR gene. Assessing the pathogenicity of LDLR variants is particularly important to give a patient a definitive diagnosis of FH. Current studies of LDLR activity ex vivo are based on the analysis of I-125-labeled lipoproteins (the reference method) or fluorescent-labelled LDL. The main purpose of this study was to compare the effectiveness of these two methods in assessing LDLR functionality, in order to validate a functional assay for analysing LDLR mutations. LDLR activity of different variants was studied by flow cytometry using FITC-labelled LDL and compared with studies performed previously with I-125-labeled lipoproteins. Flow cytometry results are in full agreement with the data obtained by the I-125 methodology. Additionally, confocal microscopy allowed assignment of a mutation class to each of the variants assayed. Use of fluorescence yielded results similar to those of I-125-labeled lipoproteins for LDLR activity determination, and also allows mutation class classification. The use of FITC-labelled LDL is easier in handling and disposal, cheaper than radioactivity, and can be routinely performed by any group doing LDLR functional validations.
Abstract:
In the problem of one-class classification (OCC), one of the classes, the target class, has to be distinguished from all other possible objects, considered as nontargets. This situation arises in many biomedical problems, for example in diagnosis, image-based tumor recognition, or analysis of electrocardiogram data. In this paper an approach to OCC based on a typicality test is experimentally compared with reference state-of-the-art OCC techniques (Gaussian, mixture of Gaussians, naive Parzen, Parzen, and support vector data description) using biomedical data sets. We evaluate the ability of the procedures using twelve experimental data sets with not necessarily continuous data. As there are few benchmark data sets for one-class classification, all data sets considered in the evaluation have multiple classes. Each class in turn is considered as the target class, and the units in the other classes are considered as new units to be classified. The results of the comparison show the good performance of the typicality approach, which is available for high-dimensional data; it is worth mentioning that it can be used for any kind of data (continuous, discrete, or nominal), whereas applying the state-of-the-art approaches is not straightforward when nominal variables are present.
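The evaluation protocol described above (each class in turn as the target, all other classes as nontargets) can be sketched with a toy acceptance-region classifier. This stand-in is not the paper's typicality test, and the two-class data are hypothetical:

```python
import statistics

def dist(x, c):
    """Euclidean distance between two feature vectors."""
    return sum((a - b) ** 2 for a, b in zip(x, c)) ** 0.5

def fit_target(train):
    """Toy one-class model: centroid plus a distance threshold covering
    roughly 95% of the training (target) sample."""
    center = [statistics.fmean(col) for col in zip(*train)]
    dists = sorted(dist(x, center) for x in train)
    threshold = dists[int(0.95 * (len(dists) - 1))]
    return center, threshold

def is_target(x, model):
    center, threshold = model
    return dist(x, center) <= threshold

# Two classes of a multi-class set; each would take a turn as the target.
class_a = [(0.0, 0.0), (0.1, -0.1), (-0.1, 0.1), (0.05, 0.0)]
class_b = [(5.0, 5.0), (5.1, 4.9), (4.9, 5.1), (5.0, 5.05)]

model_a = fit_target(class_a)                    # class A is the target here
print([is_target(x, model_a) for x in class_b])  # nontargets: all False
```

The model only ever sees target data at training time, which is the defining constraint of OCC; nontargets appear solely at evaluation.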
Abstract:
Inter- and intra-annual variation in year-class strength was analyzed for San Francisco Bay Pacific herring (Clupea pallasi) by using otoliths of juveniles. Juvenile herring were collected from March through June in 1999 and 2000, and otoliths from subsamples of these collections were aged by daily otolith increment analysis. The composition of the year classes in 1999 and 2000 was determined by back-calculating the birth date distribution for surviving juvenile herring. In 2000, 729% more juveniles were captured than in 1999, even though an estimated 12% fewer eggs were spawned in 2000. Spawning-date distributions show that survival for the 2000 year class was exceptionally good for a short (approximately 1 month) period of spawning, resulting in a large abundance of juvenile recruits. Analysis of age at size shows that growth rate increased significantly as the spawning season progressed in both 1999 and 2000. However, only in 2000 were the bulk of surviving juveniles a product of the fast-growth period. In the two years examined, year-class strength was not predicted by the estimated number of eggs spawned, but rather appeared to depend on survival of eggs or larvae (or both) through the juvenile stage. Fast growth through the larval stage may have little effect on year-class strength if mortality during the egg stage is high and few larvae are available.
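Back-calculating a birth (spawning) date from a daily otolith increment count is simple date arithmetic; a sketch with hypothetical capture dates and increment counts:

```python
from datetime import date, timedelta

def back_calculate_birth_date(capture_date, age_days):
    """Spawning/birth date implied by a daily otolith increment count."""
    return capture_date - timedelta(days=age_days)

# Hypothetical juveniles: (capture date, otolith age in days).
fish = [(date(2000, 5, 15), 92), (date(2000, 6, 1), 75)]
birth_dates = [back_calculate_birth_date(d, a) for d, a in fish]
print(birth_dates)  # February and March spawning dates
```

Aggregating such dates over a subsample yields the birth-date distribution used above to characterize each year class.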
Abstract:
Management of the Texas penaeid shrimp fishery is aimed at increasing revenue from brown shrimp, Penaeus aztecus, landings and decreasing the level of discards. Since 1960 Texas has closed its territorial sea for 45-60 days during peak migration of brown shrimp to the Gulf of Mexico. In 1981 the closure was extended to 200 miles to include the U.S. Exclusive Economic Zone. Simulation modeling is used in this paper to estimate the changes in landings, revenue, costs, and economic rent attributable to the Texas closure. Four additional analyses were conducted to estimate the effects of closing the Gulf 1- to 4-fathom zone for 45 and 60 days, with and without effort redirected to inshore waters. Distributional impacts are analyzed in terms of costs, revenues, and rents, by vessel class, shrimp species, vessel owner, and crew.
Abstract:
Tag release and recapture data of bigeye (Thunnus obesus) and yellowfin tuna (T. albacares) from the Hawaii Tuna Tagging Project (HTTP) were analyzed with a bulk transfer model incorporating size-specific attrition to infer population dynamics and transfer rates between various fishery components. For both species, the transfer rate estimates from the offshore handline fishery areas to the longline fishery area were higher than the estimates of transfer from those same areas into the inshore fishery areas. Natural and fishing mortality rates were estimated over three size classes: yellowfin 20–45, 46–55, and ≥56 cm and bigeye 29–55, 56–70, and ≥71 cm. For both species, the estimates of natural mortality were highest in the smallest size class. For bigeye tuna, the estimates decreased with increasing size and for yellowfin tuna there was a slight increase in the largest size class. In the Cross Seamount fishery, the fishing mortality rate of bigeye tuna was similar for all three size classes and represented roughly 12% of the gross attrition rate (includes fishing and natural mortality and emigration rates). For yellowfin tuna, fishing mortality ranged between 7% and 30%, the highest being in the medium size class. For both species, the overall attrition rate from the entire fishery area was nearly the same. However, in the specific case of the Cross Seamount fishery, the attrition rate for yellowfin tuna was roughly twice that for bigeye. This result indicates that bigeye tuna are more resident at the Seamount than yellowfin tuna, and larger bigeye tunas tend to reside longer than smaller individuals. This may result in larger fish being more vulnerable to capture in the Seamount fishery. The relatively low level of exchange between the Seamount and the inshore and longline fisheries suggests that the fishing activity at the Seamount need not be of great management concern for either species.
However, given that the current exploitation rates are considered moderate (10–30%), and that Seamount aggregations of yellowfin and bigeye tuna are highly vulnerable to low-cost gear types, it is recommended that further increases in fishing effort for these species be monitored at Cross Seamount.
Abstract:
Containers are structured m-files which allow 'data' and 'methods' to be stored persistently. Containers have a user-defined class structure, so that one can have several Containers of the same class, all structurally similar, and there is a mechanism for interaction with Containers in the style of database transactions. The use of MATLAB Containers to analyze multivariable Smith predictors is discussed.
Abstract:
Two fish species each from carnivores (Clarias batrachus, Channa punctatus), omnivores (Cyprinus carpio, Cirrhinus reba), and plankton feeders (Catla catla, Labeo rohita) were collected from freshwater sources in their natural habitat to study their total lipid (TL) and lipid fractions, and the relationships between these parameters were worked out. The variation of total lipid and lipid fractions in tissues of the freshwater fishes was not significantly different (P>0.05), but total lipid and total glyceride (TGL) contents tended to be higher in carnivores, followed by omnivores, and were lowest in plankton feeders. The trend was the reverse for total phospholipid, cholesterol, and free fatty acids. TGL content in all classes of fishes was significantly related to TL (P<0.01), phospholipid (PL) (P<0.001), cholesterol (P<0.05), free fatty acids (P<0.05), and monoglycerides (P<0.001). Similarly, total lipid was linearly related to total glycerides (TL = -3.02 + 0.10 TGL) and phospholipid (TL = 7.13 - 0.12 PL). From this study it is concluded that almost all lipid fractions of freshwater fishes can be predicted easily from the total lipid content of the tissue.
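Fitted lines such as TL = -3.02 + 0.10 TGL come from ordinary least squares; a minimal sketch, where the data points are synthetic and generated to lie exactly on that reported line:

```python
def linfit(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

# Synthetic points placed on the reported line TL = -3.02 + 0.10*TGL;
# the fit recovers the intercept and slope.
tgl = [10.0, 20.0, 30.0, 40.0, 50.0]
tl = [-3.02 + 0.10 * x for x in tgl]
print(linfit(tgl, tl))
```

With real tissue measurements the points scatter around the line, and the significance levels quoted above describe how tight that scatter is.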
Abstract:
Major histocompatibility complex (MHC) genes are thought to be involved in allogeneic graft rejection, but few reports are available on their functional analysis in fish. Analysis of available sequences of MHC genes suggests functions in antigen presentation similar to those found in higher vertebrates. In mammals, the MHC class I and class II molecules are major determinants of allogeneic graft rejection due to their polymorphism in conjunction with their antigen-presenting function. In fish, MHC class II molecules have been found to be involved in rejection of allogeneic scale grafts. The present study was designed to investigate the involvement of MHC class I molecules in allograft rejection. Erythrocytes were collected from rainbow trout donors expressing different MHC class I alleles, stained with two dyes, mixed, and grafted to recipients from the same sibling group as the donors. The grafts were rejected by allogeneic recipients, and the MHC class I linkage group was the major determinant of the rejection.
Abstract:
This paper presents an incremental learning solution for Linear Discriminant Analysis (LDA) and its applications to object recognition problems. We apply the sufficient spanning set approximation in three steps, i.e. updates for the total scatter matrix, the between-class scatter matrix, and the projected data matrix, which leads to an online solution that closely agrees with the batch solution in accuracy while significantly reducing the computational complexity. The algorithm yields an efficient solution to incremental LDA even when the number of classes as well as the set size is large. The incremental LDA method has also been shown to be useful for semi-supervised online learning; label propagation is done by integrating the incremental LDA into an EM framework. The method has been demonstrated in the tasks of merging large datasets which were collected during MPEG standardization for face image retrieval, face authentication using the BANCA dataset, and object categorisation using the Caltech101 dataset. © 2010 Springer Science+Business Media, LLC.
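A core ingredient of incremental LDA is updating scatter statistics when a new batch of data arrives, without revisiting old observations. The sketch below shows only the standard merge identity for the total scatter matrix, not the paper's full sufficient-spanning-set algorithm:

```python
def stats(xs):
    """Sufficient statistics of a dataset: size, mean vector, total scatter."""
    n, d = len(xs), len(xs[0])
    mean = [sum(x[i] for x in xs) / n for i in range(d)]
    scatter = [[sum((x[i] - mean[i]) * (x[j] - mean[j]) for x in xs)
                for j in range(d)] for i in range(d)]
    return n, mean, scatter

def merge(n1, m1, S1, n2, m2, S2):
    """Merge two datasets' statistics without touching the raw data:
    S = S1 + S2 + (n1*n2/n) * (m1-m2)(m1-m2)^T."""
    n = n1 + n2
    d = len(m1)
    m = [(n1 * m1[i] + n2 * m2[i]) / n for i in range(d)]
    diff = [m1[i] - m2[i] for i in range(d)]
    c = n1 * n2 / n
    S = [[S1[i][j] + S2[i][j] + c * diff[i] * diff[j] for j in range(d)]
         for i in range(d)]
    return n, m, S

# Two small synthetic batches: the merged statistics match a batch
# computation over the concatenated data.
A = [[1.0, 2.0], [2.0, 0.0], [0.0, 1.0]]
B = [[4.0, 5.0], [5.0, 3.0], [3.0, 4.0]]
print(merge(*stats(A), *stats(B)))
```

This exactness is what lets an online update "closely agree with the batch solution"; the paper's contribution is doing the analogous update cheaply in a low-dimensional spanning subspace.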
Abstract:
A new metalloproteinase-disintegrin, named Jerdonitin, was purified from Trimeresurus jerdonii venom, with a molecular weight of 36 kDa on SDS-PAGE. It dose-dependently inhibited ADP-induced human platelet aggregation with an IC50 of 120 nM. cDNA cloning and sequencing revealed that Jerdonitin belongs to class II of the snake venom metalloproteinases (SVMPs) (P-II class). Unlike in other P-II class SVMPs, the metalloproteinase and disintegrin domains of the natural protein were not separated, as confirmed by internal peptide sequencing. Compared to other P-II class SVMPs, Jerdonitin has two additional cysteines (Cys219 and Cys238), located in the spacer domain and disintegrin domain, respectively. They probably form a disulfide bond, so the metalloproteinase and disintegrin domains cannot be separated by posttranslational processing. In summary, comparison of the amino acid sequence of Jerdonitin with those of other P-II class SVMPs by sequence alignment and phylogenetic analysis, in conjunction with natural protein structure data, suggests that it is a new type of P-II class SVMP. (C) 2003 Elsevier Inc. All rights reserved.
Abstract:
Four novel highly oxygenated trinortriterpenoids, sphenalactones A-D (1-4), were isolated from the leaves and stems of Schisandra sphenanthera and their structures were elucidated by extensive analysis of 1D and 2D NMR data. Compounds 1-4 featured a C-27
Abstract:
In this paper, phase noise analysis of a mechanical autonomous impact oscillator with a MEMS resonator is performed. Since the circuit considered belongs to the class of hybrid systems, methods based on the variational model for the evaluation of either phase noise or steady state solutions cannot be directly applied. As a matter of fact, the monodromy matrix is not defined at impact events in these systems. By introducing saltation matrices, this limit is overcome and the aforementioned methods are extended. In particular, the unified theory developed by Demir is used to analyze the phase noise after evaluating the asymptotically stable periodic solution of the system by resorting to the shooting method. Numerical results are presented to show how noise sources affect the phase noise performances. © 2011 IEEE.
Abstract:
We identified a new class of human immunodeficiency virus type 1 (HIV-1) recombinants (00CN-HH069 and 00CN-HH086) in which further recombination occurred between two established circulating recombinant forms (CRFs). These two isolates were found among 57 HIV-1 samples from a cohort of injecting drug users in eastern Yunnan Province of China. Informative-site analysis in conjunction with bootscanning plots and exploratory tree analysis revealed that these two strains were closely related mosaics composed of CRF07_BC and CRF08_BC, which are found in China. Genotype screening based on gag-reverse transcriptase sequences of 57 samples from eastern Yunnan identified 47 CRF08_BC specimens (82.5%), 5 CRF07_BC specimens (8.8%), and 3 additional specimens with the novel recombinant structure. These new "second-generation" recombinants thus constitute a substantial proportion (5 of 57; 8.8%) of HIV-1 strains in this population and may belong to a new but as-yet-undefined class of CRF. This might be the first example of CRFs recombining with each other, leading to the evolution of second-generation inter-CRF recombinants.