794 results for Cost analysis
Abstract:
The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two aspects of human behaviour have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low-cost and free of side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
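The abstract does not give the exact feature pipeline, so the following is only a minimal Python sketch of the idea: one non-linear feature of the kind mentioned above (Higuchi's fractal dimension of a speech segment) is combined with a placeholder linear feature and fed to a small artificial neural network classifier. The segment data, labels and network size are all hypothetical.

import numpy as np
from sklearn.neural_network import MLPClassifier

def higuchi_fd(x, k_max=10):
    # Higuchi fractal dimension of a 1-D signal: slope of log L(k) versus log(1/k)
    x = np.asarray(x, dtype=float)
    n = len(x)
    log_lk, log_inv_k = [], []
    for k in range(1, k_max + 1):
        lm = []
        for m in range(k):
            idx = np.arange(m, n, k)                  # sub-sampled curve x(m), x(m+k), ...
            diffs = np.abs(np.diff(x[idx]))
            norm = (n - 1) / (len(diffs) * k)         # Higuchi normalization factor
            lm.append(diffs.sum() * norm / k)
        log_lk.append(np.log(np.mean(lm)))
        log_inv_k.append(np.log(1.0 / k))
    slope, _ = np.polyfit(log_inv_k, log_lk, 1)
    return slope

# hypothetical data: 40 synthetic "speech" segments, one non-linear feature (fractal
# dimension) plus one linear feature (spread); labels 1 = AD, 0 = control
rng = np.random.default_rng(0)
segments = rng.standard_normal((40, 2000))
X = np.column_stack([[higuchi_fd(s) for s in segments], segments.std(axis=1)])
y = rng.integers(0, 2, size=40)
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)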
Abstract:
This study examined the economic potential of fish farming in the Abeokuta zone of Ogun State in the 2003 production season. Descriptive statistics, cost-and-returns analysis and multiple regression analysis were used in analyzing the data. The farmers predominantly practiced monoculture. The study revealed inefficiency in the use of pond size, lime and labour, together with over-utilization of the fingerlings stocked. The average variable cost of N124.67 constituted 45% of the total, while the average fixed cost was N149,802.67 per average farm size. Fish farming was found to be a profitable venture in the study area, with a net income of N761,400.58 for an average pond size of 301.47 sq. m. Based on these findings, it is suggested that, for profit maximization, fish farms will have to increase their use of fingerlings and fertilizers and decrease their use of lime, labour and pond size.
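The abstract does not report the regression specification; a log-linear (Cobb-Douglas-style) production function estimated by ordinary least squares is a common choice in farm-level input-use studies of this kind. A minimal sketch with made-up data (the figures and variable set are illustrative, not the survey values):

import numpy as np

# hypothetical farm-level observations: output (kg) and three inputs
output      = np.array([820., 640., 910., 450., 700., 530.])
fingerlings = np.array([3000., 2500., 3500., 1800., 2600., 2100.])
labour_days = np.array([120., 95., 140., 70., 100., 80.])
pond_sqm    = np.array([310., 280., 350., 200., 300., 240.])

# ln(output) = b0 + b1*ln(fingerlings) + b2*ln(labour) + b3*ln(pond size)
X = np.column_stack([np.ones_like(output), np.log(fingerlings),
                     np.log(labour_days), np.log(pond_sqm)])
coefs, *_ = np.linalg.lstsq(X, np.log(output), rcond=None)
print(dict(zip(["intercept", "fingerlings", "labour", "pond_size"], coefs.round(3))))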
Abstract:
This paper focuses on the financial analysis involved in setting up small-scale fish farming in a homestead. About 0.5 acres of land was used for the construction of a pond, which was stocked with Clarias spp./Heterobranchus spp. and Tilapia spp. at a ratio of one to three for a period of 12 months. The land/land development cost is N26,500.00; pond construction cost, N35,700.00; equipment cost, N2,650.00; and stock/input requirement cost, N155,727.00; while the revenue from sales is N376,000.00. A cash flow analysis is also calculated for the fish farm, giving N155,423.00 for the first-year cash flow, and the profit for a five-year production cycle was calculated as N1,036,515.00. At the end, an appreciable profit is realized from the enterprise. This type of enterprise is viable for small-scale farmers to practise and adopt as financial support for their families.
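A small arithmetic sketch in Python reproduces the cash-flow figures quoted above, assuming the capital items (land development, pond construction, equipment) are incurred only in year one while the stock/input cost and sales revenue recur unchanged over the five-year cycle:

# figures quoted in the abstract (Naira)
land_dev, pond, equipment = 26_500.00, 35_700.00, 2_650.00   # one-off capital costs
stock_inputs = 155_727.00                                     # recurring annual cost
revenue      = 376_000.00                                     # annual sales revenue

capital = land_dev + pond + equipment
year1_cash_flow = revenue - stock_inputs - capital            # N155,423.00
later_year_cash_flow = revenue - stock_inputs                 # capital not repeated
five_year_total = year1_cash_flow + 4 * later_year_cash_flow  # N1,036,515.00
print(year1_cash_flow, five_year_total)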
Abstract:
The epidemic of HIV/AIDS in the United States is constantly changing and evolving, from patient zero to the estimated 650,000 to 900,000 Americans now infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where treatment of HIV/AIDS is headed.
Chapter Two describes the datasets that were used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and includes data from 1984 until the present. The second database was the Multicenter AIDS Cohort Study (MACS) public dataset, which covers the period between 1984 and October 1992. Comparisons are made between the two datasets.
Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection there exist high levels of immunosuppression.
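The chapter's distributional point can be illustrated with standard tools: a normality test on the counts (and on a variance-stabilizing transform) flags the non-Gaussian shape. The sketch below uses synthetic, right-skewed CD4 counts, not the clinic or MACS data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cd4 = rng.gamma(shape=3.0, scale=150.0, size=500)   # synthetic CD4 counts (cells/uL)

stat, p = stats.normaltest(cd4)                     # D'Agostino-Pearson normality test
print(f"raw counts:     p = {p:.3g}")               # small p => non-Gaussian
stat_t, p_t = stats.normaltest(np.sqrt(cd4))        # square-root transform is one common fix
print(f"sqrt transform: p = {p_t:.3g}")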
Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era, characterized by a new class of drugs and new technology, changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.
In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was the presence of an AIDS-defining illness. A high level of clinical failure, or progression to an endpoint, was found.
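The abstract does not name the estimator; the Kaplan-Meier product-limit estimate is the usual non-parametric choice for time to an AIDS-defining illness in the presence of censoring. A self-contained sketch with made-up follow-up times:

import numpy as np

def kaplan_meier(time, event):
    # product-limit survival estimate; event = 1 at an AIDS-defining illness, 0 if censored
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time)
    time, event = time[order], event[order]
    s, curve = 1.0, []
    for t in np.unique(time[event == 1]):
        at_risk = np.sum(time >= t)                 # patients still under observation at t
        failures = np.sum((time == t) & (event == 1))
        s *= 1.0 - failures / at_risk               # conditional survival at this event time
        curve.append((t, s))
    return curve

# hypothetical follow-up times (months) and endpoint indicators
times  = [6, 12, 12, 18, 24, 30, 36, 36, 48, 60]
events = [1,  1,  0,  1,  0,  1,  1,  0,  1,  0]
for t, s in kaplan_meier(times, events):
    print(f"t = {t:>2} months   S(t) = {s:.2f}")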
Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.
The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated: the direct lifetime cost of treating each HIV-infected patient with HAART is estimated to be between $353,000 and $598,000, depending on how long HAART prolongs life. If one looks at the incremental cost per year of life saved, it is only $101,000. This is comparable to the incremental cost per year of life saved from coronary artery bypass surgery.
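The cost-per-life-year figure above is a simple ratio; the sketch below shows its form using the lifetime-cost range quoted in the abstract, with the assumed years of life gained as purely illustrative placeholders rather than the study's estimates.

def cost_per_life_year(incremental_cost, years_of_life_gained):
    # incremental cost-effectiveness ratio: extra dollars spent per extra year of life
    return incremental_cost / years_of_life_gained

# lifetime HAART costs from the abstract; years gained are hypothetical placeholders
for lifetime_cost, years_gained in [(353_000, 3.5), (598_000, 6.0)]:
    ratio = cost_per_life_year(lifetime_cost, years_gained)
    print(f"${lifetime_cost:,} over {years_gained} extra years -> ${ratio:,.0f} per life-year")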
Policymakers need to be aware that although HAART can delay disease progression, it is not a cure, and the HIV epidemic is not over. The results presented here suggest that the decreases in morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.
Abstract:
Nucleic acids are a useful substrate for engineering at the molecular level. Designing the detailed energetics and kinetics of interactions between nucleic acid strands remains a challenge. Building on previous algorithms to characterize the ensemble of dilute solutions of nucleic acids, we present a design algorithm that allows optimization of structural features and binding energetics of a test tube of interacting nucleic acid strands. We extend this formulation to handle multiple thermodynamic states and combinatorial constraints to allow optimization of pathways of interacting nucleic acids. In both design strategies, low-cost estimates to thermodynamic properties are calculated using hierarchical ensemble decomposition and test tube ensemble focusing. These algorithms are tested on randomized test sets and on example pathways drawn from the molecular programming literature. To analyze the kinetic properties of designed sequences, we describe algorithms to identify dominant species and kinetic rates using coarse-graining at the scale of a small box containing several strands or a large box containing a dilute solution of strands.
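One of the low-cost thermodynamic quantities that such design algorithms optimize is the ensemble defect: the expected number of nucleotides that are incorrectly paired relative to the target structure, computed from the equilibrium base-pairing probability matrix. A toy sketch (the probability matrix here is invented; a real workflow would obtain it from a partition-function calculation over the test tube ensemble):

import numpy as np

def ensemble_defect(P, target_pairs, n):
    # P[i, j]: equilibrium probability that bases i and j are paired
    # target_pairs: dict i -> j for every base pair in the target structure
    p_unpaired = 1.0 - P.sum(axis=1)                # probability each base is unpaired
    correct = 0.0
    for i in range(n):
        j = target_pairs.get(i)
        correct += P[i, j] if j is not None else p_unpaired[i]
    return n - correct                              # expected # of bases in the wrong state

# toy 4-nt example: target pairs base 0 with base 3, bases 1 and 2 unpaired
n = 4
P = np.zeros((n, n))
P[0, 3] = P[3, 0] = 0.9                             # made-up pair probability
print(ensemble_defect(P, {0: 3, 3: 0}, n))          # 4 - (0.9 + 1.0 + 1.0 + 0.9) = 0.2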
Abstract:
A new type of wave-front analysis method for the collimation testing of laser beams is proposed. A concept of wave-front height is defined, and, on this basis, the wave-front analysis method of circular aperture sampling is introduced. The wave-front height of the tested non-collimated wave can be estimated from the distance between two identical fiducial diffraction planes of the sampled wave, and then the divergence is determined. The design is detailed and the experiment is demonstrated; the principle and experimental results of the method are presented. Owing to its simplicity and low cost, the method is promising for checking the collimation of a laser beam with a large divergence. © 2005 Optical Society of America.
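The abstract does not give the working formulas; as a rough illustration of the geometry involved (not the circular-aperture-sampling method itself), the sag of a spherical wavefront over an aperture of diameter D implies a radius of curvature R of about D^2/(8h) and hence a full divergence angle of roughly D/R:

def divergence_from_sag(sag_m, aperture_m):
    # rough spherical-wavefront estimate: wave-front height (sag) -> divergence (rad)
    radius = aperture_m ** 2 / (8.0 * sag_m)        # R ~ D^2 / (8 h) for small sag
    return aperture_m / radius                      # full angle ~ D / R = 8 h / D

print(divergence_from_sag(2e-6, 10e-3))             # ~1.6 mrad for 2 um sag over 10 mm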
Abstract:
STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models using this software was difficult. SteelConverter was created as a means to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, fixity, etc., into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher, as well as the level of confidence in the model being analyzed, is greatly increased.
It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferred. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users could log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the utilization of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles where defaults associated with their most commonly run models are saved, and allows them to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also created a means for engineers and researchers with no access to powerful computer clusters to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.
In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was done between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties, such as elastic stiffness and damping through a free vibration analysis, as well as more complex structural properties, such as overall structural capacity through a pushover analysis. These analyses showed a very strong agreement between the two software packages on every aspect of each analysis. However, they also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in a software package more capable of conducting highly nonlinear analysis, called Perform. These analyses again showed a very strong agreement between the two packages in every aspect of each analysis through instability. However, due to some limitations in Perform, free vibration analyses for the three-story one-bay chevron brace frame, the two-bay chevron brace frame, and the twenty-story moment frame could not be conducted. With the current trend towards ultimate capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building’s behavior under these extreme load scenarios.
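The free-vibration check mentioned above, recovering damping and elastic stiffness from a displacement record, can be sketched in a few lines (synthetic single-degree-of-freedom response, not STEEL or ETABS output); the damping ratio comes from the logarithmic decrement between successive peaks and the stiffness from the damped period with an assumed mass:

import numpy as np

# synthetic free-vibration displacement of a damped SDOF oscillator
mass, zeta_true, wn = 1.0e3, 0.05, 2 * np.pi * 1.5          # kg, damping ratio, rad/s
t = np.linspace(0.0, 10.0, 5000)
wd = wn * np.sqrt(1 - zeta_true ** 2)
u = np.exp(-zeta_true * wn * t) * np.cos(wd * t)

# successive positive peaks of the decay trace
peaks = [i for i in range(1, len(u) - 1) if u[i - 1] < u[i] > u[i + 1] and u[i] > 0]
delta = np.log(u[peaks[0]] / u[peaks[1]])                   # logarithmic decrement
zeta_est = delta / np.sqrt(4 * np.pi ** 2 + delta ** 2)     # estimated damping ratio
Td = t[peaks[1]] - t[peaks[0]]                              # damped period
wn_est = 2 * np.pi / (Td * np.sqrt(1 - zeta_est ** 2))
k_est = mass * wn_est ** 2                                  # elastic stiffness (mass assumed known)
print(zeta_est, k_est)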
Following this, a final study was done on Hall’s U20 structure [1], in which the structure was analyzed in all three software packages and the results compared. The pushover curves from each package were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, following the onset of inelastic behavior the analysis tool failed to converge. However, for the small number of time steps over which the ETABS analysis was converging, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not exactly match that of STEEL, particularly post-collapse. However, such problems could be alleviated by choosing a simpler material model.
Abstract:
The dispersion compensation effect of the chirped fiber grating (CFG) is analyzed theoretically, and analytic expressions are derived for composite second-order (CSO) distortion in analog modulated sub-carrier multiplexed (AM-SCM) cable television (CATV) systems with externally and directly modulated transmitters. Simulations are given for the two kinds of modulation and for standard single-mode fiber and non-zero dispersion-shifted fiber (NZDSF) systems. The results show that the CFG can be used as a dispersion compensator in directly modulated systems, but its dispersion coefficient must be adjusted much more precisely than in the externally modulated system; the requirements for the NZDSF system can be loosened considerably. It is proposed that a directly modulated source may be used as a transmitter in CATV systems, combined with a precisely adjusted tunable CFG dispersion compensator, which may be more cost-effective than external modulation technology. (c) 2006 Elsevier GmbH. All rights reserved.
Abstract:
Large numbers of fishing vessels operating from ports in Latin America participate in surface longline fisheries in the eastern Pacific Ocean (EPO), and several species of sea turtles inhabit the grounds where these fleets operate. The endangered status of several sea turtle species, and the success of circle hooks (‘treatment’ hooks) in reducing turtle hookings in other ocean areas compared to J-hooks and Japanese-style tuna hooks (‘control’ hooks), prompted the initiation of a hook exchange program on the west coast of Latin America, the Eastern Pacific Regional Sea Turtle Program (EPRSTP). One of the goals of the EPRSTP is to determine whether circle hooks would be effective at reducing turtle bycatch in artisanal fisheries of the EPO without significantly reducing the catch of marketable fish species. Participating fishers were provided with circle hooks at no cost and asked to replace the J/Japanese-style tuna hooks on their longlines with circle hooks in an alternating manner. Data collected by the EPRSTP show differences in longline gear and operational characteristics within and among countries. These aspects of the data, in addition to difficulties encountered with implementation of the alternating-hook design, pose challenges for the analysis of these data.
Abstract:
DNA techniques are increasingly used as diagnostic tools in many fields and venues. In particular, a relatively new application is their use as a check for proper advertisement in markets and on restaurant menus. The identification of fish from markets and restaurants is a growing problem because economic practices often render it cost-effective to substitute one species for another. DNA sequences that are diagnostic for many commercially important fishes are now documented in public databases, such as the National Center for Biotechnology Information’s (NCBI) GenBank. It is now possible for most genetics laboratories to identify the species from which a tissue sample was taken without sequencing all the possible taxa it might represent.
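In practice the comparison against GenBank is often automated as a BLAST search; a minimal sketch using Biopython's web-BLAST interface is shown below. The query fragment is a made-up placeholder, and the call requires network access to NCBI.

from Bio.Blast import NCBIWWW, NCBIXML

# placeholder barcode-like fragment; in a real check this would be the sequence
# obtained from the market or restaurant tissue sample
query_seq = "ACTATACCTAATTTTCGGCGCATGAGCTGGAGTCCTAGGCACAGCCCTAAGCCTCCTC"

handle = NCBIWWW.qblast("blastn", "nt", query_seq)          # remote BLAST against GenBank nt
record = NCBIXML.read(handle)
for alignment in record.alignments[:3]:                     # top database matches
    hsp = alignment.hsps[0]
    identity = 100.0 * hsp.identities / hsp.align_length
    print(f"{alignment.title[:60]}  identity={identity:.1f}%  E={hsp.expect:.2g}")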
Abstract:
Economic analysis of the trawl fishery of Brunei Darussalam was conducted using cost and returns analysis and based on an economic survey of trawlers and B:RUN, a low-level geographic information system. Profitability indicators were generated for the trawl fleet under various economic and operational scenarios. The results show that financial profits are earned by trawlers which operate off Muara, particularly those with high vessel capacity, and that these profits could be further enhanced. On the other hand, a similar fleet operating off Tutong would generate profits due mainly to high fish biomass. Trawling operations offshore are deemed financially unfeasible. Incorporating realistic opportunity costs and externalities for existing trawl operations off Muara results in economic losses.
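The distinction drawn above between financial profit and economic profit (the latter subtracting opportunity costs and externalities) reduces to a few lines; all figures below are made-up per-vessel placeholders, not the survey or B:RUN values:

def financial_profit(revenue, operating_cost, fixed_cost):
    return revenue - operating_cost - fixed_cost

def economic_profit(revenue, operating_cost, fixed_cost, opportunity_cost, externalities):
    # subtract the opportunity cost of capital and labour plus external costs
    return financial_profit(revenue, operating_cost, fixed_cost) - opportunity_cost - externalities

# hypothetical annual figures per trawler, for illustration only
rev, opex, fixed = 180_000.0, 110_000.0, 30_000.0
print(financial_profit(rev, opex, fixed))                     # positive financial profit
print(economic_profit(rev, opex, fixed, 35_000.0, 15_000.0))  # becomes an economic loss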
Abstract:
The recent development of the pop-up satellite archival tag (PSAT) has allowed the collection of information on a tagged animal, such as geolocation, pressure (depth), and ambient water temperature. The success of early studies, where PSATs were used on pelagic fishes, has spurred increasing interest in the use of these tags on a large variety of species and age groups. However, some species and age groups may not be suitable candidates for carrying a PSAT because of the relatively large size of the tag and the consequent energy cost to the study animal. We examined the potential energetic cost of carrying a tag for the cownose ray (Rhinoptera bonasus). Two forces act on an animal tagged with a PSAT: lift from the PSAT's buoyancy and drag as the tag is moved through the water column. In a freshwater flume, a spring scale measured the total force exerted by a PSAT at flume velocities from 0.00 to 0.60 m/s. By measuring the angle of deflection of the PSAT at each velocity, we separated the total force into its constituent lift and drag forces. The power required to carry a PSAT horizontally through the water was then calculated from the drag force and velocity. Using published metabolic rates, we calculated the power for a ray of a given size to swim at a specified velocity (i.e., its swimming power). For each velocity, the power required to carry a PSAT was compared to the swimming power and expressed as a percentage, %TAX (Tag Altered eXertion). A %TAX greater than 5% was considered energetically significant. Our analysis indicated that a ray larger than 14.8 kg can carry a PSAT without exceeding this criterion. This method of estimating swimming power can be applied to other species and would allow a researcher to decide the suitability of a given study animal for tagging with a PSAT.
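The force decomposition and %TAX comparison described above amount to a few lines of trigonometry; the sketch below uses illustrative numbers (the measured force, deflection angle and swimming-power value are placeholders, not the flume measurements):

import math

def decompose(total_force_N, deflection_deg):
    # split the measured tether force into vertical (lift) and horizontal (drag) parts,
    # taking the deflection angle as measured from vertical
    theta = math.radians(deflection_deg)
    return total_force_N * math.cos(theta), total_force_N * math.sin(theta)

def percent_tax(drag_N, velocity_ms, swimming_power_W):
    tag_power = drag_N * velocity_ms                # power needed to tow the tag horizontally
    return 100.0 * tag_power / swimming_power_W

# placeholder values for illustration
lift, drag = decompose(total_force_N=0.6, deflection_deg=40.0)
tax = percent_tax(drag, velocity_ms=0.5, swimming_power_W=4.0)
print(f"lift = {lift:.2f} N, drag = {drag:.2f} N, %TAX = {tax:.1f}%")   # >5% would be significant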