889 results for Value analysis (Cost control)


Relevance:

40.00%

Publisher:

Abstract:

Contributed to: "Measuring the Changes": 13th FIG International Symposium on Deformation Measurements and Analysis; 4th IAG Symposium on Geodesy for Geotechnical and Structural Engineering (Lisbon, Portugal, May 12-15, 2008).

Relevance:

40.00%

Publisher:

Abstract:

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients, in order to contribute to improving the diagnosis of AD and the assessment of its severity. To this end, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two human dimensions were analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, were explored. The approach is non-invasive, low-cost, and free of side effects. The experimental results obtained were very satisfactory and promising for early diagnosis and classification of AD patients.
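A minimal sketch of the classification step described above, assuming a precomputed feature matrix; the synthetic data, the feature count, and the scikit-learn MLPClassifier are illustrative stand-ins, not the authors' implementation:

```python
# Sketch: ANN classification of speech features into AD vs. control.
# X and y are synthetic placeholders; in the study, features (linear and
# non-linear, e.g. Fractal Dimension) come from Spontaneous Speech and
# Emotional Response recordings.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))     # 60 speakers x 8 speech features (synthetic)
y = rng.integers(0, 2, size=60)  # 0 = control, 1 = suspected AD (synthetic)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(f"mean CV accuracy: {scores.mean():.2f}")
```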

Relevance:

40.00%

Publisher:

Abstract:

A sliding mode position control for high-performance real-time applications of induction motors is developed in this work. The design also incorporates a simple flux estimator in order to avoid flux sensors. The proposed control scheme therefore has a low computational cost and can be implemented easily in real-time applications using a low-cost DSP processor. The stability analysis of the controller under parameter uncertainties and load disturbances is provided using Lyapunov stability theory. Finally, simulated and experimental results show that the proposed controller with the proposed observer provides good trajectory tracking, and that this scheme is robust with respect to plant parameter variations and external load disturbances.
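A minimal sketch of a sliding mode position controller on a generic second-order plant; the plant model, gains, and boundary-layer width are illustrative assumptions, not the paper's induction motor model or flux estimator:

```python
# Sketch: sliding mode position control with a boundary layer.
# Toy plant (unit inertia, viscous friction) stands in for the motor.
import numpy as np

lam, K, phi = 5.0, 2.0, 0.05   # surface slope, switching gain, boundary layer
dt = 1e-3                      # integration step (s)
x, v = 0.0, 0.0                # position and velocity
x_ref = 1.0                    # step position command

for _ in range(5000):          # 5 s of simulated time
    e, de = x_ref - x, -v      # tracking error and its derivative
    s = de + lam * e           # sliding surface s = de + lam*e
    u = K * np.clip(s / phi, -1.0, 1.0)  # saturated sign(s) limits chattering
    a = u - 0.1 * v            # toy plant: unit inertia, viscous friction
    v += a * dt
    x += v * dt

print(f"final position: {x:.3f} (target {x_ref})")
```

The saturated sign function (boundary layer) is a common way to reduce the chattering inherent in switching control; it is used here purely for illustration.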

Relevance:

40.00%

Publisher:

Abstract:

The study was designed to investigate the value of the peels of yam (Dioscorea rotundata) as an energy source in the diet of Oreochromis niloticus fry, and to determine the level of inclusion of these peels that gives optimum growth performance. Four diets (three levels of yam peels and a control) were prepared and tested on O. niloticus fry (mean weight 0.27 g) for ten weeks. Fifteen (15) O. niloticus fry were stocked in each glass aquarium, measuring 60 x 30 x 30 cm with a maximum capacity of 52 liters of water. The fry were fed twice daily at 10% of biomass, and were weighed weekly to determine weight changes, with the quantity of feed adjusted accordingly. DT1 (70% yam peels and 30% yellow maize in the carbohydrate mixture) gave the best performance: fry fed this diet gained a mean weight of 1.20 g over the period of the experiment. The poorest growth performance was from fry fed the control diet (100% yellow maize in the carbohydrate mixture), which gained a mean weight of 0.80 g over the duration of the experiment. Analysis of growth indices such as SGR, PER, FCR and NPU showed that DT1 was the overall best diet, with an SGR value of 1.92 and an FCR of 54.10. The difference in weight gain between fry fed the three levels of yam peels diets and the control diet (100% yellow maize) was not statistically significant (P>0.05).
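For reference, the growth indices cited (SGR, FCR, PER) have standard definitions that can be computed as below. Apart from the reported initial weight, weight gain, and trial length, the inputs are hypothetical placeholders, so the outputs will not reproduce the study's reported figures:

```python
# Standard fish-nutrition growth indices; feed and protein inputs are
# hypothetical, not the study's raw data.
import math

w_initial = 0.27    # mean initial weight (g), as reported
w_final = 1.47      # final weight (g): initial + 1.20 g reported gain
days = 70           # ten-week trial
feed_given = 10.0   # hypothetical total dry feed per fish (g)
protein_fed = 3.0   # hypothetical protein intake per fish (g)

gain = w_final - w_initial
sgr = 100 * (math.log(w_final) - math.log(w_initial)) / days  # % body wt/day
fcr = feed_given / gain         # feed consumed per unit weight gain
per = gain / protein_fed        # weight gain per unit protein fed

print(f"SGR = {sgr:.2f} %/day, FCR = {fcr:.2f}, PER = {per:.2f}")
```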

Relevance:

40.00%

Publisher:

Abstract:

How is climate change affecting our coastal environment? How can coastal communities adapt to sea level rise and increased storm risk? These questions have garnered tremendous interest from scientists and policy makers alike, as the dynamic coastal environment is particularly vulnerable to the impacts of climate change. Over half the world population lives and works in a coastal zone less than 120 miles wide, and is thereby continuously affected by changes in the coastal environment [6]. Housing markets are directly influenced by the physical processes that govern coastal systems. Beach towns like Oak Island in North Carolina (NC) face severe erosion, and the tax-assessed value of one coastal property fell by 93% in 2007 [9]. With almost ninety percent of the sandy beaches in the US facing moderate to severe erosion [8], coastal communities often intervene to stabilize the shoreline and hold back the sea in order to protect coastal property and infrastructure. Beach nourishment, the process of rebuilding a beach by periodically replacing an eroding section with sand dredged from another location, is a policy for erosion control in many parts of the US Atlantic and Pacific coasts [3]. Beach nourishment projects in the United States are primarily federally funded and implemented by the Army Corps of Engineers (ACE) after a benefit-cost analysis. Benefits from beach nourishment include reduced storm damage and recreational benefits from a wider beach. Costs include the expected cost of construction, the present value of periodic maintenance, and any external costs such as the environmental cost associated with a nourishment project (NOAA). Federal appropriations for nourishment totaled $787 million from 1995 to 2002 [10]. Human interventions to stabilize shorelines and physical coastal dynamics are strongly coupled. The value of the beach, in the form of storm protection and recreation amenities, is at least partly capitalized into property values. These beach values ultimately influence the benefit-cost analysis in support of shoreline stabilization policy, which, in turn, affects the shoreline dynamics. This paper explores the policy implications of this circularity. With a better understanding of the physical-economic feedbacks, policy makers can more effectively design climate change adaptation strategies. (PDF contains 4 pages)
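A minimal sketch of the present-value cost side of such a benefit-cost analysis; all figures (costs, renourishment interval, horizon, discount rate) are hypothetical:

```python
# Sketch: present value of initial construction plus periodic
# renourishment costs for a beach nourishment project.
initial_cost = 10e6     # hypothetical initial construction cost ($)
renourish_cost = 6e6    # hypothetical cost per periodic renourishment ($)
interval = 5            # renourishment interval (years)
horizon = 50            # planning horizon (years)
r = 0.03                # annual discount rate

pv = initial_cost + sum(
    renourish_cost / (1 + r) ** t
    for t in range(interval, horizon + 1, interval)
)
print(f"present value of nourishment costs: ${pv / 1e6:.1f}M")
```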

Relevance:

40.00%

Publisher:

Abstract:

The control role of the relative phase between the probe and driving fields on the gain and dispersion in an open Lambda-type inversionless lasing system with spontaneously generated coherence (SGC) is investigated. It is shown that the inversionless gain and dispersion are quite sensitive to variation in the relative phase; by adjusting the value of the relative phase, electromagnetically induced transparency (EIT), a high refractive index with zero absorption, and a larger inversionless gain can be realized. It is also shown that, in the contributions to the inversionless gain (absorption) and dispersion, the contribution from SGC is always much larger than that from the dynamically induced coherence for any value of the relative phase. Our analysis shows that variation in the SGC effect causes the spectral regions and values of the inversionless gain and dispersion to vary markedly. We also find that, under the same conditions, the values of the inversionless gain and dispersion in the open system are markedly larger than those in the corresponding closed system; EIT occurs in the open system but cannot occur in the closed system.

Relevance:

40.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, growing from patient zero to an estimated 650,000 to 900,000 Americans now infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). Using statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where the treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database was collected by the author from an outpatient HIV clinic and spans 1984 to the present. The second database is the public dataset from the Multicenter AIDS Cohort Study (MACS), covering the period between 1984 and October 1992. Comparisons are made between both datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian; thus, distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic, and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.
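A sketch of the kind of distributional check behind the non-Gaussianity claim, using a Shapiro-Wilk test on synthetic right-skewed counts (not the clinic or MACS data):

```python
# Sketch: test whether CD4-like counts are plausibly Gaussian.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
cd4 = rng.lognormal(mean=6.0, sigma=0.5, size=500)  # synthetic, right-skewed

stat, p = stats.shapiro(cd4)  # Shapiro-Wilk test of normality
print(f"Shapiro-Wilk W = {stat:.3f}, p = {p:.2e}")
# A small p-value rejects normality, motivating non-parametric or
# transformed-scale analyses of this marker.
```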

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors, which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded, a new era, characterized by a new class of drugs and new technology, changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attack a different region of HIV than reverse transcriptase inhibitors, were found, when used in combination with other antiretroviral agents, to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system, as measured by CD4 T cell counts, would be able to recover; if viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is, bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, the CD4 T cell increase was not large enough to be consistent with the hypothesis of immune reconstitution.
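A hedged sketch of how such an immune-reconstitution hypothesis could be tested: compare on-HAART CD4 counts against an uninfected reference level with a non-parametric test (appropriate given the non-Gaussian marker). The data and the reference value are hypothetical, and the dissertation's actual test may differ:

```python
# Sketch: one-sided test of whether on-HAART CD4 counts reach an
# uninfected reference level; data and reference are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
cd4_on_haart = rng.lognormal(mean=5.9, sigma=0.5, size=200)  # synthetic
reference = 1000.0  # hypothetical uninfected median CD4 (cells/uL)

# Wilcoxon signed-rank test: are on-HAART counts below the reference?
stat, p = stats.wilcoxon(cd4_on_haart - reference, alternative="less")
print(f"W = {stat:.0f}, p = {p:.2e}")
# A significant result means counts remain below the uninfected level,
# i.e. evidence against full immune reconstitution.
```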

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
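A minimal sketch of this kind of survival analysis using the lifelines package; the durations and event indicators are synthetic, not the 213-patient clinical data:

```python
# Sketch: Kaplan-Meier estimate of time to an AIDS-defining illness.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
durations = rng.exponential(scale=36.0, size=213)  # months on HAART (synthetic)
events = rng.integers(0, 2, size=213)              # 1 = AIDS-defining illness

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="long-term HAART")
print(kmf.median_survival_time_)  # median time to clinical failure (months)
```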

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, failed to control viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs for HAART are estimated: the direct lifetime cost of treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved is only $101,000, which is comparable to the incremental cost per year of life saved by coronary artery bypass surgery.
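The incremental cost-effectiveness arithmetic behind a figure like $101,000 per life-year is straightforward; in this sketch the comparator cost and survival gain are hypothetical placeholders chosen only to reproduce the quoted ratio:

```python
# Sketch: incremental cost-effectiveness ratio (ICER) arithmetic.
cost_haart = 598_000     # lifetime cost with HAART ($, upper estimate quoted)
cost_no_haart = 295_000  # hypothetical lifetime cost without HAART ($)
life_years_gained = 3.0  # hypothetical additional years of life from HAART

icer = (cost_haart - cost_no_haart) / life_years_gained
print(f"incremental cost per year of life saved: ${icer:,.0f}")
```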

Policymakers need to be aware that although HAART can delay disease progression, it is not a cure, and the HIV epidemic is not over. The results presented here suggest that the decreases in morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have come from the dramatic decreases in the incidence of AIDS-defining opportunistic infections. As the patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.

Relevance:

40.00%

Publisher:

Abstract:

Earthquake early warning (EEW) systems have developed rapidly over the past decade. The Japan Meteorological Agency (JMA) EEW system was operating during the 2011 M9 Tohoku earthquake in Japan, which increased awareness of EEW systems around the world. While longer-term earthquake prediction still faces many challenges before it becomes practical, the availability of short-term EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system utilizes the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time, and expected shaking intensity level around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit the scope for human intervention to activate mitigation actions, and they must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach, along with machine learning techniques and decision theory from economics, to improve different aspects of EEW operation, including extending it to engineering applications.

Existing EEW systems are often based on a deterministic approach and typically assume that only a single event occurs within a short period of time, an assumption that led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm, built on an existing deterministic model, that extends the EEW system to the case of concurrent events, which are often observed during the aftershock sequence following a large earthquake.

To overcome the challenges of uncertain information and the short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications. A cost-benefit model that can capture the uncertainties in the EEW information and in the decision process is used. This approach, called Performance-Based Earthquake Early Warning, is based on the PEER Performance-Based Earthquake Engineering method. The use of surrogate models is suggested to improve computational efficiency, and new models are proposed to add the influence of lead time to the cost-benefit analysis. For example, a value-of-information model is used to quantify the potential value of delaying the activation of a mitigation action in exchange for a possible reduction in the uncertainty of the EEW information at the next update. Two practical examples, evacuation alerts and elevator control, are studied to illustrate the ePAD framework. Potential advanced EEW applications, such as multiple-action decisions and the synergy of EEW and structural health monitoring systems, are also discussed.
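A minimal sketch of the expected-cost decision rule that underlies such a cost-benefit framework: activate a mitigation action if and only if its expected cost is lower than that of waiting, given the warning's probabilistic shaking forecast. All probabilities and costs are hypothetical, and this is not the ePAD implementation:

```python
# Sketch: expected-cost comparison for an EEW mitigation decision.
p_strong = 0.30       # warning-derived probability of damaging shaking
cost_action = 1.0     # cost of activating mitigation (e.g. stopping elevators)
loss_if_hit = 10.0    # loss if strong shaking arrives with no mitigation
loss_mitigated = 2.0  # residual loss if mitigation was activated in time

expected_cost_act = cost_action + p_strong * loss_mitigated
expected_cost_wait = p_strong * loss_if_hit

decision = "activate" if expected_cost_act < expected_cost_wait else "wait"
print(f"{decision}: E[cost|act] = {expected_cost_act:.2f}, "
      f"E[cost|wait] = {expected_cost_wait:.2f}")
```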

Relevance:

40.00%

Publisher:

Abstract:

This study examined the sea cucumber industry in the Philippines through the value chain lens. The intent was to identify effective pathways for the successful introduction of sandfish culture as livelihood support for coastal communities. Value chain analysis is a high-resolution analytical tool that enables industry examination at a detailed level. Previous industry assessments have provided a general picture of the sea cucumber industry in the country. The present study builds on the earlier work and supplies additional details for a better understanding of the industry's status and problems, especially their implications for the Australian Centre for International Agricultural Research (ACIAR) funded sandfish project "Culture of sandfish (Holothuria scabra) in Asia-Pacific" (FIS/2003/059). (PDF contains 54 pages)