895 results for Cost of merchandising


Relevance:

90.00%

Publisher:

Abstract:

The study examined the sustainability of various indigenous technologies in post-harvest fishery operations in Edo and Delta States (Nigeria). A total of seventy processors, selected at random, were interviewed during the survey. The data obtained were analysed by descriptive statistics. The results revealed that the majority of the fish processors in the study areas were married women who had not been educated beyond the First School Leaving Certificate. Most of the fish processed were bought fresh, and the commonest method of preservation/processing practiced was smoking. The processing equipment used comprised the Chorkor smoking kiln and the drum smoker, while the commonest source of energy was firewood. The processing activities within the communities were found to be profitable. However, it was observed that, owing to the high cost of processing materials and equipment, economic growth and living standards remained quite low. Some recommendations were made to improve the traditional methods of fish preservation and processing.

Relevance:

90.00%

Publisher:

Abstract:

Fish smoking, a traditional occupation of fishermen and women in the Kainji Lake area (Nigeria), is done using simple traditional ovens called 'Banda', with the fuel for smoking being almost one hundred percent wood. A simple modification was made to the traditional 'Banda' oven, adding a damper to prevent burning of the fish, and the improved and traditional versions were compared. The results indicate that fuel wood consumption was reduced by 52 percent with the improved 'Banda', implying that about 50 percent of a fish processor's income could be saved through the adoption of this technology. The most important advantage of the improved kiln, fuel wood conservation, addresses a problem of real economic importance for fishers: whilst they are aware that it is becoming much more difficult to obtain the needed fuel wood, the children can still conveniently collect enough wood for both home use and processing activities. The cost of the components of the improved kiln, compared with the traditional version, may be considered quite significant, hence the reluctance of fish processors to construct similar ones. Selected blacksmiths were trained to continue fabricating the kiln components, so that the improved kiln can still be constructed after the project's support for fabrication ends.

Relevance:

90.00%

Publisher:

Abstract:

The calculation of the settling speed of coarse particles is first addressed with accelerated Stokesian dynamics without adjustable parameters, in which the far-field force acting on a particle, rather than the particle velocity, is chosen as the dependent variable in order to account for inter-particle hydrodynamic interactions. The sedimentation of a simple cubic array of spherical particles is simulated and compared with available results to verify and validate the numerical code and computational scheme. The improved method keeps the same computational cost of order O(N log N) as the usual accelerated Stokesian dynamics. Then, more realistic random suspension sedimentation is investigated with the help of the Monte Carlo method. The computational results agree well with experimental fits. Finally, the sedimentation of finer cohesive particles, often observed in estuary environments, is presented as a further application in coastal engineering.
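
The single-particle limit such simulations are benchmarked against is the classical Stokes terminal velocity. A minimal sketch follows; the paper's accelerated Stokesian dynamics adds inter-particle hydrodynamic interactions on top of this dilute-limit formula, and the material properties below are illustrative, not taken from the study:

```python
# Terminal settling velocity of an isolated sphere in the Stokes
# (creeping-flow) regime: v = (2/9) * (rho_p - rho_f) * g * r^2 / mu.
# Valid only at low particle Reynolds number; dense suspensions need
# the hydrodynamic corrections the paper computes.

def stokes_settling_velocity(radius, rho_particle, rho_fluid, viscosity, g=9.81):
    """Terminal velocity (m/s) of a sphere settling in a viscous fluid."""
    return (2.0 / 9.0) * (rho_particle - rho_fluid) * g * radius ** 2 / viscosity

# Illustrative: a 50-micron quartz grain (~2650 kg/m^3) in water.
v = stokes_settling_velocity(50e-6, 2650.0, 1000.0, 1.0e-3)
print(f"{v * 1000:.2f} mm/s")  # -> 8.99 mm/s
```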

Relevance:

90.00%

Publisher:

Abstract:

Despite an increasing literature focus on climate change adaptation, the facilitation of this adaptation is occurring on a limited basis (Adger et al. 2007). This limited basis is not necessarily due to inability; rather, a lack of comprehensive cost estimates of all options specifically hinders adaptation in vulnerable communities (Adger et al. 2007). In particular, the estimated cost of the climate change impact of sea-level rise is continually increasing, due both to increasing rates of rise and to the resulting multiplicative impact of coastal erosion (Karl et al. 2009; Zhang et al. 2004). Based on the 2007 Intergovernmental Panel on Climate Change report, minority groups and small island nations have been identified among these vulnerable communities; the development of adaptation policies therefore requires the engagement of these communities. State examples of sea-level rise adaptation through land use planning mechanisms, such as land acquisition programs (New Jersey) and the establishment of rolling easements (Texas), are evidence that, although obscured, adaptation opportunities are being acted upon (Easterling et al. 2004; Adger et al. 2007). (PDF contains 4 pages)

Relevance:

90.00%

Publisher:

Abstract:

The World Food Summit, at its meeting in Rome in 1999, estimated that 790 million people in the developing world do not have enough food to eat, more than the total populations of North America and Europe combined. Nigeria is one of the developing countries affected by hunger, deprivation and abject poverty among its citizenry, in spite of its enormous natural and human resources. To reduce poverty and increase food supplies to the masses, the Federal Government of Nigeria embarked on a programme tagged the National Special Programme for Food Security (NSPFS) in 2002. The programme's broad objectives are to attain food security in the broadest sense and to alleviate rural poverty in Nigeria. One area of the programme's intervention is aquaculture and inland fisheries development, because Nigeria imported 681mt of fish in 2003 at a total cost of about N50 million. The paper assesses the socio-economic conditions of one of the selected water bodies (Yamama Lake) with a view to introducing a community-based fisheries management plan for the rational exploitation and management of the fishery and other aquatic resources of the water body, thereby increasing fish supply and improving the living standard of the fisherfolk in the area. Data were collected using Participatory Rural Appraisal (PRA) tools and questionnaire administration.

Relevance:

90.00%

Publisher:

Abstract:

89 ripe female brooders of the catfish Clarias anguillaris (body weight range 150g-1,200g) were induced to spawn by a hormone-induced (Ovaprim) natural spawning technique over a period of 10 weeks. Matching ripe males were used for pairing the females at a ratio of two males to a female. Six ranges of brood stock body weight were considered: <200g; 200g-399g; 400g-599g; 600g-799g; 800g-999g; >1000g; and the number of fry produced by each female brooder was recorded against the corresponding body weight range. The number of fry per unit quantity of hormone and the cost of producing a fry, based on the current price of Ovaprim, were determined so as to ascertain the most economical size range. The best and most economical size range was 400g-599g body weight, with about 20,000 fry per ml of hormone and N0.028 per fry, while females above 1000g gave the poorest results of 9,519 fry per ml of hormone and N0.059 per fry. For optimum production of Clarias anguillaris fry and maximum return on investment, female brooders of body weights between 400g and 599g are recommended for hormone-induced natural breeding exercises.
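
The cost-per-fry figures follow directly from fry yield per ml of hormone and the price of Ovaprim. A small sketch of that arithmetic; note the N560/ml price used below is inferred from the abstract's own numbers (both reported size ranges are consistent with it) and is not stated in the study:

```python
# Cost per fry = hormone price per ml / fry produced per ml.
# PRICE_PER_ML is an inferred assumption, not a figure from the study.

def cost_per_fry(price_per_ml, fry_per_ml):
    return price_per_ml / fry_per_ml

PRICE_PER_ML = 560.0  # naira per ml of Ovaprim (inferred)

for size_range, fry in [("400g-599g", 20000), (">1000g", 9519)]:
    print(f"{size_range}: N{cost_per_fry(PRICE_PER_ML, fry):.3f} per fry")
```

This reproduces the reported N0.028 and N0.059 per fry for the two size ranges.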

Relevance:

90.00%

Publisher:

Abstract:

The study focused on men and women involved in artisanal fisheries in selected areas of Ikorodu Local Government in Lagos State. Random sampling was used to select 50 fishermen each at the Ibeshe and Baiyeku sites. The results revealed that the majority of the fishermen were male, Christian, semi-illiterate and married. Data were collected on capital sources, labour used, income, gear techniques and types of fish caught. Analysis showed that the largest source of capital was personal savings (50%). Most of the labour used was hired: 44% at Ibeshe and 50% at Baiyeku. The highest monthly incomes ranged between N10,000 and N25,000 at both sites. Planks were mostly used at both sites for fishing boats as well as for means of transport (Ibeshe 68%, Baiyeku 72%). The common fishing gear was the gill net, and the fishes caught were of various types; Ethmalosa fimbriata constituted the highest proportion of fish caught by weight and number at both sites (50%). However, problems of capital sourcing were widespread, coupled with the high cost of fishing materials and labour scarcity.

Relevance:

90.00%

Publisher:

Abstract:

This study examined the economic potential of fish farming in the Abeokuta zone of Ogun State in the 2003 production season. Descriptive statistics, cost-returns and multiple regression analyses were used in analyzing the data. The farmers predominantly practiced monoculture. The study revealed inefficiency in the use of pond size, lime and labour, with over-utilization of fingerlings stocked. The average variable cost of N124.67 constituted 45% of the total, while the average fixed cost was N149,802.67 per average farm size. Fish farming was found to be a profitable venture in the study area, with a net income of N761,400.58 for an average pond size of 301.47sq.m. Based on these findings, it is suggested that for profit maximization, fish farms should increase their use of fingerlings and fertilizers and decrease their use of lime, labour and pond size.

Relevance:

90.00%

Publisher:

Abstract:

The epidemic of HIV/AIDS in the United States is constantly changing and evolving, having grown from patient zero to an estimated 650,000 to 900,000 Americans infected. The nature and course of HIV changed dramatically with the introduction of antiretrovirals. This discourse examines many different facets of HIV, from the beginning, when there was no treatment for HIV, to the present era of highly active antiretroviral therapy (HAART). By utilizing statistical analysis of clinical data, this paper examines where we were, where we are, and projections as to where the treatment of HIV/AIDS is headed.

Chapter Two describes the datasets that were used for the analyses. The primary database, which I collected from an outpatient HIV clinic, spans 1984 to the present. The second database is the Multicenter AIDS Cohort Study (MACS) public dataset, covering 1984 to October 1992. Comparisons are made between the two datasets.

Chapter Three discusses where we were. Before the first anti-HIV drugs (called antiretrovirals) were approved, there was no treatment to slow the progression of HIV. The first generation of antiretrovirals, reverse transcriptase inhibitors such as AZT (zidovudine), DDI (didanosine), DDC (zalcitabine), and D4T (stavudine), provided the first treatment for HIV. The first clinical trials showed that these antiretrovirals had a significant impact on increasing patient survival. The trials also showed that patients on these drugs had increased CD4+ T cell counts. Chapter Three examines the distributions of CD4 T cell counts. The results show that the estimated distributions of CD4 T cell counts are distinctly non-Gaussian. Thus, distributional assumptions regarding CD4 T cell counts must be taken into account when performing analyses with this marker. The results also show that the estimated CD4 T cell distributions for each disease stage (asymptomatic, symptomatic and AIDS) are non-Gaussian. Interestingly, the distribution of CD4 T cell counts for the asymptomatic period is significantly below the CD4 T cell distribution for the uninfected population, suggesting that even in patients with no outward symptoms of HIV infection, there exist high levels of immunosuppression.

Chapter Four discusses where we are at present. HIV quickly grew resistant to reverse transcriptase inhibitors which were given sequentially as mono or dual therapy. As resistance grew, the positive effects of the reverse transcriptase inhibitors on CD4 T cell counts and survival dissipated. As the old era faded a new era characterized by a new class of drugs and new technology changed the way that we treat HIV-infected patients. Viral load assays were able to quantify the levels of HIV RNA in the blood. By quantifying the viral load, one now had a faster, more direct way to test antiretroviral regimen efficacy. Protease inhibitors, which attacked a different region of HIV than reverse transcriptase inhibitors, when used in combination with other antiretroviral agents were found to dramatically and significantly reduce the HIV RNA levels in the blood. Patients also experienced significant increases in CD4 T cell counts. For the first time in the epidemic, there was hope. It was hypothesized that with HAART, viral levels could be kept so low that the immune system as measured by CD4 T cell counts would be able to recover. If these viral levels could be kept low enough, it would be possible for the immune system to eradicate the virus. The hypothesis of immune reconstitution, that is bringing CD4 T cell counts up to levels seen in uninfected patients, is tested in Chapter Four. It was found that for these patients, there was not enough of a CD4 T cell increase to be consistent with the hypothesis of immune reconstitution.

In Chapter Five, the effectiveness of long-term HAART is analyzed. Survival analysis was conducted on 213 patients on long-term HAART. The primary endpoint was presence of an AIDS defining illness. A high level of clinical failure, or progression to an endpoint, was found.
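
The chapter does not name its estimator, but the standard non-parametric tool for time-to-event data of this kind is the Kaplan-Meier product-limit estimator. A minimal sketch with toy data (the patient numbers below are invented for illustration):

```python
# Kaplan-Meier product-limit estimator. events: 1 = endpoint reached
# (here, an AIDS-defining illness), 0 = right-censored follow-up.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time."""
    survival, curve = 1.0, []
    for t in sorted({ti for ti, e in zip(times, events) if e}):
        n = sum(1 for ti in times if ti >= t)                        # at risk just before t
        d = sum(1 for ti, e in zip(times, events) if ti == t and e)  # events at t
        survival *= 1.0 - d / n
        curve.append((t, survival))
    return curve

# Toy data: five patients, follow-up times in months.
print(kaplan_meier([2, 3, 3, 5, 8], [1, 0, 1, 1, 0]))
```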

Chapter Six yields insights into where we are going. New technology such as viral genotypic testing, which looks at the genetic structure of HIV and determines where mutations have occurred, has shown that HIV is capable of producing resistance mutations that confer multiple drug resistance. This section looks at resistance issues and speculates, ceteris paribus, on where the state of HIV is going. It first addresses viral genotype and the correlates of viral load and disease progression. A second analysis looks at patients who have failed their primary attempts at HAART and subsequent salvage therapy. It was found that salvage regimens, efforts to control viral replication through the administration of different combinations of antiretrovirals, were not effective in controlling viral replication in 90 percent of the population. Thus, primary attempts at therapy offer the best chance of viral suppression and delay of disease progression. Documentation of transmission of drug-resistant virus suggests that the public health crisis of HIV is far from over. Drug-resistant HIV can sustain the epidemic and hamper our efforts to treat HIV infection. The data presented suggest that the decrease in morbidity and mortality due to HIV/AIDS is transient. Deaths due to HIV will increase, and public health officials must prepare for this eventuality unless new treatments become available. These results also underscore the importance of the vaccine effort.

The final chapter looks at the economic issues related to HIV. The direct and indirect costs of treating HIV/AIDS are very high. For the first time in the epidemic, there exists treatment that can actually slow disease progression. The direct costs of HAART are estimated: the direct lifetime cost of treating each HIV-infected patient with HAART is between $353,000 and $598,000, depending on how long HAART prolongs life. The incremental cost per year of life saved is only $101,000, which is comparable with the incremental cost per year of life saved by coronary artery bypass surgery.
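
The incremental comparison rests on a standard ratio. A sketch with hypothetical round numbers (the comparator cost and life-years below are invented for illustration; only the roughly $101,000-per-life-year figure comes from the chapter):

```python
# Incremental cost-effectiveness ratio (ICER): extra cost of the new
# regimen per extra life-year gained over the comparator.

def icer(cost_new, cost_old, life_years_new, life_years_old):
    return (cost_new - cost_old) / (life_years_new - life_years_old)

# Hypothetical: HAART lifetime cost $505,000 vs $100,000 without,
# with 4 additional life-years gained.
print(f"${icer(505_000, 100_000, 10.0, 6.0):,.0f} per life-year")  # -> $101,250 per life-year
```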

Policy makers need to be aware that although HAART can delay disease progression, it is not a cure and HIV is not over. The results presented here suggest that the decreases in the morbidity and mortality due to HIV are transient. Policymakers need to be prepared for the eventual increase in AIDS incidence and mortality. Costs associated with HIV/AIDS are also projected to increase. The cost savings seen recently have been from the dramatic decreases in the incidence of AIDS defining opportunistic infections. As patients who have been on HAART the longest start to progress to AIDS, policymakers and insurance companies will find that the cost of treating HIV/AIDS will increase.

Relevance:

90.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. Now there are a plethora of models, based on different assumptions, applicable in differing contextual settings, and selecting the right model to use tends to be an ad-hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioral theories against evidence from lab and field experiments.

We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which informs the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
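
The adaptive loop itself can be sketched compactly. The toy below ranks tests by plain expected information gain (one of the comparison criteria named above) rather than EC2, purely because it is simpler to state; the test matrices and prior are invented:

```python
# Greedy adaptive experiment selection: keep a posterior over theories,
# score each candidate test by expected entropy reduction, run the best.
# Uses information gain, not BROAD's EC2 criterion; data are invented.
import math

def entropy(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

def update(prior, likelihoods):
    post = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(post)
    return [p / z for p in post]

def info_gain(prior, test):
    """test[h][o] = P(outcome o | theory h)."""
    gain = entropy(prior)
    for o in range(len(test[0])):
        p_o = sum(prior[h] * test[h][o] for h in range(len(prior)))
        if p_o > 0:
            gain -= p_o * entropy(update(prior, [row[o] for row in test]))
    return gain

tests = [
    [[0.9, 0.1], [0.1, 0.9]],   # theories disagree sharply: informative
    [[0.5, 0.5], [0.5, 0.5]],   # theories agree: tells us nothing
]
prior = [0.5, 0.5]
best = max(range(len(tests)), key=lambda i: info_gain(prior, tests[i]))
print("most informative test:", best)  # -> most informative test: 0
```

After observing the subject's actual choice, `update` produces the new posterior and the loop repeats.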

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e., subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and also since we do not find any signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models: quasi-hyperbolic (α, β) discounting and fixed cost discounting, and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between 2 options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present bias models and hyperbolic discounting, and most subjects were classified as generalized hyperbolic discounting types, followed by exponential discounting.
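
The competing discount functions are easy to state side by side. A sketch of each family as a weight on a reward delayed by t periods; the parameter values are illustrative defaults, not the estimates fitted in the experiment:

```python
# Discount functions compared in the experiment; parameters illustrative.

def exponential(t, delta=0.9):
    return delta ** t

def hyperbolic(t, k=0.5):
    return 1.0 / (1.0 + k * t)

def quasi_hyperbolic(t, beta=0.7, delta=0.9):
    """Present bias: t = 0 is undiscounted, every delay gets an extra beta."""
    return 1.0 if t == 0 else beta * delta ** t

def generalized_hyperbolic(t, alpha=1.0, beta=2.0):
    return (1.0 + alpha * t) ** (-beta / alpha)

for t in (0, 1, 5):
    print(t, exponential(t), hyperbolic(t), quasi_hyperbolic(t),
          generalized_hyperbolic(t))
```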

In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.

We also test the predictions of behavioural theories in the "wild". We pay attention to prospect theory, which emerged as the dominant theory in our lab experiments of risky choice. Loss aversion and reference dependence predict that consumers will behave in a way distinctly different from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is being offered at a discount, the demand for it will be greater than that explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute would increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
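
The loss-averse, reference-dependent utility behind that prediction can be sketched with the textbook prospect-theory value function; the parameter values below are the conventional Tversky-Kahneman estimates, not the ones fitted to the retailer data:

```python
# Prospect-theory value function: outcomes are gains/losses relative to
# a reference point (e.g. the remembered pre-discount price), and losses
# are scaled up by the loss-aversion coefficient lam.

def pt_value(x, alpha=0.88, lam=2.25):
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

gain, loss = pt_value(10.0), pt_value(-10.0)
print(gain, loss)  # the loss looms about 2.25x larger than the equal gain
```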

In future work, BROAD can be widely applied to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance:

90.00%

Publisher:

Abstract:

This article outlines the outcome of work that set out to provide one of the specified integral contributions to the overarching objectives of the EU-sponsored LIFE98 project described in this volume. Among others, these included a requirement to marry automatic monitoring and dynamic modelling approaches in the interests of securing better management of water quality in lakes and reservoirs. The particular task given to us was to devise the elements of an active management strategy for the Queen Elizabeth II Reservoir, one of the larger reservoirs supplying the population of the London area: after purification and disinfection, its water goes directly to the distribution network and to the consumers. The quality of the water in the reservoir is of primary concern, for the greater the content of biogenic materials, including phytoplankton, the more prolonged the purification and the more expensive the treatment. Whatever good phytoplankton may do by way of oxygenation and oxidative purification, it is eventually relegated to an impurity that has to be removed from the final product. Indeed, it has been estimated that the cost of removing algae and microorganisms from water represents about one quarter of its price at the tap. In chemically fertile waters, such as those typifying the resources of the Thames Valley, there is thus a powerful and ongoing incentive to minimise plankton growth in storage reservoirs. The Thames Water company and its predecessor undertakings have a long and impressive history of confronting and quantifying the fundamentals of phytoplankton growth in their reservoirs and of developing operational and design strategies to combat it. The work described here follows in this tradition. However, the use of the PROTECH-D model to investigate present phytoplankton growth patterns in the Queen Elizabeth II Reservoir questioned the interpretation of some of the recent observations. On the other hand, it reinforced the theories underpinning the original design of this and the Thames-Valley storage reservoirs constructed subsequently. The authors recount these experiences as an example of how simulation models can hone the theoretical base and its application to the practical problems of supplying water of good quality at economic cost, before the engineering is initiated.