6 results for Territorial approach on development
in Digital Commons at Florida International University
Abstract:
Since the 1948 inception of apartheid policies, the Republic of South Africa has experienced economic problems resulting from spatially dispersed growth. The election of President Mandela in 1994, however, eliminated the last forms of apartheid as well as its discriminatory spatial, social, and economic policies, especially toward black Africans. In Cape Town, South Africa, several initiatives have been instituted to restructure and economically revitalize blighted and abandoned township communities, like Langa. One element of this strategy is the development of activity streets. The main questions asked in this study are whether activity streets are a feasible solution to the local economic problems left by the apartheid system and whether they represent an economically sustainable approach to development. An analysis of a proposed activity street in Langa and its potential to generate jobs is undertaken. An Employment Generation Model used in this study shows that many of the businesses rely on the local purchasing power of the residents. Since the economic activities are mostly service oriented, a combination of manufacturing industries and institutionally implemented strategies within the township will have to be developed in order to generate sustainable employment. The results seem to indicate that, in Langa, the activity street depends very much on increases in sales and in pedestrian and vehicular traffic flow.
Abstract:
This research is motivated by the need to consider lot sizing while accepting customer orders in a make-to-order (MTO) environment, in which each customer order must be delivered by its due date. The job shop is the typical operation model in an MTO setting, where the production planner must make three concurrent decisions: order selection, lot sizing, and job scheduling. These decisions are usually treated separately in the literature and are mostly addressed with heuristics. The first phase of the study focuses on a formal definition of the problem. Mathematical programming techniques are applied to model the problem in terms of its objective, decision variables, and constraints. A commercial solver, CPLEX, is applied to the resulting mixed-integer linear programming model on small instances to validate the mathematical formulation. The computational results show that a commercial solver is not practical for problems of industrial size. The second phase of the study focuses on developing an effective solution approach for the large-scale problem. The proposed approach is an iterative process involving three sequential decision steps: order selection, lot sizing, and lot scheduling. A range of simple sequencing rules is identified for each of the three subproblems. Using computer simulation, an experiment is designed to evaluate their performance against a set of system parameters. For order selection, the proposed weighted most profit rule performs best. The shifting bottleneck and earliest operation finish time rules are jointly the best scheduling rules. For lot sizing, the proposed minimum cost increase heuristic, based on the Dixon-Silver method, performs best when the demand-to-capacity ratio at the bottleneck machine is high. The proposed minimum cost heuristic, based on the Wagner-Whitin algorithm, is the best lot-sizing heuristic for shops with a low demand-to-capacity ratio.
The proposed heuristic is applied to an industrial case to further evaluate its performance. The results show that it improves total profit by an average of 16.62%. This research contributes to the production planning research community a complete mathematical definition of the problem and an effective solution approach for problems of industrial scale.
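The three concurrent decisions interact because accepted orders compete for the same capacity. As a minimal sketch of just the order-selection subproblem (a hypothetical toy instance, not the dissertation's MILP formulation), the snippet below enumerates acceptance subsets under a single-machine capacity limit and keeps the most profitable feasible one, the scale of instance a solver-validation run would cover:

```python
from itertools import combinations

# Toy instance (hypothetical data): each order has a profit and a processing
# time on a single bottleneck machine with limited capacity.
orders = {
    "A": {"profit": 40, "time": 5},
    "B": {"profit": 25, "time": 3},
    "C": {"profit": 30, "time": 4},
    "D": {"profit": 20, "time": 2},
}
CAPACITY = 9      # available machine hours
SETUP_COST = 4    # fixed cost charged once per accepted order (one lot each)

def best_selection(orders, capacity, setup):
    """Enumerate all acceptance subsets; keep the feasible one with max net profit."""
    best, best_set = 0, ()
    names = list(orders)
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            load = sum(orders[o]["time"] for o in subset)
            if load > capacity:
                continue  # violates the capacity constraint
            net = sum(orders[o]["profit"] - setup for o in subset)
            if net > best:
                best, best_set = net, subset
    return best, best_set

profit, chosen = best_selection(orders, CAPACITY, SETUP_COST)
print(profit, chosen)  # best net profit and the accepted orders
```

Enumeration is exponential in the number of orders, which is exactly why the dissertation turns to sequencing rules and heuristics for industrial-size instances.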
Abstract:
The primary purpose of these studies was to determine the effect of planning menus using the Institute of Medicine's (IOM's) Simple Nutrient Density Approach on nutrient intakes of long-term care (LTC) residents. In the first study, nutrient intakes of 72 subjects were assessed using Dietary Reference Intakes (DRIs) and IOM methodology. The intake distributions were used to set intake and menu planning goals. In the second study, the facility's regular menus were modified to meet the intake goals for vitamin E, magnesium, zinc, vitamin D, and calcium. An experiment tested whether the modified menu resulted in micronutrient intakes sufficient to achieve a low prevalence (<3%) of nutrient inadequacies. Three-day weighed food intakes for 35 females were adjusted for day-to-day variation in order to obtain an estimate of long-term average intake and to estimate the proportion of residents with inadequate nutrient intakes. In the first study, the prevalence of inadequate intakes was determined to be between 65% and 99% for magnesium, vitamin E, and zinc. Mean usual intakes of vitamin D and calcium were far below the Adequate Intakes (AIs). In the experimental study, the prevalence of inadequacies was reduced to <3% for zinc and vitamin E but not magnesium. The group's mean usual intake from the modified menu met or exceeded the AI for calcium but fell short for vitamin D. Alternatively, it was determined that adding a multivitamin and mineral (MVM) supplement to intakes from the regular menu could achieve the goals for vitamin E, zinc, and vitamin D but not calcium and magnesium. A combination of menu modification and MVM supplementation may be necessary to achieve a low prevalence of micronutrient inadequacies among LTC residents. Menus should be planned to optimize intakes of those nutrients that are low in an MVM, such as calcium, magnesium, and potassium.
An MVM supplement should be provided to fill the gap for nutrients not provided in sufficient amounts by the diet, such as vitamin E and vitamin D.
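The prevalence estimates above rest on the EAR cut-point idea: adjust observed intakes toward the group mean to approximate usual intake, then count the fraction falling below the Estimated Average Requirement. A minimal sketch, with hypothetical intake data and an illustrative shrinkage factor rather than the full IOM variance-components adjustment:

```python
from statistics import mean

# Hypothetical 3-day mean zinc intakes (mg/day) for a small group.
observed = [4.2, 6.8, 5.1, 7.9, 3.6, 6.0, 5.5, 8.3, 4.9, 6.4]
EAR_ZINC = 6.8  # mg/day for adult women; check current DRI tables

# Crude usual-intake adjustment: shrink each observation toward the group
# mean to strip part of the day-to-day (within-person) variation. The true
# ratio comes from a variance-components analysis; 0.7 here is illustrative.
shrink = 0.7
m = mean(observed)
usual = [m + shrink * (x - m) for x in observed]

# EAR cut-point estimate: fraction of adjusted intakes below the EAR.
prevalence = sum(x < EAR_ZINC for x in usual) / len(usual)
print(round(prevalence, 2))
```

Shrinking toward the mean narrows the intake distribution, which is why the adjusted (usual-intake) prevalence differs from a naive count on raw 3-day means.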
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as “histogram binning” inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of the approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data and consequently undermines the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field have contended with this dilemma for many years, resorting either to hardware approaches, which are rather costly and carry inherent calibration and noise effects, or to software techniques that filter the binning effect but fail to preserve the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data, knowing that its statistical meaning has been faithfully preserved for optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of the inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect at the experimental assessment level, providing a data platform that preserves statistical content. In addition, a novel method for accumulating the log-transformed data was developed.
This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer, multi-channel fashion. Although the mathematics of this new mapping technique seems intricate, the concise nature of the derivations allows an implementation procedure that lends itself to real-time execution using lookup tables, a task that is also introduced in this dissertation.
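The lookup-table route can be sketched generically. The code below is not the patented method, only an illustration of fractional, multi-channel accumulation under assumed channel counts: each linear channel's interval is split across the log-scale bins it overlaps, with the split fractions precomputed once so run-time accumulation is just a table walk.

```python
import math

LIN_CHANNELS = 1024   # linear ADC resolution (hypothetical)
LOG_BINS = 256        # log-scale display histogram (hypothetical)
DECADES = 4.0         # dynamic range covered by the log axis

def log_bin(x):
    """Map a linear value in [1, 10**DECADES) to a fractional log-bin position."""
    return LOG_BINS * math.log10(max(x, 1.0)) / DECADES

def split(lo, hi):
    """Fractions of the interval [lo, hi) falling into each integer log bin."""
    if hi <= lo:
        return [(int(lo), 1.0)]  # degenerate interval: whole weight in one bin
    total = hi - lo
    out, b = [], math.floor(lo)
    while b < hi:
        seg = min(hi, b + 1) - max(lo, b)
        out.append((min(b, LOG_BINS - 1), seg / total))
        b += 1
    return out

# Precompute the lookup table: for each linear channel c, the log bins its
# interval [c, c+1) overlaps and the fraction of its counts owed to each.
lut = [split(log_bin(c), log_bin(c + 1)) for c in range(LIN_CHANNELS)]

def accumulate(linear_counts):
    """Build the log histogram by fractional (non-integer) accumulation."""
    hist = [0.0] * LOG_BINS
    for c, n in enumerate(linear_counts):
        for b, frac in lut[c]:
            hist[b] += n * frac
    return hist
```

Because each channel's fractions sum to one, total counts are conserved, and low-value channels that a naive integer mapping would pile into a few spiky bins are instead spread smoothly, which is the qualitative behavior the binning-artifact fix is after.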
Abstract:
While analysis of the effect that education and migration have on development is neither clear-cut nor obvious, regimes such as Jamaica's have traditionally placed great emphasis on development through education at all levels. The process of human resource development and the accumulation of human capital is intended to unlock the door to modernization. Nevertheless, our findings indicate a considerable loss of professional and skilled personnel -- the same group that embodies a disproportionate amount of educational expenditure relative to the population. Insofar as planning is concerned, this migration represents a negative factor. The developing country of Jamaica is unintentionally supplying the developed world with an "annual gift" of human capital which its economy cannot afford. The major issue becomes: to what extent can any government "protect" its investments by restricting the movement of capital and people? The general assumption of this paper is that the question of human rights cannot be ignored, especially in democracies (which Jamaica decidedly is), where movement is seen as an ingrained human right. During the 1970s and 1980s, Jamaica and the Caribbean as a whole have lost much through the migration of intellectual capital. Yet brains may also die in their own environment if deprived of the ability to create their own criteria and goals. Forcing people to stay with their money and know-how may only serve to produce an economic environment overgrown with weeds of lethargy, indolence, and mediocrity.