16 results for Trade off
in Digital Commons at Florida International University
Abstract:
This dissertation discusses the relationship between inflation, currency substitution, and dollarization in Argentina over the past several decades. First, it is shown that when consumers are able to hold only domestic monetary balances (without capital mobility), an increase in the rate of inflation will produce a balance of payments deficit. We then look at the same issue with heterogeneous consumers, the heterogeneity being generated by non-proportional lump-sum transfers. Second, we discussed some necessary assumptions of currency substitution models and concluded that there is no a priori conclusion on whether currencies should be assumed to be "cooperant" or "non-cooperant" in utility, that is, whether individuals hold different currencies together or one instead of the other. Third, we discussed currency substitution as a constraint on governments' inflationary objectives rather than a choice those governments make to avoid hyperinflations. We showed that imperfect substitutability between currencies does not "reduce the scope for rational (hyper)inflationary processes," as had previously been argued; the outcome ultimately depends on the parametrization used and not on the intrinsic characteristics of imperfect substitutability between currencies. We further showed that in Argentina, individuals have been able to endogenize the money supply by holding foreign monetary balances. We argued that the decision to hold foreign monetary balances is always a second best because of the trade-off between holding foreign monetary balances and consumption; for some levels of income, consumption, and foreign inflation, individuals would prefer to hold domestic monetary balances rather than foreign ones. We then modeled the distinction between dollarization and currency substitution and concluded that although dollarization is necessary for currency substitution to take place, the decision to use foreign monetary balances for transaction purposes is largely independent of the dollarization process. Finally, we concluded that Argentina should not fully dollarize its economy, because dollarization is always a second best to using a domestic currency. Further, we argued that a fixed exchange rate system would be better than a flexible exchange rate or a "crawling-peg" system because of the characteristics of the political system and the possibility of "mass praetorianism" developing, which is intricately linked to "populist" solutions.
Abstract:
The major barrier to practical optimization of pavement preservation programming has always been that, for formulations where the identity of individual projects is preserved, the solution space grows exponentially with the problem size to the point where it becomes unmanageable by traditional analytical optimization techniques within reasonable time limits. This has been attributed to the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, and the trade-off considerations required among preventive maintenance, rehabilitation, and reconstruction, are additional factors that contribute to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm. A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output of the program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, and the impact on the overall performance of the network in terms of the performance levels of the recommended optimal M&R strategy. The results show that the SCE is very efficient and consistent in the simultaneous consideration of the trade-off between various pavement preservation strategies while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems. It is recommended that for large networks some form of decomposition be applied to aggregate sections that exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections connected to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space. The study concludes that the robust search characteristics of SCE are well suited to the combinatorial problems in long-term network-level pavement M&R programming and provide a rich area for future research.
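The combinatorial explosion the abstract refers to is easy to quantify. The short Python sketch below uses illustrative numbers that are not taken from the dissertation: it simply counts the candidate multi-year programs when every segment can receive any one of a handful of M&R treatments in every programming year.

    # Illustrative only: how quickly the project-level solution space grows.
    # Assumed numbers (not from the dissertation): 4 treatments (do-nothing,
    # preventive maintenance, rehabilitation, reconstruction), 50 segments, 5 years.
    n_treatments = 4
    n_segments = 50
    n_years = 5

    candidate_programs = n_treatments ** (n_segments * n_years)
    print(f"{candidate_programs:.3e}")  # about 3.3e+150 candidate programs

Exhaustive enumeration of a space this size is out of the question, which is why population-based search such as SCE is attractive for this class of problem.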
Abstract:
This research first evaluated the levels and types of herbivory experienced by Centrosema virginianum plants in their native habitat and how florivory affected pollinator activity. I found that populations of C. virginianum in two pine rockland habitat fragments experienced higher herbivory levels (15% and 22%) than plants in the protected study site (8.6%). I found that bees (Hymenoptera) pollinated butterfly pea, and that florivores had a negative effect on pollinator visitation rates and therefore on the seed set of the population. I then conducted a study using a greenhouse population of C. virginianum, applying artificial herbivory treatments: control, mild herbivory, and severe herbivory. Flower size, pollen produced, ovules produced, and seeds produced were negatively affected by herbivory. I did not find differences in nectar volume or quality among treatments. Surprisingly, severely damaged plants produced flowers with larger pollen than those from mildly damaged and undamaged plants. Results showed that plants tolerated mild and severe herbivory with 6% and 17% reductions in total fitness components, respectively; however, the investment of resources was not equisexual. A comparison of the ability of large and small pollen to sire seeds was necessary to establish the biological consequence of size for pollen performance. I found that fruits produced an average of 18.7 ± 1.52 and 17.7 ± 1.50 seeds from large- and small-pollen fertilization, respectively. These findings supported a pollen number-size trade-off in plants under severe herbivory treatments; as far as I know, this result has not previously been reported. Lastly, I tested how herbivory influenced seed abortion patterns in plants, examining how resources are allocated to different regions within fruits under artificial herbivory treatments. I found that self-fertilized fruits had greater seed abortion rates than cross-fertilized fruits. The proportion of seeds aborted was lower in the middle regions of cross-fertilized fruits, which produced more vigorous progeny; self-fertilized fruits did not show patterns of seedling vigor. I also found that early abortion was higher closer to the peduncular end of the fruits. The position of seeds within fruits could be important in the seed dispersal mechanism characteristic of this species.
Abstract:
Access to healthcare is a major problem, with patients deprived of timely admission to care. Poor access has resulted in significant but avoidable healthcare costs, poor quality of care, and deterioration in general public health. Advanced Access is a simple and direct approach to appointment scheduling in which the majority of a clinic's appointment slots are kept open in order to provide access for immediate or same-day healthcare needs, thereby alleviating the problem of poor access to healthcare. This research formulates a non-linear discrete stochastic mathematical model of the Advanced Access appointment scheduling policy. The objective of the model is to maximize the expected profit of the clinic subject to constraints on the minimum access to healthcare provided. Patient behavior is characterized with probabilities for no-shows, balking, and related patient choices. Structural properties of the model are analyzed to determine whether Advanced Access patient scheduling is feasible. To solve the complex combinatorial optimization problem, a heuristic that combines a greedy construction algorithm with a neighborhood improvement search was developed. The model and the heuristic were used to evaluate the Advanced Access patient appointment policy against existing policies. Trade-offs between profit and access to healthcare are established, and an analysis of the input parameters was performed. The trade-off curve is a characteristic curve and was observed to be concave, which implies that there exists an access level at which the clinic can be operated at the optimal realizable profit. The results also show that, in many scenarios, clinics can improve access without any decrease in profit by switching from an existing scheduling policy to the Advanced Access policy. Further, the success of the Advanced Access policy in providing improved access and/or profit depends on the expected value of demand, the variation in demand, and the ratio of demand for same-day and advance appointments. The contributions of the dissertation are a model of Advanced Access patient scheduling, a heuristic to solve the model, and the use of the model to understand the scheduling policy trade-offs that healthcare clinic managers must make.
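The heuristic named in the abstract follows a common two-phase pattern: build a feasible schedule greedily, then improve it with a local (neighborhood) search. The Python sketch below is a minimal illustration of that pattern with hypothetical names (greedy_construct, neighborhood_improve, profit_fn); the dissertation's actual model additionally handles no-shows, balking, and access constraints.

    import random

    def greedy_construct(n_slots, profit_fn):
        # Assign each slot to "same_day" or "advance" booking, always taking the
        # choice with the larger expected profit so far (greedy construction).
        schedule = []
        for _ in range(n_slots):
            best = max(("same_day", "advance"),
                       key=lambda choice: profit_fn(schedule + [choice]))
            schedule.append(best)
        return schedule

    def neighborhood_improve(schedule, profit_fn, iters=1000):
        # Local search: flip one slot at a time and keep any change that does
        # not reduce expected profit (neighborhood improvement).
        best, best_val = schedule[:], profit_fn(schedule)
        for _ in range(iters):
            cand = best[:]
            i = random.randrange(len(cand))
            cand[i] = "advance" if cand[i] == "same_day" else "same_day"
            val = profit_fn(cand)
            if val >= best_val:
                best, best_val = cand, val
        return best, best_val

    # Toy usage with a made-up profit function that prefers a 70/30 same-day mix:
    toy_profit = lambda s: -abs(sum(x == "same_day" for x in s) - 0.7 * len(s))
    schedule = greedy_construct(20, toy_profit)
    schedule, profit = neighborhood_improve(schedule, toy_profit)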
Abstract:
Numerical optimization is a technique in which a computer is used to explore design parameter combinations to find extremes in performance factors. In multi-objective optimization, several performance factors can be optimized simultaneously. The solution to a multi-objective optimization problem is not a single design but a family of optimized designs referred to as the Pareto frontier. The Pareto frontier is a trade-off curve in the objective function space composed of solutions where performance in one objective function is traded for performance in others. A Multi-Objective Hybridized Optimizer (MOHO) was created for solving multi-objective optimization problems by utilizing a set of constituent optimization algorithms. MOHO tracks the progress of the Pareto frontier approximation and automatically switches among its constituent evolutionary optimization algorithms to speed the formation of an accurate Pareto frontier approximation. Aerodynamic shape optimization is one of the oldest applications of numerical optimization. MOHO was used to perform shape optimization on a 0.5-inch ballistic penetrator traveling at Mach 2.5, with two objectives optimized simultaneously: minimizing aerodynamic drag and maximizing penetrator volume. This problem was solved twice. The first time, Modified Newton Impact Theory (MNIT) was used to determine the pressure drag on the penetrator; in the second solution, a Parabolized Navier-Stokes (PNS) solver that includes viscosity was used to evaluate the drag. The studies show the difference in the optimized penetrator shapes when viscosity is absent and when it is present in the optimization. In modern optimization problems, a single objective function evaluation may require many hours on a computer cluster. One solution is to create a response surface that models the behavior of the objective function; once enough data about the behavior of the objective function has been collected, the response surface can be used to represent the actual objective function in the optimization process. The Hybrid Self-Organizing Response Surface Method (HYBSORSM) algorithm was developed and used to build response surfaces of objective functions. HYBSORSM was evaluated using a suite of 295 non-linear functions involving from 2 to 100 variables, demonstrating the robustness and accuracy of HYBSORSM.
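A Pareto frontier is defined by non-dominance: a design belongs to the frontier if no other design is at least as good in every objective and strictly better in at least one. The Python sketch below shows that filter with hypothetical example numbers; it is not MOHO's interface or its results.

    def dominates(a, b):
        # True if objective vector `a` dominates `b`, assuming every objective
        # is to be minimized (here: drag and negative penetrator volume).
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_frontier(designs):
        # Keep only the non-dominated designs.
        return {name: obj for name, obj in designs.items()
                if not any(dominates(other, obj)
                           for other_name, other in designs.items() if other_name != name)}

    # Hypothetical (drag, -volume) pairs for three candidate shapes:
    designs = {"blunt": (0.9, -1.2), "ogive": (0.5, -1.0), "cone": (0.6, -0.8)}
    print(pareto_frontier(designs))  # "cone" is dominated by "ogive" and drops out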
Abstract:
The current study applies a two-state switching regression model to examine the behavior of a hypothetical portfolio of ten socially responsible (SRI) equity mutual funds during the expansion and contraction phases of US business cycles between April 1991 and June 2009, based on the Carhart four-factor model and using monthly data. The model identified a business cycle effect on the performance of SRI equity mutual funds. Fund returns were less volatile during expansions/peaks than during contractions/troughs, as indicated by the standard deviation of returns. During contractions/troughs, fund excess returns were explained by the differential in returns between small and large companies, the difference between the returns on stocks trading at high and low book-to-market value, the market excess return over the risk-free rate, and fund objective. During contractions/troughs, smaller companies offered higher returns than larger companies (ci = 0.26, p = 0.01), undervalued stocks outperformed high-growth stocks (hi = 0.39, p < 0.0001), and funds with growth objectives outperformed funds with other objectives (oi = 0.01, p = 0.02). The hypothetical SRI portfolio was less risky than the market (bi = 0.74, p < 0.0001). During expansions/peaks, fund excess returns were explained by the market excess return over the risk-free rate and fund objective. Funds with other objectives, such as balanced funds and income funds, outperformed funds with growth objectives (oi = −0.01, p = 0.03). The hypothetical SRI portfolio exhibited risk similar to that of the market (bi = 0.93, p < 0.0001). The SRI investor adds a third criterion, social performance, to the risk and return trade-off of traditional portfolio theory. The research suggests that managers of SRI equity mutual funds may diminish value by using social and ethical criteria to select stocks, but add value through superior stock selection; the result is that the performance of SRI mutual funds is very similar to that of the market. There was no difference in the value added among secular SRI, religious SRI, and vice screens.
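For reference, the standard Carhart four-factor specification that the reported coefficients appear to correspond to is written below in LaTeX; the fund-objective term (coefficient o_i) is the study's own addition, and in the switching regression every coefficient is estimated separately for the expansion and contraction regimes. The exact variable definitions are assumptions here, not quoted from the dissertation.

    % Excess fund return regressed on the market, size (SMB), value (HML),
    % and momentum (MOM) factors, plus a fund-objective indicator OBJ:
    R_{it} - R_{ft} = \alpha_i + b_i (R_{mt} - R_{ft}) + c_i \mathrm{SMB}_t
                    + h_i \mathrm{HML}_t + m_i \mathrm{MOM}_t + o_i \mathrm{OBJ}_i + \varepsilon_{it}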
Abstract:
Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified by using profiling tools. Hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions, so the performance of SoC systems can be improved if hardware acceleration is applied to the elements that incur performance overheads. The concepts presented in this study can easily be applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications, such as performance, energy consumption, and resource costs, are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves a 7.9X performance improvement and saves 75.85% of energy consumption.
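Why the hotspot fraction found by profiling matters so much can be seen from an Amdahl's-law style bound; the Python sketch below uses illustrative numbers that are not taken from the dissertation.

    # If a profiled hotspot accounts for a fraction f of total runtime and the
    # FPGA accelerator speeds that portion up by a factor s, the achievable
    # whole-system speedup is bounded by 1 / ((1 - f) + f / s).
    def system_speedup(f, s):
        return 1.0 / ((1.0 - f) + f / s)

    # Example: a hotspot taking 70% of runtime, accelerated 20x, caps the
    # overall speedup near 3x because 30% of the work stays on the CPU.
    print(round(system_speedup(0.70, 20), 2))  # 2.99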
Abstract:
Infrastructure management agencies are facing multiple challenges, including aging infrastructure, reduction in the capacity of existing infrastructure, and the availability of limited funds. Decision makers are therefore required to think innovatively and develop inventive ways of using the available funds. Maintenance investment decisions are generally made based on physical condition only, yet spending money on public infrastructure is synonymous with spending money on people themselves. This calls for consideration of decision parameters beyond physical condition, such as strategic importance, socioeconomic contribution, and infrastructure utilization. Considering multiple decision parameters for infrastructure maintenance investments can be beneficial when funding is limited. Given this motivation, this dissertation presents a prototype decision support framework to evaluate trade-offs among competing infrastructures that are candidates for maintenance, repair, and rehabilitation investments. The performance of each decision parameter, measured through various factors, is combined to determine the integrated state of an infrastructure using Multi-Attribute Utility Theory (MAUT). The integrated state, together with cost and benefit estimates of probable maintenance actions and expert opinion, is used to develop transition probability and reward matrices for each probable maintenance action for a particular candidate infrastructure. These matrices are then used as input to a finite-stage dynamic programming model based on the Markov Decision Process (MDP) to perform project (candidate)-level analysis and determine optimized maintenance strategies based on reward maximization. The outcomes of the project (candidate)-level analysis are then used in a network-level analysis that takes a portfolio management approach to determine a suitable portfolio under budgetary constraints. The major decision support outcomes of the prototype framework include performance trend curves, decision logic maps, and a network-level maintenance investment plan for the upcoming years. The framework has been implemented on a set of bridges treated as a network, with the assistance of the Pima County DOT, AZ. It is expected that the concept of this prototype framework can help infrastructure management agencies better manage their available maintenance funds.
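The project-level step the abstract describes, finite-stage dynamic programming over per-action transition probability and reward matrices, has a compact generic form. The Python sketch below shows that backward-induction structure with hypothetical state and action names; the dissertation's calibrated matrices and decision parameters are of course different.

    import numpy as np

    def finite_stage_mdp(P, R, horizon):
        # Backward induction for a finite-stage MDP.
        # P[a][i, j]: probability of moving from condition state i to j under action a.
        # R[a][i, j]: reward collected on that transition (e.g., utility gain minus cost).
        # Returns the optimal value per state and the action to take at each stage.
        n_states = next(iter(P.values())).shape[0]
        V = np.zeros(n_states)
        policy = []
        for _ in range(horizon):
            actions = list(P)
            Q = np.array([(P[a] * R[a]).sum(axis=1) + P[a] @ V for a in actions])
            V = Q.max(axis=0)
            policy.append([actions[k] for k in Q.argmax(axis=0)])
        return V, policy[::-1]  # policy[0] holds the first-stage decision per state

    # Toy example with two condition states (good, poor) and two actions:
    P = {"do_nothing": np.array([[0.8, 0.2], [0.0, 1.0]]),
         "rehab":      np.array([[0.9, 0.1], [0.7, 0.3]])}
    R = {"do_nothing": np.array([[5.0, 1.0], [0.0, -2.0]]),
         "rehab":      np.array([[2.0, 0.0], [3.0, -1.0]])}
    values, plan = finite_stage_mdp(P, R, horizon=5)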
Abstract:
Network simulation is an indispensable tool for studying Internet-scale networks due to their heterogeneous structure, immense size, and changing properties. It is crucial for network simulators to generate representative traffic, which is necessary for effectively evaluating next-generation network protocols and applications. With network simulation, we can make a distinction between foreground traffic, which is generated by the target applications the researchers intend to study and therefore must be simulated with high fidelity, and background traffic, which represents the network traffic generated by other applications and does not require the same degree of accuracy. The background traffic nevertheless has a significant impact on the foreground traffic, since it competes for network resources and can therefore drastically affect the behavior of the applications that produce the foreground traffic. This dissertation aims to provide a solution for meaningfully generating background traffic, addressing three aspects. The first is realism. Realistic traffic characterization plays an important role in determining the correct outcome of simulation studies. This work starts by enhancing an existing fluid background traffic model, removing two of its unrealistic assumptions; the improved model correctly reflects the network conditions in the reverse direction of the data traffic and reproduces the traffic burstiness observed in measurements. The second is scalability. The trade-off between accuracy and scalability is a constant theme in background traffic modeling. This work presents a fast rate-based TCP (RTCP) traffic model that uses analytical models to represent TCP congestion control behavior. It outperforms other existing traffic models in that it correctly captures overall TCP behavior while achieving a speedup of more than two orders of magnitude over the corresponding packet-oriented simulation. The third is network-wide traffic generation. Regardless of how detailed or scalable existing models are, they mainly focus on generating traffic on a single link, which cannot easily be extended to studies of more complicated network scenarios. This work presents a cluster-based spatio-temporal background traffic generation model that considers spatial and temporal traffic characteristics as well as their correlations. The resulting model can be used effectively for evaluation work in network studies.
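As context for the rate-based approach, analytical TCP models of the kind the abstract mentions typically express a flow's sending rate directly in terms of path properties rather than simulating every packet. One widely used approximation, shown below in LaTeX, relates steady-state TCP throughput B to the segment size MSS, round-trip time RTT, and loss probability p; the dissertation does not state which specific analytical model RTCP builds on, so this is only representative.

    % Square-root-of-p TCP throughput approximation (Mathis et al.):
    B \approx \frac{\mathrm{MSS}}{\mathrm{RTT}} \sqrt{\frac{3}{2p}}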
Abstract:
Construction projects are complex endeavors that require the involvement of different professional disciplines in order to meet various project objectives that are often conflicting. The level of complexity and the multi-objective nature of construction projects lend themselves to collaborative design and construction approaches such as integrated project delivery (IPD), in which the relevant disciplines work together during project conception, design, and construction. Traditionally, the main objectives of construction projects have been to build in the least amount of time at the lowest possible cost, so the inherent and well-established relationship between cost and time has been the focus of many studies. The importance of effectively modeling relationships among multiple objectives in building construction has been emphasized in a wide range of research. In general, the trade-off relationship between time and cost is well understood and there is ample research on the subject; however, despite sustainable building designs, the relationships between time and environmental impact, and between cost and environmental impact, have not been fully investigated. The objectives of this research were mainly to analyze and identify the relationships among time, cost, and environmental impact, in terms of CO2 emissions, at different levels of a building (material, component, and building level) at the pre-use phase, including manufacturing and construction, as well as the relationship between life cycle cost and life cycle CO2 emissions at the usage phase. Additionally, this research aimed to develop a robust simulation-based multi-objective decision-support tool, called SimulEICon, which takes construction data uncertainty into account and is capable of incorporating life cycle assessment information into the decision-making process. The findings of this research supported the trade-off relationship between time and cost at different building levels, and the time and CO2 emissions relationship exhibited trade-off behavior at the pre-use phase. Interestingly, the relationship between cost and CO2 emissions was proportional at the pre-use phase, and the same pattern persisted from construction through the usage phase. Understanding the relationships among these objectives is key to successfully planning and designing environmentally sustainable construction projects.
Abstract:
Predators exert strong direct and indirect effects on ecological communities by intimidating their prey. Non-consumptive effects (NCEs) of predators are important features of many ecosystems and have changed the way we understand predator-prey interactions, but are not well understood in some systems. For my dissertation research I combined a variety of approaches to examine the effect of predation risk on herbivore foraging and reproductive behaviors in a coral reef ecosystem. In the first part of my dissertation, I investigated how diet and territoriality of herbivorous fish varied across multiple reefs with different levels of predator biomass in the Florida Keys National Marine Sanctuary. I show that both predator and damselfish abundance impacted diet diversity within populations for two herbivores in different ways. Additionally, reef protection and the associated recovery of large predators appeared to shape the trade-off reef herbivores made between territory size and quality. In the second part of my dissertation, I investigated context-dependent causal linkages between predation risk, herbivore foraging behavior and resource consumption in multiple field experiments. I found that reef complexity, predator hunting mode, light availability and prey hunger influenced prey perception of threat and their willingness to feed. This research argues for more emphasis on the role of predation risk in affecting individual herbivore foraging behavior in order to understand the implications of human-mediated predator removal and recovery in coral reef ecosystems.
Abstract:
John le Carré’s novels “The Spy Who Came in From the Cold” (1963), “Tinker, Tailor, Soldier, Spy” (1974), and “The Tailor of Panama” (1997) focus on how the main characters reflect the somber reality of working in the British intelligence service. Through a broad post-structuralist analysis, I identify the dichotomies (good/evil in “The Spy Who Came in From the Cold,” past/future in “Tinker, Tailor, Soldier, Spy,” and institution/individual in “The Tailor of Panama”) that frame the role of the protagonists. Each character is defined by his ambiguity and swinging moral compass, which transform him into a hybrid creation of morality and adaptability during transitional periods in history, mainly the Cold War. Le Carré’s novels reject the notion of the celebrated spy who stands above the group; instead, he portrays spies as characters who trade off individualism and social belonging for a false sense of heroism, loneliness, and even death.