715 results for "updates"
Abstract:
This paper extends the recently developed multiplexed model predictive control (MMPC) concept to ensure satisfaction of hard constraints despite the action of persistent, unknown but bounded disturbances. MMPC uses asynchronous control moves on each input channel instead of synchronised moves on all channels. It offers reduced computation, by dividing the online optimisation into a smaller problem for each channel, and potential performance improvements, as the response to a disturbance is quicker, albeit via only one channel. Robustness to disturbances is introduced using the constraint tightening approach, tailored to suit the asynchronous updates of MMPC and the resulting time-varying optimisations. Numerical results are presented, involving a simple mechanical example and an aircraft control example, showing the potential computational and performance benefits of the new robust MMPC.
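For orientation, the constraint-tightening idea can be sketched in generic robust-MPC notation (illustrative only, not the paper's exact formulation): predicted outputs are required to satisfy constraint sets that shrink along the horizon by the worst-case effect of the disturbance under a candidate feedback policy.

```latex
% Generic constraint-tightening recursion (illustrative notation only):
% \mathcal{W} is the bounded disturbance set, L_j propagates the disturbance
% through a candidate policy, and \ominus is the Pontryagin set difference.
\begin{aligned}
\mathcal{Y}_0     &= \mathcal{Y},\\
\mathcal{Y}_{j+1} &= \mathcal{Y}_j \ominus L_j \mathcal{W}, \qquad j = 0,\dots,N-1.
\end{aligned}
% Forcing the nominal predictions to lie in \mathcal{Y}_j guarantees that the
% true trajectory satisfies \mathcal{Y} for every admissible disturbance.
```

In the MMPC setting the tightened sets must additionally track the asynchronous, channel-by-channel update schedule, which is what makes the resulting online optimisations time-varying.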
Abstract:
This report contains inorganic nutrient chemistry, sulfide and oxygen data collected during cruises 2 through 5 of the 1988 Black Sea Oceanographic Expedition aboard the R/V Knorr. Continuous nutrient and sulfide data were obtained in the upper 375 m using a pumped profiling system. Discrete samples were collected from rosette-CTD casts. The corresponding physical oceanographic data have been presented by White et al. (1989). Although all of the data reported has been edited at least twice, errors may remain. We encourage queries and plan to distribute updates on electronic media if there are any non-trivial changes.
Abstract:
Report: COFI Session - Securing small-scale fisheries; Statement - Contributing Significantly; Somalia: Pirate Fishing - Pirates or Saviours of the Coast?; Marine Protected Areas - Managing to Benefit; Mexico: Marine Reserves - Caught Up in Change; MPAs - Importance of Social Capital; MSC Ecolabels - Work Together for Community-based Fisheries; Netherlands: Inland Fisheries - A Management Fantasy?; Small Indigenous Species - Small is Nutritional; ICSF Resources - Information Updates
Abstract:
An assessment of the status of the Atlantic stock of red drum is conducted using recreational and commercial data from 1986 through 1998. This assessment updates data and analyses from the 1989, 1991, 1992 and 1995 stock assessments on Atlantic coast red drum (Vaughan and Helser, 1990; Vaughan 1992; 1993; 1996). Since 1981, coastwide recreational catches ranged between 762,300 pounds in 1980 and 2,623,900 pounds in 1984, while commercial landings ranged between 60,900 pounds in 1997 and 422,500 pounds in 1984. In weight of fish caught, Atlantic red drum constitute predominantly a recreational fishery (ranging between 85 and 95% during the 1990s). Commercially, red drum continue to be harvested as part of mixed-species fisheries. Using available length-frequency distributions and age-length keys, recreational and commercial catches are converted to catch in numbers at age. Separable and tuned virtual population analyses are conducted on the catch in numbers at age to obtain estimates of fishing mortality rates and population size (including recruitment to age 1). In turn, these estimates of fishing mortality rates, combined with estimates of growth (length and weight), sex ratios, sexual maturity and fecundity, are used to estimate yield per recruit, escapement to age 4, and static (or equilibrium) spawning potential ratio (static SPR, based on both female biomass and egg production). Three virtual population analysis (VPA) approaches (separable, spreadsheet, and FADAPT) were applied to catch matrices for two time periods (early: 1986-1991, and late: 1992-1998) and two regions (Northern: North Carolina and north, and Southern: South Carolina through the east coast of Florida). Additional catch matrices were developed based on different treatments of the catch-and-release recreationally caught red drum (B2-type). These approaches included assuming 0% mortality (BASE0) versus 10% mortality for B2 fish. For the 10% mortality on B2 fish, sizes were assumed to be the same as those of caught fish (BASE1), based on the positive difference in size distributions between the early and later periods (DELTA), or intermediate between the two (PROP). Hence, a total of 8 catch matrices were developed (2 regions and 4 B2 assumptions for 1986-1998), to which the three VPA approaches were applied. The question of when offshore emigration or reduced availability begins (during or after age 3) continues to be a source of bias that tends to result in overestimates of fishing mortality. Additionally, the continued assumption (Vaughan and Helser, 1990; Vaughan 1992; 1993; 1996) of no fishing mortality on adults (ages 6 and older) causes a bias that results in underestimates of fishing mortality for adult ages (0 versus some positive value). Because of emigration and the effect of the slot limit for the later period, a range in relative exploitations of age 3 to age 2 red drum was considered. Tuning indices were developed from the MRFSS, along with state indices, for use in the spreadsheet and FADAPT VPAs. The SAFMC Red Drum Assessment Group (Appendix A) favored the FADAPT approach with the catch matrix based on DELTA and a selectivity for age 3 relative to age 2 of 0.70 for the northern region and 0.87 for the southern region. In the northern region, estimates of static SPR increased from about 1.3% for the period 1987-1991 to approximately 18% (15% and 20%) for the period 1992-1998. For the southern region, estimates of static SPR increased from about 0.5% for the period 1988-1991 to approximately 15% for the period 1992-1998.
Population models used in this assessment (specifically yield per recruit and static spawning potential ratio) are based on equilibrium assumptions; because no direct estimates are available of the current status of the adult stock, model results indicate potential longer-term, equilibrium effects. Because the current status of the adult stock is unknown, a specific rebuilding schedule cannot be determined. However, the duration of a rebuilding schedule should reflect, in part, a measure of the generation time of the species under consideration. For a long-lived but relatively early-spawning species such as red drum, mean generation time would be on the order of 15 to 20 years based on age-specific egg production. Maximum age is 50 to 60 years for the northern region, and about 40 years for the southern region. The ASMFC Red Drum Board's first-phase recovery goal of increasing %SPR to at least 10% appears to have been met. (PDF contains 79 pages)
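For reference, static SPR is conventionally defined as the ratio of spawning output per recruit under current fishing mortality to that with no fishing; in generic age-structured notation (not necessarily the exact formulation used in this assessment):

```latex
% Static (equilibrium) spawning potential ratio, generic age-structured form:
% m_a = maturity at age a, E_a = eggs (or spawning biomass) per mature fish at age a,
% M_j = natural mortality at age j, F_j = fishing mortality at age j.
\mathrm{SPR} \;=\;
\frac{\sum_{a} m_a\, E_a \prod_{j<a} e^{-(M_j + F_j)}}
     {\sum_{a} m_a\, E_a \prod_{j<a} e^{-M_j}}
```

Under this definition, a static SPR of 18% means that, with 1992-1998 fishing mortality rates held fixed, each recruit would contribute about 18% of the egg production it would have contributed in the absence of fishing.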
Abstract:
Birkenhead Sixth Form College implemented a virtual network to open up remote access to the college network for its students, staff and governors. In particular, for childcare students on work placements, this has meant 24/7 secure access to their work and resources, and the ability to make timely updates to their work evidence logs. The impact is better continuity of learning and a dramatic increase in the hand-in rate for work. For the staff, governors and college as a whole, the benefits of anytime-access to the network are more than were envisaged at the outset; not only is it saving them valuable time and eliminating the need for large print runs, it is expected to bring cost-savings to the College in the long term.
Abstract:
HIGHLIGHTS FOR FY 2004
1. Completed the second year of a 3-year Gulf sturgeon population estimate on the Escambia River, Florida.
2. Completed the first year of a 2-year Gulf sturgeon population estimate on the Apalachicola River, Florida.
3. Conducted Gulf sturgeon presence-absence surveys in three other Florida river systems.
4. Documented Gulf sturgeon marine habitat use in the nearshore waters of the Gulf of Mexico.
5. Identified environmental threats to Gulf sturgeon spawning habitat in the Choctawhatchee River, Florida.
6. Initiated a study to document Gulf sturgeon spawning with the collection of fertilized eggs in the Yellow River, Florida.
7. Implemented the Gulf Striped Bass Restoration Plan by coordinating the 21st Annual Morone Workshop, leading the technical committee, transporting broodfish, and coordinating the stocking on the Apalachicola-Chattahoochee-Flint (ACF) river system.
8. Marked over 86,000 Phase II Gulf striped bass with sequential coded-wire tags and stocked them in Lake Seminole and the Apalachicola River. Post-stocking evaluations were conducted at 31 sites.
9. Drafted updates to the Apalachicola-Chattahoochee-Flint Striped Bass Restoration and Evaluation Five-Year Plan with partners.
10. Conducted fishery surveys on Tyndall Air Force Base and St. Marks and St. Vincent National Wildlife Refuges.
11. Completed habitat evaluations and population surveys at 153 Okaloosa darter stream sites.
12. Completed aquatic insect biomonitoring and identification of over 39,000 individual aquatic macroinvertebrates and provided the results to Eglin Air Force Base.
13. Analyzed ten years of fishery data from Okefenokee and Banks Lake National Wildlife Refuges, with recommendations incorporated into the refuge Comprehensive Conservation Plan.
14. Tested a draft mussel sampling protocol in wadeable streams in northwest Florida and southwest Georgia.
15. Implemented recovery plan and candidate conservation actions for 14 listed and candidate freshwater mussels in the Northeast Gulf Watersheds.
16. Worked with partners in developing the Spring Creek Watershed Partnership in the Flint River basin, Georgia.
17. Initiated or completed multiple stream restoration and watershed management projects. A total of 6.8 stream miles were restored for stream fishes, and 56.4 miles of coastline were enhanced for sea turtle lighting. A total of 135 acres of wetlands and 58 acres of understory habitat were restored.
18. Completed multiple outreach projects to detail aquatic resources conservation needs and opportunities. Participated in a National Fishing Week event, a BASS ProShops event, several festivals, and school outreach.
Abstract:
We investigate the 2d O(3) model with the standard action by Monte Carlo simulation at couplings β up to 2.05. We measure the energy density, mass gap and susceptibility of the model, and gather high statistics on lattices of size L ≤ 1024 using the Floating Point Systems T-series vector hypercube and the Thinking Machines Corp.'s Connection Machine 2. Asymptotic scaling does not appear to set in for this action, even at β = 2.10, where the correlation length is 420. We observe a 20% difference between our estimate $m/\Lambda_{\overline{MS}} = 3.52(6)$ at this β and the recent exact analytical result. We use the overrelaxation algorithm interleaved with Metropolis updates and show that decorrelation time scales with the correlation length and the number of overrelaxation steps per sweep. We determine its effective dynamical critical exponent to be z' = 1.079(10); thus critical slowing down is reduced significantly for this local algorithm that is vectorizable and parallelizable.
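As a minimal illustration of the overrelaxation move (in Python/NumPy rather than the vectorised T-series/CM-2 code used in the study), each spin is reflected about the sum of its four neighbours, which leaves the standard action unchanged; sweep ordering and the interleaved Metropolis updates are omitted here.

```python
import numpy as np

def overrelax_site(spins, x, y):
    """One microcanonical overrelaxation step for the 2d O(3) model:
    reflect the spin at (x, y) about the sum of its four neighbours,
    which leaves the standard-action energy unchanged."""
    L = spins.shape[0]
    h = (spins[(x + 1) % L, y] + spins[(x - 1) % L, y] +
         spins[x, (y + 1) % L] + spins[x, (y - 1) % L])
    h2 = np.dot(h, h)
    if h2 > 0.0:
        s = spins[x, y]
        spins[x, y] = 2.0 * np.dot(s, h) / h2 * h - s

# spins: L x L lattice of unit 3-vectors, initialised at random
L = 16
v = np.random.normal(size=(L, L, 3))
spins = v / np.linalg.norm(v, axis=-1, keepdims=True)
for x in range(L):
    for y in range(L):
        overrelax_site(spins, x, y)
```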
We also use cluster Monte Carlo algorithms, non-local update schemes that can greatly increase the efficiency of computer simulations of spin models. The major computational task in these algorithms is connected component labeling, to identify clusters of connected sites on a lattice. We have devised some new SIMD component labeling algorithms and implemented them on the Connection Machine. We investigate their performance when applied to the cluster update of the two-dimensional Ising spin model.
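For orientation, the labeling task itself can be sketched with a serial union-find pass over the bond lattice (plain Python, with hypothetical bond arrays); the SIMD algorithms developed for the Connection Machine are organised quite differently.

```python
def find(parent, i):
    # Path-halving find: walk to the root, shortening the path as we go.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def label_clusters(bonds_right, bonds_down):
    """Serial union-find labeling on an L x L periodic lattice.
    bonds_right[x][y] / bonds_down[x][y]: True if site (x, y) is bonded to its
    right / lower neighbour (e.g. activated Swendsen-Wang bonds)."""
    L = len(bonds_right)
    parent = list(range(L * L))
    idx = lambda x, y: x * L + y
    for x in range(L):
        for y in range(L):
            if bonds_right[x][y]:
                a, b = find(parent, idx(x, y)), find(parent, idx(x, (y + 1) % L))
                if a != b:
                    parent[b] = a
            if bonds_down[x][y]:
                a, b = find(parent, idx(x, y)), find(parent, idx((x + 1) % L, y))
                if a != b:
                    parent[b] = a
    return [find(parent, i) for i in range(L * L)]

# Example: a single bond joins sites 0 and 1 on a 2x2 lattice -> labels [0, 0, 2, 3]
labels = label_clusters([[True, False], [False, False]],
                        [[False, False], [False, False]])
```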
Finally we use a Monte Carlo Renormalization Group method to directly measure the couplings of block Hamiltonians at different blocking levels. For the usual averaging block transformation we confirm the renormalized trajectory (RT) observed by Okawa. For another improved probabilistic block transformation we find the RT, showing that it is much closer to the Standard Action. We then use this block transformation to obtain the discrete β-function of the model which we compare to the perturbative result. We do not see convergence, except when using a rescaled coupling β_E to effectively resum the series. For the latter case we see agreement for $m/\Lambda_{\overline{MS}}$ at β = 2.14, 2.26, 2.38 and 2.50. To three loops $m/\Lambda_{\overline{MS}} = 3.047(35)$ at β = 2.50, which is very close to the exact value $m/\Lambda_{\overline{MS}} = 2.943$. Our last point at β = 2.62 disagrees with this estimate however.
Abstract:
In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and ambitiously extend the reach of the theory to describe everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on assumptions that are more psychologically realistic, with the aim of gaining more precision and descriptive power. Increased psychological realism, however, comes at the cost of a greater number of parameters and model complexity. There is now a plethora of models, based on different assumptions and applicable in differing contextual settings, and selecting the right model to use tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.
We look at evidence from controlled laboratory experiments. Subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and would make distinct predictions about the subjects' choices. Theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests. This imposes computational and economic constraints on using classical experimental design methods. We develop a methodology of adaptive tests, Bayesian Rapid Optimal Adaptive Designs (BROAD), that sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which inform the next most informative test to run. BROAD utilizes the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that, surprisingly, these popular criteria can perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
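A minimal sketch of the adaptive outer loop is shown below (Python/NumPy, with plain expected information gain standing in for the EC2 objective; the noise model, equivalence-class edge weights, and the accelerated lazy-greedy evaluation used by BROAD are not reproduced). The inputs are assumptions for illustration: `tests[t][k, r]` gives the probability that a subject governed by theory k makes response r on test t, and `respond(t)` returns the subject's observed response to test t.

```python
import numpy as np

def bayes_update(posterior, likelihoods):
    """posterior: probabilities over theories; likelihoods: P(observed response | theory)."""
    post = posterior * likelihoods
    return post / post.sum()

def entropy(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def expected_info_gain(posterior, resp_probs):
    """resp_probs[k, r] = P(response r | theory k) for one candidate test.
    A stand-in scoring rule; BROAD uses the EC2 criterion instead."""
    p_resp = posterior @ resp_probs                        # P(response r)
    gain = entropy(posterior)
    for r, pr in enumerate(p_resp):
        if pr > 0:
            gain -= pr * entropy(bayes_update(posterior, resp_probs[:, r]))
    return gain

def run_adaptive(posterior, tests, respond, n_rounds):
    """Greedy adaptive loop: pick the highest-scoring test, observe the subject's
    response, and Bayes-update the posterior over theories."""
    for _ in range(n_rounds):
        scores = [expected_info_gain(posterior, t) for t in tests]
        best = int(np.argmax(scores))
        r = respond(best)                                   # subject's observed choice
        posterior = bayes_update(posterior, tests[best][:, r])
    return posterior
```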
We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely: expected value, prospect theory, constant relative risk aversion (CRRA) and moments models. Subjects are given an initial endowment, and sequentially presented choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments' models. Classifying the subjects into types showed that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e. subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out since it is infeasible in practice, and also since we do not find any signatures of it in our data.
In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, "present bias" models: quasi-hyperbolic (α, β) discounting and fixed cost discounting, and generalized-hyperbolic discounting. 40 subjects from UCLA were given choices between 2 options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for present bias models and hyperbolic discounting, and most subjects were classified as generalized hyperbolic discounting types, followed by exponential discounting.
In these models the passage of time is linear. We instead consider a psychological model where the perception of time is subjective. We prove that when the biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporal choice inconsistency.
We also test the predictions of behavioural theories in the "wild". We pay attention to prospect theory, which emerged as the dominant theory in our lab experiments of risky choice. Loss aversion and reference dependence predict that consumers will behave in a way distinct from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is being offered at a discount, the demand for it will be greater than that explained by its price elasticity. Even more importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications that consumer loss aversion entails, and strategies for competitive pricing.
In future work, BROAD can be applied widely to testing different behavioural models, e.g. in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, can be used to more rapidly eliminate hypotheses and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.
Abstract:
This brief article provides some of the background to the publication of The Freshwater Algal Flora of the British Isles (John et al. 2002). This publication updates West and Fritsch's A Treatise on British Freshwater Algae (2nd edition), which was published in 1927. Taxonomic experts on the major freshwater algal groups were approached and almost all agreed to collaborate. The book, which took more than 10 years to complete, is illustrated with over 2000 line drawings and 20 half-tone plates.
Abstract:
This thesis studies decision making under uncertainty and how economic agents respond to information. The classic model of subjective expected utility and Bayesian updating is often at odds with empirical and experimental results; people exhibit systematic biases in information processing and are often averse to ambiguity. The aim of this work is to develop simple models that capture observed biases and study their economic implications.
In the first chapter I present an axiomatic model of cognitive dissonance, in which an agent's response to information explicitly depends upon past actions. I introduce novel behavioral axioms and derive a representation in which beliefs are directionally updated. The agent twists the information and overweights states in which his past actions provide a higher payoff. I then characterize two special cases of the representation. In the first case, the agent distorts the likelihood ratio of two states by a function of the utility values of the previous action in those states. In the second case, the agent's posterior beliefs are a convex combination of the Bayesian belief and the one which maximizes the conditional value of the previous action. Within the second case a unique parameter captures the agent's sensitivity to dissonance, and I characterize a way to compare sensitivity to dissonance between individuals. Lastly, I develop several simple applications and show that cognitive dissonance contributes to the equity premium and price volatility, asymmetric reaction to news, and belief polarization.
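The second special case can be written schematically as follows (generic notation, not necessarily the chapter's exact representation), with a single parameter capturing sensitivity to dissonance:

```latex
% Directionally updated belief after observing event E, given past action a:
% \mu(\cdot \mid E) is the Bayesian posterior and \nu_a maximizes the conditional
% expected payoff of the past action a among beliefs consistent with E.
\mu_a(\cdot \mid E) \;=\; (1-\delta)\,\mu(\cdot \mid E) \;+\; \delta\,\nu_a(\cdot),
\qquad \delta \in [0,1].
% \delta = 0 recovers Bayesian updating; a larger \delta means greater
% sensitivity to cognitive dissonance.
```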
The second chapter characterizes a decision maker with sticky beliefs, that is, one who does not update his beliefs as much in response to information as a Bayesian decision maker would. This chapter provides axiomatic foundations for sticky beliefs by weakening the standard axioms of dynamic consistency and consequentialism. I derive a representation in which updated beliefs are a convex combination of the prior and the Bayesian posterior. A unique parameter captures the weight on the prior and is interpreted as the agent's measure of belief stickiness or conservatism bias. This parameter is endogenously identified from preferences and is easily elicited from experimental data.
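In the same spirit, the sticky-beliefs representation described above amounts to the following update rule (schematic notation):

```latex
% Updated belief after observing event E:
% \pi is the prior, \pi(\cdot \mid E) the Bayesian posterior, and
% \lambda \in [0,1] the stickiness (conservatism) parameter.
\pi_E(\cdot) \;=\; \lambda\,\pi(\cdot) \;+\; (1-\lambda)\,\pi(\cdot \mid E).
% \lambda = 0 is standard Bayesian updating; \lambda = 1 means beliefs never move.
```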
The third chapter deals with updating in the face of ambiguity, using the framework of Gilboa and Schmeidler. There is no consensus on the correct way to update a set of priors. Current methods either do not allow a decision maker to make an inference about her priors or require an extreme level of inference. In this chapter I propose and axiomatize a general model of updating a set of priors. A decision maker who updates her beliefs in accordance with the model can be thought of as one who chooses a threshold that is used to determine whether a prior is plausible, given some observation. She retains the plausible priors and applies Bayes' rule. This model includes generalized Bayesian updating and maximum likelihood updating as special cases.
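One natural way to formalise the threshold rule described above (again schematic, not necessarily the chapter's axiomatised form) is:

```latex
% C is the set of priors, E the observed event, \alpha \in [0,1] the threshold.
% Priors deemed plausible are those assigning E at least a fraction \alpha of the
% maximal probability; each plausible prior is then updated by Bayes' rule.
C_E \;=\; \Bigl\{\, p(\cdot \mid E) \;:\; p \in C,\;
        p(E) \,\ge\, \alpha \max_{q \in C} q(E) \,\Bigr\}.
% \alpha = 0 (keeping every prior that assigns E positive probability) gives
% generalized Bayesian updating; \alpha = 1 gives maximum likelihood updating.
```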
Abstract:
STEEL, the Caltech-created nonlinear large-displacement analysis software, is currently used by a large number of researchers at Caltech. However, due to its complexity and lack of visualization tools (such as pre- and post-processing capabilities), rapid creation and analysis of models using this software was difficult. SteelConverter was created as a means to facilitate model creation through the use of the industry-standard finite element solver ETABS. This software allows users to create models in ETABS and intelligently convert model information such as geometry, loading, releases, fixity, etc., into a format that STEEL understands. Models that would take several days to create and verify now take several hours or less. The productivity of the researcher, as well as the level of confidence in the model being analyzed, is greatly increased.
It has always been a major goal of Caltech to spread the knowledge created here to other universities. However, due to the complexity of STEEL, it was difficult for researchers or engineers from other universities to conduct analyses. While SteelConverter did help researchers at Caltech improve their research, sending SteelConverter and its documentation to other universities was less than ideal. Issues of version control, individual computer requirements, and the difficulty of releasing updates made a more centralized solution preferable. This is where the idea for Caltech VirtualShaker was born. Through the creation of a centralized website where users can log in, submit, analyze, and process models in the cloud, all of the major concerns associated with the use of SteelConverter were eliminated. Caltech VirtualShaker allows users to create profiles where defaults associated with their most commonly run models are saved, and allows them to submit multiple jobs to an online virtual server to be analyzed and post-processed. The creation of this website not only allowed for more rapid distribution of this tool, but also created a means for engineers and researchers with no access to powerful computer clusters to run computationally intensive analyses without the excessive cost of building and maintaining a computer cluster.
In order to increase confidence in the use of STEEL as an analysis system, as well as to verify the conversion tools, a series of comparisons was done between STEEL and ETABS. Six models of increasing complexity, ranging from a cantilever column to a twenty-story moment frame, were analyzed to determine the ability of STEEL to accurately calculate basic model properties such as elastic stiffness and damping through a free vibration analysis, as well as more complex structural properties such as overall structural capacity through a pushover analysis. These analyses showed very strong agreement between the two programs on every aspect of each analysis. However, these analyses also showed the ability of the STEEL analysis algorithm to converge at significantly larger drifts than ETABS when using the more computationally expensive and structurally realistic fiber hinges. Following the ETABS analysis, it was decided to repeat the comparisons in a program more capable of conducting highly nonlinear analysis, called Perform. These analyses again showed very strong agreement between the two programs in every aspect of each analysis through instability. However, due to some limitations in Perform, free vibration analyses for the three-story one-bay chevron-brace frame, two-bay chevron-brace frame, and twenty-story moment frame could not be conducted. With the current trend towards ultimate capacity analysis, the ability to use fiber-based models allows engineers to gain a better understanding of a building's behavior under these extreme load scenarios.
Following this, a final study was done on Hall's U20 structure [1], in which the structure was analyzed in all three programs and the results compared. The pushover curves from each program were compared and the differences caused by variations in software implementation explained. From this, conclusions can be drawn on the effectiveness of each analysis tool when attempting to analyze structures through the point of geometric instability. The analyses show that while ETABS was capable of accurately determining the elastic stiffness of the model, following the onset of inelastic behavior the analysis tool failed to converge. However, for the small number of time steps over which the ETABS analysis did converge, its results exactly matched those of STEEL, leading to the conclusion that ETABS is not an appropriate analysis package for analyzing a structure through the point of collapse when using fiber elements throughout the model. The analyses also showed that while Perform was capable of calculating the response of the structure accurately, restrictions in the material model resulted in a pushover curve that did not match that of STEEL exactly, particularly post-collapse. However, such problems could be alleviated by choosing a simpler material model.
Abstract:
FRAME3D, a program for the nonlinear seismic analysis of steel structures, has previously been used to study the collapse mechanisms of steel buildings up to 20 stories tall. The present thesis is inspired by the need to conduct similar analysis for much taller structures. It improves FRAME3D in two primary ways.
First, FRAME3D is revised to address specific nonlinear situations involving large displacement/rotation increments, the backup-subdivide algorithm, element failure, and extremely narrow joint hysteresis. The revisions result in superior convergence capabilities when modeling earthquake-induced collapse. The material model of a steel fiber is also modified to allow for post-rupture compressive strength.
Second, a parallel FRAME3D (PFRAME3D) is developed. The serial code is optimized and then parallelized. A distributed-memory divide-and-conquer approach is used for both the global direct solver and element-state updates. The result is an implicit finite-element hybrid-parallel program that takes advantage of the narrow-band nature of very tall buildings and uses nearest-neighbor-only communication patterns.
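A toy sketch of the nearest-neighbour-only communication pattern is given below, using mpi4py; the 1-D partition into story blocks and the exchanged interface data are purely illustrative and do not reflect PFRAME3D's actual data structures or solver.

```python
# Toy nearest-neighbour halo exchange over a 1-D chain of subdomains (story blocks),
# mimicking the communication pattern of a narrow-band tall-building model.
# Run with e.g.:  mpiexec -n 4 python halo_exchange.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank owns one block of stories; only interface quantities are shared.
n_interface = 6                                   # illustrative interface size
my_interface = np.full(n_interface, float(rank))  # placeholder element-state data

up = rank + 1 if rank + 1 < size else MPI.PROC_NULL    # block above, if any
down = rank - 1 if rank >= 1 else MPI.PROC_NULL        # block below, if any

# Standard shift pattern: send upward while receiving from below, and vice versa.
# Exchanges involving MPI.PROC_NULL are no-ops and return None at the chain ends.
from_below = comm.sendrecv(my_interface, dest=up, source=down)
from_above = comm.sendrecv(my_interface, dest=down, source=up)

print(f"rank {rank}: has below={from_below is not None}, above={from_above is not None}")
```

Because each block only ever communicates with the blocks directly above and below it, communication volume scales with the interface size rather than the full model, which is consistent with the speedups growing with building height.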
Using three structures of varied sizes, PFRAME3D is shown to compute reproducible results that agree with those of the optimized 1-core version (displacement time-history response root-mean-squared errors are ~10^(-5) m) with much less wall time (e.g., a dynamic time-history collapse simulation of a 60-story building is computed in 5.69 hrs with 128 cores, a speedup of 14.7 vs. the optimized 1-core version). The maximum speedups attained are shown to increase with building height (as the total number of cores used also increases), and the parallel framework can be expected to be suitable for buildings taller than the ones presented here.
PFRAME3D is used to analyze a hypothetical 60-story steel moment-frame tube building (fundamental period of 6.16 sec) designed according to the 1994 Uniform Building Code. Dynamic pushover and time-history analyses are conducted. Multi-story shear-band collapse mechanisms are observed around mid-height of the building. The use of closely-spaced columns and deep beams is found to contribute to the building's “somewhat brittle” behavior (ductility ratio ~2.0). Overall building strength is observed to be sensitive to whether a model is fracture-capable.
Abstract:
This thesis presents methods for incrementally constructing controllers in the presence of uncertainty and nonlinear dynamics. The basic setting is motion planning subject to temporal logic specifications. Broadly, two categories of problems are treated. The first is reactive formal synthesis when so-called discrete abstractions are available. The fragment of linear-time temporal logic (LTL) known as GR(1) is used to express assumptions about an adversarial environment and requirements of the controller. Two problems of changes to a specification are posed that concern the two major aspects of GR(1): safety and liveness. Algorithms providing incremental updates to strategies are presented as solutions. In support of these, an annotation of strategies is developed that facilitates repeated modifications. A variety of properties are proven about it, including necessity of existence and sufficiency for a strategy to be winning. The second category of problems considered is non-reactive (open-loop) synthesis in the absence of a discrete abstraction. Instead, the presented stochastic optimization methods directly construct a control input sequence that achieves low cost and satisfies an LTL formula. Several relaxations are considered as heuristics to address the rarity of sampling trajectories that satisfy an LTL formula, and are demonstrated to improve convergence rates for a Dubins car and single integrators subject to a recurrence task.