991 results for time-consistency
Abstract:
Economic models of crime and punishment implicitly assume that the government can credibly commit to the fines, sentences, and apprehension rates it has chosen. We study the government's problem when credibility is an issue. We find that several of the standard predictions of the economic model of crime and punishment are robust to commitment, but that credibility may in some cases result in lower apprehension rates, and hence a higher crime rate, compared to the static version of the model.
Abstract:
Energy Conservation Measure (ECM) project selection is made difficult by real-world constraints: limited resources to implement savings retrofits, various suppliers in the market, and project financing alternatives. Many of these energy-efficient retrofit projects should be viewed as a series of investments with annual returns for these traditionally risk-averse agencies. Given a list of available ECMs, federal, state and local agencies must determine how to implement projects at lowest cost. The most common methods of implementation planning are suboptimal with respect to cost. Federal, state and local agencies can obtain greater returns on their energy conservation investment than under traditional methods, regardless of the implementing organization. This dissertation outlines several approaches to improve the traditional energy conservation models. Any public buildings in regions with similar energy conservation goals, in the United States or internationally, can also benefit greatly from this research. Additionally, many private owners of buildings are under mandates to conserve energy; e.g., Local Law 85 of the New York City Energy Conservation Code requires any building, public or private, to meet the most current energy code upon any alteration or renovation. Thus, both public and private stakeholders can benefit from this research. The research in this dissertation advances and presents models that decision-makers can use to optimize the selection of ECM projects with respect to the total cost of implementation. A practical application of a two-level mathematical program with equilibrium constraints (MPEC) improves the current best practice for agencies concerned with making the most cost-effective selection leveraging energy services companies or utilities. The two-level model maximizes savings to the agency and profit to the energy services companies (Chapter 2).
An additional model presented leverages a single congressional appropriation to implement ECM projects (Chapter 3). Returns from implemented ECM projects are used to fund additional ECM projects. In these cases, fluctuations in energy costs and uncertainty in the estimated savings severely influence ECM project selection and the amount of the appropriation requested. A proposed risk-aversion method imposes a minimum on the number of projects completed in each stage. A comparative method using Conditional Value at Risk is analyzed, and time consistency is also addressed in this chapter. This work demonstrates how a risk-based, stochastic, multi-stage model with binary decision variables at each stage provides a much more accurate estimate for planning than the agency's traditional approach and deterministic models. Finally, in Chapter 4, a rolling-horizon model allows for subadditivity and superadditivity of the energy savings to simulate interactive effects between ECM projects. The approach makes use of inequalities (McCormick, 1976) to re-express constraints that involve the product of binary variables with an exact linearization (related to the convex hull of those constraints). This model additionally shows the benefits of learning between stages while remaining consistent with the single congressional appropriation framework.
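The exact linearization referred to above can be illustrated on a minimal case (an illustrative sketch with generic variable names, not the dissertation's actual model): for binary variables x and y, the product z = x·y can be replaced by the linear constraints z ≥ 0, z ≥ x + y − 1, z ≤ x and z ≤ y, which describe the convex hull of the feasible points.

```python
from itertools import product

def mccormick_feasible_z(x, y):
    """Binary z values satisfying the McCormick constraints
    z >= 0, z >= x + y - 1, z <= x, z <= y."""
    return [z for z in (0, 1) if z >= x + y - 1 and z <= x and z <= y]

# For binary x and y the constraints admit exactly one z,
# namely the product x*y, so the linearization is exact.
for x, y in product((0, 1), repeat=2):
    assert mccormick_feasible_z(x, y) == [x * y]
```

Because binary variables sit at the corners of the unit box, the McCormick envelope is tight there, which is why this reformulation is exact rather than a relaxation.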
Abstract:
This paper tests the internal consistency of time trade-off utilities. We find significant violations of consistency in the direction predicted by loss aversion. The violations disappear for higher gauge durations. We show that loss aversion can also explain that for short gauge durations time trade-off utilities exceed standard gamble utilities. Our results suggest that time trade-off measurements that use relatively short gauge durations, like the widely used EuroQol algorithm (Dolan 1997), are affected by loss aversion and lead to utilities that are too high.
Abstract:
This paper investigates which properties money-demand functions have to satisfy to be consistent with multidimensional extensions of Lucas' (2000) versions of the Sidrauski (1967) and the shopping-time models. We also investigate how such classes of models relate to each other regarding the rationalization of money demands. We conclude that money demand functions rationalizable by the shopping-time model are always rationalizable by the Sidrauski model, but that the converse is not true. The log-log money demand with an interest-rate elasticity greater than or equal to one and the semi-log money demand are counterexamples.
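In standard notation (generic constants A, B and elasticities η, ξ, not taken from the paper), the two functional forms cited as counterexamples can be written as:

```latex
\text{log-log: } \ln m = \ln A - \eta \ln i \quad (\eta \ge 1),
\qquad
\text{semi-log: } \ln m = \ln B - \xi i ,
```

where m denotes real money balances and i the nominal interest rate.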
Abstract:
Quantum field theory with an external background can be considered a consistent model only if the backreaction is relatively small with respect to the background. To find the corresponding consistency restrictions on an external electric field and its duration in QED and QCD, we analyze the mean energy density of quantized fields for an arbitrary constant electric field E acting during a large but finite time T. Using the corresponding asymptotics with respect to the dimensionless parameter eET², one can see that the leading contributions to the energy are due to the creation of particles by the electric field. Assuming that these contributions are small in comparison with the energy density of the electric background, we establish the above-mentioned restrictions, which in fact bound from above the time scales over which the electric field is depleted by backreaction.
Abstract:
Time-domain reflectometry (TDR) is an important technique to obtain series of soil water content measurements in the field. Diode-segmented probes represent an improvement in TDR applicability, allowing measurements of the soil water content profile with a single probe. In this paper we explore an extensive soil water content dataset obtained by tensiometry and TDR from internal drainage experiments in two consecutive years in a tropical soil in Brazil. Comparisons between the variation patterns of the water content estimated by both methods exhibited evidence of deterioration of the TDR system over this two-year period under field conditions. The results showed consistency in the variation pattern for the tensiometry data, whereas TDR estimates were inconsistent, with sensitivity decreasing over time. This suggests that difficulties may arise for the long-term use of this TDR system under tropical field conditions.
Abstract:
Background & aim: Many disease outbreaks of food origin are caused by foods prepared in Food Service and Nutrition Units of hospitals, affecting hospitalized patients who, in most cases, are immunocompromised and therefore at a higher risk of severe worsening of their clinical status. The aim of this study was to determine the variations in temperature and the time-temperature factor of hospital diets. Methods: The time and temperature for the preparation of 4 diets of modified consistency were determined on 5 nonconsecutive days in a hospital Diet and Nutrition Unit at the end of preparation and during the maintenance period, portioning and distribution at 3 sites, i.e., the first, the middle and the last to receive the diets. Results and discussion: All foods reached an adequate temperature at the end of cooking, but temperature varied significantly from the maintenance period to the final distribution, characterizing critical periods for microorganism proliferation. During holding, temperatures that presented a risk were reached by 16.7% of the meats and 59% of the salads of the general diet, by 16.7% of the garnishes in the bland diet and by 20% of the meats and garnishes in the viscous diet. The same occurred at the end of distribution for 100% of the hot samples and of the salads and for 61% of the desserts. None of the preparations remained at risk temperature for a time exceeding that established by law. Conclusion: The exposure to inadequate temperature did not last long enough to pose risks to the patient.
Abstract:
The aim of this research was to examine the nature and order of recovery of orientation and memory functioning during Post-Traumatic Amnesia (PTA) in relation to injury severity and PTA duration. The Westmead PTA Scale was used across consecutive testing days to assess the recovery of orientation and memory during PTA in 113 patients. Two new indices were examined: a Consistency-of-Recovery index and a Duration-to-Recovery index. A predictable order of recovery was observed during PTA: orientation-to-person recovered sooner and more consistently than the following cluster: orientation-to-time, orientation-to-place, and the ability to remember a face and name. However, the type of memory functioning required for the face-and-name recall task recovered more consistently than that required for memorizing three pictures. An important overall finding was that the order of recovery of orientation and memory functioning was dependent upon both the elapsed days since injury and the consistency of recovery. The newly developed indices were shown to be a valuable means of accounting for differences between groups in the elapsed days to recovery of orientation and memory. These indices also clearly increase the clinical utility of the Westmead PTA Scale and supply an objective means of charting (and potentially predicting) patients' recovery on the different components of orientation and memory throughout their period of hospitalization.
Abstract:
The importance of disturbance, and of the subsequent rate and pattern of recovery, has long been recognised as a driver of community structure. Community recovery is affected by processes operating at local and regional scales, yet community-level responses to a standardised disturbance have seldom been examined at regional scales (i.e. among regions under different environmental conditions). Here, we mechanically disturbed rocky intertidal lower-shore algal-dominated assemblages at three locations within each of three regions of the Lusitanian biogeographical province (Azores, northern Portugal and the Canary Islands). All organisms were cleared from experimental plots and succession was followed over a period of 12 months, at which time we formally compared the assemblage structure to that of unmanipulated controls. Early patterns of recovery of disturbed communities varied among regions and were positively influenced by temperature, but not by regional species richness. Different components of the assemblage responded differently to disturbance. Regional differences in the relative abundance and identity of species had a key influence on overall assemblage recovery. This study highlights how regional-scale differences in environmental conditions and species pools are important determinants of the recovery of disturbed communities.
Abstract:
In this paper, we present some of the fault tolerance management mechanisms being implemented in the Multi-μ architecture, namely its support for replica non-determinism. In this architecture, fault tolerance is achieved by node active replication, with software-based replica management and fault-tolerance-transparent algorithms. A software layer implemented between the application and the real-time kernel, the Fault Tolerance Manager (FTManager), is responsible for the transparent incorporation of the fault tolerance mechanisms. The active replication model can be implemented either by imposing replica determinism or by keeping replica consistency at critical points, by means of interactive agreement mechanisms. One of the Multi-μ architecture goals is to identify such critical points, relieving the underlying system from performing the interactive agreement at every Ada dispatching point.
Abstract:
The foreseen evolution of chip architectures towards a higher number of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as an alternative to lock-based synchronisation. However, STM relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we defend the role of the transaction contention manager in reducing the number of transaction retries and in helping the real-time scheduler assure schedulability. For this purpose, the contention management policy should be aware of on-line scheduling information.
Abstract:
Task scheduling is one of the key mechanisms to ensure timeliness in embedded real-time systems. Such systems often need to execute not only application tasks but also some urgent routines (e.g. error-detection actions, consistency checkers, interrupt handlers) with minimum latency. Although fixed-priority schedulers such as Rate-Monotonic (RM) are in line with this need, they usually make only a low processor utilization available to the system. Moreover, this availability usually decreases with the number of considered tasks. If dynamic-priority schedulers such as Earliest Deadline First (EDF) are applied instead, high system utilization can be guaranteed, but the minimum latency for executing urgent routines may not be ensured. In this paper we describe a scheduling model according to which urgent routines are executed at the highest priority level and all other system tasks are scheduled by EDF. We show that the guaranteed processor utilization for the assumed scheduling model is at least as high as the one provided by RM for two tasks, namely 2(√2 − 1). Seven polynomial-time tests for checking system timeliness are derived and proved correct. The proposed tests are compared against each other and against an exact but exponential running-time test.
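The utilization figure quoted above is the classical Liu–Layland least-upper bound n(2^(1/n) − 1) for RM, evaluated at n = 2. A quick numerical check (the function name is ours, not the paper's):

```python
def rm_utilization_bound(n):
    """Liu & Layland least-upper utilization bound for
    Rate-Monotonic scheduling of n periodic tasks."""
    return n * (2 ** (1 / n) - 1)

# For two tasks the bound is 2(2^(1/2) - 1) = 2(sqrt(2) - 1):
print(round(rm_utilization_bound(2), 3))  # 0.828
```

As n grows the bound decreases towards ln 2 ≈ 0.693, which is the "availability decreases with the number of tasks" effect the abstract mentions.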
Abstract:
Recent embedded processor architectures containing multiple heterogeneous cores and non-coherent caches have renewed attention to the use of Software Transactional Memory (STM) as a building block for developing parallel applications. STM promises to ease concurrent and parallel software development, but relies on the possibility of aborting conflicting transactions to maintain data consistency, which in turn affects the execution time of tasks carrying transactions. Because of this, the timing behaviour of the task set may not be predictable, so it is crucial to limit the execution time overheads resulting from aborts. In this paper we formalise a FIFO-based algorithm to order the sequence of commits of concurrent transactions. We then propose and evaluate two non-preemptive scheduling strategies and one SRP-based fully-preemptive strategy, in order to avoid transaction starvation.
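A FIFO commit order of the kind described can be sketched as a ticket scheme (a minimal illustration with hypothetical names, not the paper's formalisation): each transaction takes a ticket when it starts, and a transaction may commit only after every earlier ticket has committed.

```python
import threading

class FifoCommitOrder:
    """Toy FIFO commit manager: transactions commit in the order
    they began, bounding how often a transaction can be overtaken."""

    def __init__(self):
        self._cond = threading.Condition()
        self._next_ticket = 0   # next ticket handed out by begin()
        self._now_serving = 0   # ticket currently allowed to commit

    def begin(self):
        with self._cond:
            ticket = self._next_ticket
            self._next_ticket += 1
            return ticket

    def commit(self, ticket):
        with self._cond:
            # Block until every earlier transaction has committed.
            while ticket != self._now_serving:
                self._cond.wait()
            self._now_serving += 1
            self._cond.notify_all()

order = FifoCommitOrder()
t0, t1 = order.begin(), order.begin()
order.commit(t0)  # t0 started first, so it commits first
order.commit(t1)  # t1 may now commit
```

In a real-time system the blocking wait would be replaced by the scheduling strategies the paper evaluates; this sketch only fixes the commit order itself.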
Abstract:
Real-time collaborative editing systems are common nowadays, and their advantages are widely recognized. Examples of such systems include Google Docs and ShareLaTeX, among others. This thesis aims to adopt this paradigm in a software development environment. The OutSystems visual language lends itself well to this kind of collaboration, since the visual code enables a natural flow of knowledge between developers regarding the developed code; furthermore, communication and coordination are simplified. This proposal explores collaboration over a very structured and rigid model, where collaboration is currently performed through the copy-modify-merge paradigm, in which a developer obtains a private copy of the shared repository, modifies it in isolation and later uploads the changes to be merged with modifications concurrently produced by other developers. To this end, we designed and implemented an extension to the OutSystems Platform that enables real-time collaborative editing. The solution guarantees consistency among the artefacts distributed across several developers working on the same project. We believe that it is possible to achieve a much more intense collaboration over the same models with a low negative impact on the individual productivity of each developer.