901 results for Installment schedule


Relevance:

10.00%

Publisher:

Abstract:

Dynamic analysis techniques have been proposed to detect potential deadlocks. Analyzing and comprehending each potential deadlock to determine whether it is feasible in a real execution requires significant programmer effort. Moreover, empirical evidence shows that existing analyses are quite imprecise. This imprecision further wastes the manual effort invested in reasoning about non-existent defects. In this paper, we address the imprecision of existing analyses and the subsequent manual effort needed to reason about deadlocks. We propose a novel approach to deadlock detection, designing a dynamic analysis that intelligently leverages execution traces. To reduce the manual effort, we replay the program, making the execution follow a schedule derived from the observed trace. For a real deadlock, its feasibility is automatically verified if the replay causes the execution to deadlock. We have implemented our approach as part of WOLF and have analyzed many large (up to 160 KLoC) Java programs. Our experimental results show that we are able to identify 74% of the reported defects as true (or false) positives automatically, leaving very few defects for manual analysis. The overhead of our approach is negligible, making it a compelling tool for practical adoption.
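
The schedule-driven replay that this abstract describes can be illustrated with a small Python sketch. This is a hypothetical toy, not WOLF's actual implementation: an event enforces the lock-acquisition interleaving that an analysis might flag, and a timed acquire confirms whether the flagged cycle really deadlocks.

```python
import threading

# Toy replay of a flagged lock-order cycle: t1 takes A then B, t2 takes
# B then A. The event enforces the suspicious interleaving; a timed
# acquire detects the resulting hang instead of blocking forever.
lock_a, lock_b = threading.Lock(), threading.Lock()
t2_holds_b = threading.Event()
confirmed = []

def t1():
    with lock_a:
        t2_holds_b.wait()                    # schedule: let t2 grab B first
        if not lock_b.acquire(timeout=0.5):
            confirmed.append("t1")           # timed out: the cycle is real
        else:
            lock_b.release()

def t2():
    with lock_b:
        t2_holds_b.set()
        if not lock_a.acquire(timeout=0.5):
            confirmed.append("t2")
        else:
            lock_a.release()

threads = [threading.Thread(target=t1), threading.Thread(target=t2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
deadlock_confirmed = bool(confirmed)
```

Without the event forcing the interleaving, the same two threads usually run to completion, which is exactly why manual reasoning about such reports is hard.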

In WSNs, communication traffic is often time- and space-correlated, with multiple nodes in proximity starting to transmit simultaneously. Such a situation is known as spatially correlated contention. Random access methods for resolving such contention suffer from a high collision rate, whereas traditional distributed TDMA scheduling techniques primarily try to improve network capacity by reducing the schedule length. Usually, spatially correlated contention persists only for a short duration, so generating an optimal or suboptimal schedule is not very useful. Additionally, if an algorithm takes a long time to schedule, it not only introduces additional delay in the data transfer but also consumes more energy. In this paper, we present a distributed TDMA slot scheduling (DTSS) algorithm that considerably reduces the time required to perform scheduling while restricting the schedule length to the maximum degree of the interference graph. The DTSS algorithm supports unicast, multicast, and broadcast scheduling simultaneously, without any modification to the protocol. We have analyzed the protocol's average-case performance and also simulated it using the Castalia simulator to evaluate its runtime performance. Both analytical and simulation results show that our protocol considerably reduces the time required for scheduling.
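
The degree bound that DTSS targets can be illustrated with a minimal sketch. The sequential greedy pass below is not the paper's distributed message-exchange protocol; it only shows why, on an interference graph, a conflict-free slot assignment never needs more slots than the maximum degree plus one.

```python
# Greedy slot assignment on an interference graph (adjacency dict).
# Each node takes the smallest slot not used by an already-scheduled
# neighbour, so no node ever needs a slot index beyond its own degree.
def tdma_slots(adj):
    slots = {}
    for node in adj:   # a distributed protocol negotiates this; here it is sequential
        taken = {slots[n] for n in adj[node] if n in slots}
        slot = 0
        while slot in taken:
            slot += 1
        slots[node] = slot
    return slots

# 4-node interference graph: node 2 interferes with everyone (degree 3).
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
slots = tdma_slots(adj)
```

No two interfering nodes share a slot, and the largest slot index never exceeds the maximum degree of the graph.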

Computing the maximum of sensor readings arises in several environmental, health, and industrial monitoring applications of wireless sensor networks (WSNs). We characterize several novel design trade-offs that arise when green energy harvesting (EH) WSNs, which promise perpetual lifetimes, are deployed for this purpose. The nodes harvest renewable energy from the environment for communicating their readings to a fusion node, which then periodically estimates the maximum. For a randomized transmission schedule, in which a pre-specified number of randomly selected nodes transmit in each sensor data collection round, we analyze the mean absolute error (MAE), defined as the mean of the absolute difference between the true maximum and the fusion node's estimate in each round. We optimize the transmit power and the number of scheduled nodes to minimize the MAE, both when the nodes have channel state information (CSI) and when they do not. Our results highlight how the optimal system operation depends on the EH rate, the availability and cost of acquiring CSI, quantization, and the size of the scheduled subset. Our analysis applies to a general class of sensor reading and EH random processes.
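
The MAE of such a randomized schedule can be estimated by straightforward Monte Carlo simulation. The sketch below uses illustrative assumptions (Gaussian readings, a lossless channel) and simply reproduces the definition: the mean absolute gap between the true maximum and the maximum over a random subset of k scheduled nodes.

```python
import random
import statistics

def mae_random_subset(n_nodes, k, rounds=2000, seed=1):
    """Monte Carlo estimate of the MAE when only k of n_nodes randomly
    scheduled nodes report and the fusion node takes their maximum."""
    rng = random.Random(seed)
    errors = []
    for _ in range(rounds):
        readings = [rng.gauss(0.0, 1.0) for _ in range(n_nodes)]
        reported = rng.sample(readings, k)   # the randomized schedule
        errors.append(abs(max(readings) - max(reported)))
    return statistics.fmean(errors)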

Time division multiple access (TDMA) based channel access mechanisms perform better than contention-based mechanisms in terms of channel utilization, reliability, and power consumption, especially for high-data-rate applications in wireless sensor networks (WSNs). Most existing distributed TDMA scheduling techniques can be classified as either static or dynamic. The primary purpose of static TDMA scheduling algorithms is to improve channel utilization by generating a schedule of smaller length, but they usually take a long time to produce it and hence are not suitable for WSNs whose network topology changes dynamically. Dynamic TDMA scheduling algorithms, on the other hand, generate a schedule quickly but are not efficient in terms of schedule length. In this paper, we propose a novel scheme for TDMA scheduling in WSNs that generates a compact schedule comparable to static scheduling algorithms while matching the runtime performance of dynamic scheduling algorithms. Furthermore, the proposed distributed TDMA scheduling algorithm can trade off schedule length against the time required to generate the schedule, allowing WSN developers to tune performance to the requirements of the application and of re-scheduling. Finally, the proposed TDMA scheduling is tolerant of packet loss on an error-prone wireless channel. The algorithm has been simulated using the Castalia simulator to compare its performance with that of other algorithms in terms of generated schedule length and the time required to generate the TDMA schedule. Simulation results show that the proposed algorithm generates a compact schedule in very little time.

In wireless sensor networks (WSNs), contention occurs when two or more nodes in proximity simultaneously try to access the channel. Contention causes collisions, which are very likely when traffic is correlated. Excessive collisions affect not only the reliability and QoS of the application but also the lifetime of the network. It is well known that random access mechanisms do not handle correlated contention efficiently and therefore suffer from a high collision rate. Most existing TDMA scheduling techniques try to find an optimal or a sub-optimal schedule. Usually, correlated contention persists only for a short duration, so it is not worthwhile to take a long time to generate an optimal or sub-optimal schedule. We propose a randomized distributed TDMA scheduling (RD-TDMA) algorithm that quickly generates a feasible (not necessarily optimal) schedule to handle correlated contention in WSNs. In RD-TDMA, each node negotiates a slot with its neighbors through a message-exchange mechanism. The proposed protocol has been simulated using the Castalia simulator to evaluate its runtime performance. Simulation results show that the RD-TDMA algorithm considerably reduces the time required to schedule.

In metropolitan cities, public transportation plays a vital role in the mobility of people, and new routes must be introduced frequently as the city grows in population and size. Whenever a new route is introduced or bus frequency is increased, the non-revenue kilometers covered by the buses increase, because depots and route starting/ending points are at different places. These non-revenue, or dead, kilometers depend on the distance between a depot and a route's starting or ending point. Dead kilometers not only cause revenue loss but also increase operating cost because of the extra kilometers covered by buses, so reducing them is necessary for the economic health of the public transportation system. This study therefore focuses on minimizing dead kilometers by optimizing the allocation of buses to depots based on the shortest distance between each depot and each route's starting/ending points. We also consider depot capacity and the time period of operation when allocating buses, to ensure parking safety and proper maintenance. A mathematical model incorporating these parameters, a mixed integer program, is developed and applied to the routes currently operated by the Bangalore Metropolitan Transport Corporation (BMTC) to obtain an optimal allocation of buses to depots. A database of dead kilometers for all BMTC schedules is generated from the Form-4 (trip sheet) of each schedule to analyze depot-wise and division-wise dead kilometers. The study also suggests alternative locations where depots could be sited to reduce dead kilometers. Copyright (C) 2015 John Wiley & Sons, Ltd.
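
A toy version of the allocation problem shows the structure of the model. The instance below uses invented distances and capacities and is solved by brute-force enumeration rather than a MIP solver, but the objective (total dead kilometers) and the depot-capacity constraint mirror the formulation the abstract describes.

```python
from itertools import product

# Hypothetical dead-kilometre matrix: (depot, route) -> km to route start.
dead_km = {("D1", "R1"): 2, ("D1", "R2"): 7,
           ("D2", "R1"): 5, ("D2", "R2"): 3}
capacity = {"D1": 1, "D2": 2}     # buses each depot can park
routes = ["R1", "R2"]
depots = ["D1", "D2"]

best = None
for assignment in product(depots, repeat=len(routes)):
    load = {d: assignment.count(d) for d in depots}
    if any(load[d] > capacity[d] for d in depots):
        continue                   # violates depot capacity
    cost = sum(dead_km[(d, r)] for d, r in zip(assignment, routes))
    if best is None or cost < best[0]:
        best = (cost, dict(zip(routes, assignment)))
```

On this instance the optimum keeps each route at its nearest feasible depot (R1 at D1, R2 at D2, 5 dead km in total); at BMTC scale the same model is handed to an integer-programming solver.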

The polyhedral model provides an expressive intermediate representation that is convenient for the analysis and subsequent transformation of affine loop nests. Several heuristics exist for achieving complex program transformations in this model. However, there is also considerable scope to utilize this model to tackle the problem of automatic memory footprint optimization. In this paper, we present a new automatic storage optimization technique that can achieve both intra-array and inter-array storage reuse with a pre-determined schedule for the computation. Our approach works by finding statement-wise storage partitioning hyperplanes that partition a unified global array space so that values with overlapping live ranges are not mapped to the same partition. Our heuristic is driven by a fourfold objective function that not only minimizes the dimensionality and storage requirements of the arrays required for each high-level statement, but also maximizes inter-statement storage reuse. The storage mappings obtained using our heuristic can be asymptotically better than those obtained by any existing technique. We implement our technique and demonstrate its practical impact by evaluating its effectiveness on several benchmarks chosen from the domains of image processing, stencil computations, and high-performance computing.
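
The kind of storage reuse the paper automates can be seen in a hand-written example: a 1-D Jacobi-style stencil whose (steps+1) x N history collapses to two rows under a "t mod 2" mapping, because the live ranges of rows t and t+1 never overlap. This sketch is illustrative only and is not the paper's partitioning heuristic.

```python
# 1-D Jacobi-style smoothing, once with the full (steps+1) x N history
# and once with a 2 x N buffer under a "t mod 2" storage mapping.
def stencil_full(a, steps):
    hist = [list(a)]
    for _ in range(steps):
        prev = hist[-1]
        cur = ([prev[0]]
               + [(prev[i - 1] + prev[i] + prev[i + 1]) / 3
                  for i in range(1, len(a) - 1)]
               + [prev[-1]])
        hist.append(cur)
    return hist[-1]

def stencil_mod2(a, steps):
    buf = [list(a), list(a)]       # rows t and t+1 have disjoint live ranges
    for t in range(steps):
        prev, cur = buf[t % 2], buf[(t + 1) % 2]
        for i in range(1, len(a) - 1):
            cur[i] = (prev[i - 1] + prev[i] + prev[i + 1]) / 3
        cur[0], cur[-1] = prev[0], prev[-1]
    return buf[steps % 2]
```

Both versions compute identical values; the modulo mapping is exactly the intra-array reuse that a storage partitioning hyperplane expresses, found automatically instead of by hand.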

A descriptive study, developed from the perspective of new institutionalism, addressing the agenda lock (sobrestamento de pauta) that results from the application of constitutional and standing-order rules. It seeks to identify the origins of this institute in Brazilian presidentialism, as well as to understand its role both in the processing of the bills for which it was conceived and in the floor proceedings of the Houses of the National Congress. It attempts to explain the mechanisms through which agenda lock, besides guaranteeing the swift consideration of certain bills deemed urgent, contributes decisively to making the head of the Executive Branch the main controller of the agenda of the Brazilian legislative process.

EXECUTIVE SUMMARY

WORKSHOP OVERVIEW
- Introduction
- Goals and objectives of the workshop
- Organizing committee, participants, sponsors and venue
- Workshop activity

NEMURO.FISH COUPLED WITH A POPULATION DYNAMICS MODEL (SAURY)
- Introduction
- One-cohort case with no reproduction
- Two (overlapping) cohort scenario with no reproduction
- Two-cohort case with no reproduction and body size-dependent mortality
- Two-cohort case with reproduction and KL-dependent mortality
- Conclusions and future perspectives

LAGRANGIAN MODEL OF NEMURO.FISH
- Tasks and members
- Description of model and preliminary results
- Future tasks

COUPLING NEMURO TO HERRING BIOENERGETICS
- Overview
- Details of the NEMURO_Herring model
- Example simulation of NEMURO_Herring
- Future plans

REFERENCES

APPENDICES
- Workshop participants
- Workshop schedule
- Lagrangian model (FORTRAN program)

(55-page document)

The European Commission Report on Competition in Professional Services found that prices recommended by professional bodies have a significant negative effect on competition, since they may facilitate the coordination of prices between service providers and/or mislead consumers about reasonable price levels. Professional associations argue, first, that a fee schedule may help their members properly calculate the cost of services, avoiding excessive charges and reducing consumers' search costs, and, second, that recommended prices are very useful for cost appraisal when a litigant is ordered to pay the legal expenses of the opposing party. Thus, recommended fee schedules could be justified to some extent if they represented the cost of providing the services. We test this hypothesis using cross-section data on a subset of prices recommended by 52 Spanish bar associations and cost data on their territorial jurisdictions. Our empirical results indicate that the prices recommended by bar associations are unrelated to the cost of legal services, and we therefore conclude that recommended prices merely have an anticompetitive effect.

An assessment of the status of the Atlantic stock of red drum is conducted using recreational and commercial data from 1986 through 1998. This assessment updates data and analyses from the 1989, 1991, 1992 and 1995 stock assessments of Atlantic coast red drum (Vaughan and Helser, 1990; Vaughan 1992; 1993; 1996). Since 1981, coastwide recreational catches ranged between 762,300 pounds in 1980 and 2,623,900 pounds in 1984, while commercial landings ranged between 60,900 pounds in 1997 and 422,500 pounds in 1984. By weight of fish caught, Atlantic red drum constitute a predominantly recreational fishery (between 85 and 95% during the 1990s). Commercially, red drum continue to be harvested as part of mixed-species fisheries. Using available length-frequency distributions and age-length keys, recreational and commercial catches are converted to catch in numbers at age. Separable and tuned virtual population analyses are conducted on the catch in numbers at age to obtain estimates of fishing mortality rates and population size (including recruitment to age 1). In turn, these estimates of fishing mortality rates, combined with estimates of growth (length and weight), sex ratios, sexual maturity and fecundity, are used to estimate yield per recruit, escapement to age 4, and static (or equilibrium) spawning potential ratio (static SPR, based on both female biomass and egg production). Three virtual population analysis approaches (separable, spreadsheet, and FADAPT) were applied to catch matrices for two time periods (early: 1986-1991, and late: 1992-1998) and two regions (Northern: North Carolina and north; Southern: South Carolina through the east coast of Florida). Additional catch matrices were developed based on different treatments of catch-and-release recreationally caught red drum (B2-type). These approaches included assuming 0% mortality (BASE0) versus 10% mortality for B2 fish.
For the 10% mortality on B2 fish, sizes were assumed to be the same as for caught fish (BASE1), or a positive difference in size distribution between the early period and the later period (DELTA), or intermediate (PROP). Hence, a total of 8 catch matrices were developed (2 regions and 4 B2 assumptions for 1986-1998), to which the three VPA approaches were applied. The question of when offshore emigration or reduced availability begins (during or after age 3) continues to be a source of bias that tends to result in overestimates of fishing mortality. Additionally, the continued assumption (Vaughan and Helser, 1990; Vaughan 1992; 1993; 1996) of no fishing mortality on adults (ages 6 and older) causes a bias that results in underestimates of fishing mortality for adult ages (0 versus some positive value). Because of emigration and the effect of the slot limit in the later period, a range of relative exploitations of age-3 to age-2 red drum was considered. Tuning indices were developed from the MRFSS, along with state indices, for use in the spreadsheet and FADAPT VPAs. The SAFMC Red Drum Assessment Group (Appendix A) favored the FADAPT approach with the catch matrix based on DELTA and a selectivity for age 3 relative to age 2 of 0.70 for the northern region and 0.87 for the southern region. In the northern region, estimates of static SPR increased from about 1.3% for the period 1987-1991 to approximately 18% (15% and 20%) for the period 1992-1998. For the southern region, estimates of static SPR increased from about 0.5% for the period 1988-1991 to approximately 15% for the period 1992-1998. Population models used in this assessment (specifically yield per recruit and static spawning potential ratio) are based on equilibrium assumptions: because no direct estimates are available of the current status of the adult stock, model results imply potential longer-term, equilibrium effects.
Because the current status of the adult stock is unknown, a specific rebuilding schedule cannot be determined. However, the duration of a rebuilding schedule should reflect, in part, a measure of the generation time of the fish species under consideration. For a long-lived but relatively early-spawning species such as red drum, mean generation time would be on the order of 15 to 20 years based on age-specific egg production. Maximum age is 50 to 60 years for the northern region and about 40 years for the southern region. The ASMFC Red Drum Board's first-phase recovery goal of increasing %SPR to at least 10% appears to have been met. (PDF contains 79 pages)
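
Static SPR, the per-recruit quantity used throughout the assessment, can be sketched as follows. The parameter values are hypothetical and the model is deliberately simplified (knife-edge maturity, fishing mortality applied equally to all ages, weight standing in for egg production), unlike the age-specific selectivity and fecundity used in the actual assessment.

```python
import math

def spawners_per_recruit(F, M, mature_age, max_age, weight_at_age):
    """Spawning output per recruit: survivors-at-age times weight-at-age,
    summed over mature ages, under total mortality M + F."""
    n, total = 1.0, 0.0
    for age in range(1, max_age + 1):
        if age >= mature_age:
            total += n * weight_at_age(age)
        n *= math.exp(-(M + F))        # survival to the next age
    return total

def static_spr(F, M=0.1, mature_age=4, max_age=40,
               weight_at_age=lambda a: a ** 1.5):
    """Static SPR: spawning output per recruit at F, relative to F = 0."""
    fished = spawners_per_recruit(F, M, mature_age, max_age, weight_at_age)
    unfished = spawners_per_recruit(0.0, M, mature_age, max_age, weight_at_age)
    return fished / unfished
```

By construction SPR equals 1 when F = 0 and falls monotonically as F rises, which is why a recovery goal can be stated as "increase %SPR to at least 10%".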

Chelmsford College has created an observation, appraisal and continuing professional development (CPD) cycle by successfully integrating a collection of bespoke web-based systems together within its intranet. Students have benefitted from improved teaching and learning because of the rapid, transparent and thorough cycle of staff being observed, appraised and given appropriate CPD. The College has also saved time and money by being able to use single-source data to schedule observations, appraisals and CPD to individuals' needs.

To investigate the practice of inclusive design in an industrial context and to gain insight into industrial perspectives, eight UK design consultancies' participation in the DBA Design Challenges was reviewed through formal interviews. It was found that progress has been made in raising inclusive design awareness. However, some useful practices, such as user involvement in the design process, were found not to be feasible in real situations, largely because of the often tight schedule and the complexity of the task. Consequently, effective ways of capturing user information need to be explored, and accessible design support tools need to be provided, by working with designers.

This document presents the results of the monitoring of a repaired coral reef injured by the M/V Connected vessel grounding incident of March 27, 2001. The grounding occurred in Florida state waters within the boundaries of the Florida Keys National Marine Sanctuary (FKNMS). The National Oceanic and Atmospheric Administration (NOAA) and the Board of Trustees of the Internal Improvement Trust Fund of the State of Florida (“State of Florida” or “state”) are the co-trustees for the natural resources within the FKNMS and are thus responsible for mediating the restoration of the damaged marine resources and monitoring the outcome of the restoration actions. The restoration monitoring program tracks patterns of biological recovery, determines the success of restoration measures, and assesses the site's resiliency to environmental and anthropogenic disturbances over time. The monitoring program at the Connected site was to have included an assessment of the structural stability of the installed restoration modules and the biological condition of reattached corals, performed on the following schedule: immediately (i.e., baseline), 1, 3, and 6 years after restoration, and following any catastrophic event. Restoration of this site was completed on July 20, 2001. Due to unavoidable delays in the settlement of the case, the “baseline” monitoring event for this site occurred in July 2004. The catastrophic monitoring event occurred on August 31, 2004, some 2 ½ weeks after the passage of Hurricane Charley, which passed nearby, almost directly over the Dry Tortugas. In September 2005, the year-one monitoring event occurred shortly after the passage of Hurricane Katrina, some 70 km to the NW. This report presents the results of all three monitoring events. (PDF contains 37 pages.)