849 results for 289900 Other Information, Computing and Communication Sciences
Abstract:
Mavron, Vassili; McDonough, T.P.; Key, J.D. (2006) 'Information sets and partial permutation decoding for codes from finite geometries', Finite Fields and Their Applications 12(2), pp. 232-247. RAE2008
Abstract:
In this paper, we present Slack Stealing Job Admission Control (SSJAC)---a methodology for scheduling periodic firm-deadline tasks with variable resource requirements, subject to controllable Quality of Service (QoS) constraints. In a system that uses Rate Monotonic Scheduling (RMS), SSJAC augments the slack stealing algorithm of Thuel et al. with an admission control policy to manage the variability in the resource requirements of the periodic tasks. This enables SSJAC to take advantage of the 31% of utilization that RMS cannot use, as well as any utilization unclaimed by jobs that are not admitted into the system. Using SSJAC, each task in the system is assigned a resource utilization threshold that guarantees the minimal acceptable QoS for that task (expressed as an upper bound on the rate of missed deadlines). Job admission control is used to ensure that (1) only those jobs that will complete by their deadlines are admitted, and (2) tasks do not interfere with each other, so a job can monopolize only the slack in the system, never the time guaranteed to jobs of other tasks. We have evaluated SSJAC against RMS and Statistical RMS (SRMS). Ignoring overhead issues, SSJAC consistently provides better performance than RMS in overload and, under certain conditions, better performance than SRMS. In addition, to evaluate the optimality of SSJAC in an absolute sense, we have characterized the performance of SSJAC by comparing it to an inefficient yet optimal scheduler for task sets with harmonic periods.
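For context, the roughly 31% headroom follows from the Liu and Layland least upper bound for RMS, U(n) = n(2^(1/n) - 1), which falls toward ln 2 ≈ 0.69 as the number of tasks grows. Below is a minimal sketch of that bound together with a toy admission test in the same spirit; the function names and parameters are illustrative assumptions, not the paper's SSJAC algorithm.

```python
# Minimal sketch: the RMS schedulable-utilization bound and a toy admission
# test in its spirit. Illustrative only; not the paper's SSJAC implementation.
import math

def rms_utilization_bound(n: int) -> float:
    """Liu-Layland least upper bound on schedulable utilization for n
    periodic tasks under Rate Monotonic Scheduling."""
    return n * (2 ** (1.0 / n) - 1)

def admit_job(demand: float, available_slack: float) -> bool:
    """Hypothetical admission test: admit a job only if its execution demand
    fits in the slack available before its deadline, so it can monopolize
    only slack, never time guaranteed to jobs of other tasks."""
    return demand <= available_slack

if __name__ == "__main__":
    for n in (2, 4, 8, 1000):
        print(f"n={n:4d}  RMS bound={rms_utilization_bound(n):.3f}")
    # The bound tends to ln 2 ~ 0.693, leaving ~31% of the processor that a
    # guarantee-only RMS schedule cannot claim.
    print(f"ln 2 = {math.log(2):.3f}")
```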
Abstract:
We present a thorough characterization of the access patterns in blogspace -- a fast-growing constituent of the content available through the Internet -- which comprises a rich interconnected web of blog postings and comments by an increasingly prominent user community that collectively defines what has become known as the blogosphere. Our characterization of over 35 million read, write, and administrative requests spanning a 28-day period is done from three different blogosphere perspectives. The server view characterizes the aggregate access patterns of all users to all blogs; the user view characterizes how individual users interact with blogosphere objects (blogs); the object view characterizes how individual blogs are accessed. Our findings support two important conclusions. First, we show that the nature of interactions between users and objects is fundamentally different in blogspace than in traditional web content. Access to objects in blogspace can be conceived as part of an interaction between an author and their readership. As we show in our work, such interactions range from one-to-many "broadcast-type" and many-to-one "registration-type" communication between an author and their readers, to multi-way, iterative "parlor-type" dialogues among members of an interest group. This more interactive nature of the blogosphere leads to interesting traffic and communication patterns, which differ from those observed in traditional web content. Second, we identify and characterize novel features of the blogosphere workload, and we investigate the similarities and differences between typical web server workloads and blogosphere server workloads. Given the increasing share of blogspace traffic, understanding such differences is important for capacity planning and traffic engineering purposes, for example.
Abstract:
Mapping novel terrain from sparse, complex data often requires the resolution of conflicting information from sensors working at different times, locations, and scales, and from experts with different goals and situations. Information fusion methods help resolve inconsistencies in order to distinguish correct from incorrect answers, as when evidence variously suggests that an object's class is car, truck, or airplane. The methods developed here consider a complementary problem, supposing that information from sensors and experts is reliable though inconsistent, as when evidence suggests that an object's class is car, vehicle, or man-made. Underlying relationships among objects are assumed to be unknown to the automated system or the human user. The ARTMAP information fusion system uses distributed code representations that exploit the neural network's capacity for one-to-many learning in order to produce self-organizing expert systems that discover hierarchical knowledge structures. The system infers multi-level relationships among groups of output classes, without any supervised labeling of these relationships. The procedure is illustrated with two image examples.
Abstract:
Classifying novel terrain or objects from sparse, complex data may require the resolution of conflicting information from sensors working at different times, locations, and scales, and from sources with different goals and situations. Information fusion methods can help resolve inconsistencies, as when evidence variously suggests that an object's class is car, truck, or airplane. The methods described here address a complementary problem, supposing that information from sensors and experts is reliable though inconsistent, as when evidence suggests that an object's class is car, vehicle, and man-made. Underlying relationships among classes are assumed to be unknown to the automated system or the human user. The ARTMAP information fusion system uses distributed code representations that exploit the neural network's capacity for one-to-many learning in order to produce self-organizing expert systems that discover hierarchical knowledge structures. The fusion system infers multi-level relationships among groups of output classes, without any supervised labeling of these relationships. The procedure is illustrated with two image examples, but is not limited to the image domain.
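A toy illustration of the hierarchy-inference idea (this is not the ARTMAP fusion system itself; the example labelings and the implication test are invented for this sketch): given one-to-many labelings such as {car, vehicle, man-made}, a class A can be placed below a class B whenever every example carrying A also carries B.

```python
# Toy sketch of inferring multi-level class relationships from one-to-many
# labelings, with no supervised labeling of the relationships themselves.
# Not the ARTMAP algorithm; the data and the implication test are invented.
from itertools import permutations

examples = [                      # hypothetical per-object label sets
    {"car", "vehicle", "man-made"},
    {"truck", "vehicle", "man-made"},
    {"building", "man-made"},
    {"car", "vehicle", "man-made"},
]
classes = set().union(*examples)

def is_below(a: str, b: str) -> bool:
    """True if every observed example labeled a is also labeled b."""
    carriers = [ex for ex in examples if a in ex]
    return bool(carriers) and all(b in ex for ex in carriers)

for a, b in sorted(p for p in permutations(classes, 2) if is_below(*p)):
    print(f"{a} -> {b}")  # e.g. car -> vehicle, vehicle -> man-made
```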
Abstract:
Background: Hospital clinicians are increasingly expected to practice evidence-based medicine (EBM) in order to minimize medical errors and ensure quality patient care, but they experience obstacles to information-seeking. The introduction of a Clinical Informationist (CI) is explored as a possible solution. Aims: This paper investigates the self-perceived information needs, behaviour, and skill levels of clinicians in two Irish public hospitals. It also explores clinicians' perceptions of and attitudes to the introduction of a CI into their clinical teams. Methods: A questionnaire survey approach was utilised for this study, with 22 clinicians in two hospitals. Data analysis was conducted using descriptive statistics. Results: Analysis showed that clinicians experience diverse information needs for patient care, and that barriers such as time constraints and insufficient access to resources hinder their information-seeking. Findings also showed that clinicians struggle to fit information-seeking into their working day, regularly seeking to answer patient-related queries outside of working hours. Attitudes towards the concept of a CI were predominantly positive. Conclusion: This paper highlights the factors that characterise and limit hospital clinicians' information-seeking, and suggests the CI as a potentially useful addition to the clinical team, helping clinicians to resolve their information needs for patient care.
Abstract:
This article examines the behavior of equity trading volume and volatility for the individual firms composing the Standard & Poor's 100 composite index. Using multivariate spectral methods, we find that fractionally integrated processes best describe the long-run temporal dependencies in both series. Consistent with a stylized mixture-of-distributions hypothesis model in which the aggregate "news"-arrival process possesses long-memory characteristics, the long-run hyperbolic decay rates appear to be common across each volume-volatility pair.
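For reference, the standard fractional-integration definitions behind these claims (textbook material, not drawn from the article itself): a fractionally integrated series applies the fractional difference operator, and for 0 < d < 1/2 its autocorrelations decay hyperbolically rather than geometrically, which is the long-memory signature. A common long-run decay rate across a volume-volatility pair then corresponds to a shared value of d.

```latex
(1 - L)^d x_t = \varepsilon_t,
\qquad
(1 - L)^d = \sum_{k=0}^{\infty} \binom{d}{k} (-L)^k,
\qquad
\rho(k) \sim C\, k^{2d-1} \ \text{as } k \to \infty .
```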
Abstract:
We describe a general technique for determining upper bounds on maximal values (or lower bounds on minimal costs) in stochastic dynamic programs. In this approach, we relax the nonanticipativity constraints that require decisions to depend only on the information available at the time a decision is made and impose a "penalty" that punishes violations of nonanticipativity. In applications, the hope is that this relaxed version of the problem will be simpler to solve than the original dynamic program. The upper bounds provided by this dual approach complement lower bounds on values that may be found by simulating with heuristic policies. We describe the theory underlying this dual approach and establish weak duality, strong duality, and complementary slackness results that are analogous to the duality results of linear programming. We also study properties of good penalties. Finally, we demonstrate the use of this dual approach in an adaptive inventory control problem with an unknown and changing demand distribution and in valuing options with stochastic volatilities and interest rates. These are complex problems of significant practical interest that are quite difficult to solve to optimality. In these examples, our dual approach requires relatively little additional computation and leads to tight bounds on the optimal values. © 2010 INFORMS.
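Below is a minimal sketch of the zero-penalty ("perfect information") version of this dual bound on a toy optimal-stopping problem; the random-walk setup and all parameters are invented for illustration and are not the paper's examples. Relaxing the nonanticipativity constraints lets the inner decision maker see the whole path, so the expectation of the pathwise maximum upper-bounds the optimal nonanticipative value; a good penalty would shrink the gap.

```python
# Sketch: information-relaxation (dual) upper bound with zero penalty versus
# the exact DP value, on a toy stopping problem. Illustrative assumptions only.
from itertools import product

T = 4             # horizon
STEPS = (+1, -1)  # symmetric random walk increments, each with probability 1/2

paths = list(product(STEPS, repeat=T))
prob = 1.0 / len(paths)

def positions(path):
    """All positions visited by the walk, starting from 0."""
    x, xs = 0, [0]
    for s in path:
        x += s
        xs.append(x)
    return xs

# Dual bound: with perfect information (and zero penalty), stop at the path max.
upper = sum(prob * max(positions(p)) for p in paths)

# Exact nonanticipative value by backward induction:
# V_T(x) = x;  V_t(x) = max(x, E[V_{t+1}(x +/- 1)]).
def V(t, x):
    if t == T:
        return x
    return max(x, 0.5 * (V(t + 1, x + 1) + V(t + 1, x - 1)))

print(f"perfect-information upper bound: {upper:.3f}")
print(f"optimal DP value:                {V(0, 0):.3f}")  # weak duality: bound >= value
```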
Abstract:
pp. 141-151
Abstract:
pp. 57-70
Abstract:
In this paper we look at ways of delivering and assessing learning on database units offered on higher degree programmes (MSc) in the School of Computing and Mathematical Sciences at the University of Greenwich. Of critical importance are the teaching methods employed for verbal exposition and practical laboratory exercises, together with a careful evaluation of assessment methods and assessment tools, given that databases involve not only database design but also the use of practical tools such as database management system (DBMS) software, as well as human designers, database administrators (DBAs), and end users. Our goal is to clearly identify potential key success factors in delivering and assessing learning in both the practical and theoretical aspects of database course units.
Abstract:
Exploring climate and anthropogenic impacts on marine ecosystems requires an understanding of how trophic components interact. However, integrative end-to-end ecosystem studies (experimental and/or modelling) are rare. Experimental investigations often concentrate on a particular group or individual species within a trophic level, while tropho-dynamic field studies typically employ either a bottom-up approach concentrating on the phytoplankton community or a top-down approach concentrating on the fish community. Likewise, the emphasis within modelling studies is usually placed upon phytoplankton-dominated biogeochemistry or on aspects of fisheries regulation. In consequence, the roles of zooplankton communities (protists and metazoans) linking phytoplankton and fish communities are typically under-represented, if not (especially in fisheries models) ignored. Where represented in ecosystem models, zooplankton are usually incorporated in an extremely simplistic fashion, using empirical descriptions that merge the various interacting physiological functions governing zooplankton growth and development, thereby ignoring physiological feedback mechanisms. Here we demonstrate, within a modelled plankton food-web system, how trophic dynamics are sensitive to small changes in the parameter values describing zooplankton vital rates, and thus the importance of using appropriate zooplankton descriptors. Through a comprehensive review, we reveal the mismatch between empirical understanding and modelling activities, identifying important issues that warrant further experimental and modelling investigation. These include: food selectivity, kinetics of prey consumption and interactions with assimilation and growth, the form of voided material, and mortality rates at different age-stages relative to prior nutrient history. In particular, there is a need for dynamic data series in which predator and prey of known nutrient history are studied interacting under varied pH and temperature regimes.
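The sensitivity claim can be illustrated with a generic closed NPZ (nutrient-phytoplankton-zooplankton) food-web model; the equations below are a textbook-style closure and every parameter value is an invented assumption, not the authors' model. A 20% change in a single zooplankton vital rate (the maximum grazing rate) visibly shifts the simulated state.

```python
# Hedged sketch: a minimal closed NPZ model run twice with a small change in
# the zooplankton maximum grazing rate g_max, to illustrate how sensitive
# trophic dynamics are to zooplankton vital-rate parameters. All values are
# illustrative assumptions, not the authors' model.
def run_npz(g_max, days=120.0, dt=0.01):
    mu, kN = 1.0, 0.5    # phytoplankton max growth rate (1/d), N half-saturation
    kP = 0.5             # grazing half-saturation
    beta = 0.6           # zooplankton assimilation efficiency
    mP, mZ = 0.1, 0.15   # phytoplankton / zooplankton mortality rates (1/d)
    N, P, Z = 4.0, 0.5, 0.2
    for _ in range(int(days / dt)):   # forward-Euler integration
        uptake = mu * N / (kN + N) * P
        graze = g_max * P / (kP + P) * Z
        dP = uptake - graze - mP * P
        dZ = beta * graze - mZ * Z
        dN = -uptake + mP * P + (1 - beta) * graze + mZ * Z  # closes the budget
        N, P, Z = N + dt * dN, P + dt * dP, Z + dt * dZ
    return N, P, Z

for g_max in (1.0, 1.2):  # a 20% change in one zooplankton vital rate
    N, P, Z = run_npz(g_max)
    print(f"g_max={g_max:.1f}  ->  N={N:.2f}  P={P:.2f}  Z={Z:.2f}")
```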