976 results for "time constraints"


Relevance: 70.00%

Abstract:

An anycast flow is a flow that can be connected to any one member of a group of designated (replicated) servers, called an anycast group. In this paper, we derive a set of formulas for calculating the end-to-end delay bound for anycast flows and present novel admission control algorithms for anycast flows with real-time constraints. Given such an anycast group, our algorithms can effectively select paths for the admission and connection of anycast flows based on the smallest evaluated end-to-end delay bounds. We also present a parallel admission control algorithm that efficiently computes the available paths with short delay bounds for the different destinations in the anycast group, so that the path with the shortest delay bound can be chosen.
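The path-selection step of such an admission control can be sketched as follows. This is a minimal illustration with hypothetical names and numbers; the paper's actual delay-bound formulas are not reproduced here, and the bounds below are assumed to be precomputed:

```python
# Minimal sketch of delay-bound-based admission control for an anycast flow.
# The delay_bound values stand in for the paper's per-path end-to-end
# bounds; names and numbers are illustrative only.

def admit_anycast_flow(candidate_paths, deadline):
    """Pick the candidate path with the least end-to-end delay bound.

    candidate_paths: list of (destination, delay_bound) pairs, one per
                     member of the anycast group.
    deadline:        the flow's real-time constraint.

    Returns the chosen destination, or None if no path meets the deadline.
    """
    # Choose the path with the smallest evaluated delay bound.
    destination, bound = min(candidate_paths, key=lambda p: p[1])
    return destination if bound <= deadline else None

# Example: three replicated servers with precomputed delay bounds (ms).
paths = [("server-a", 42.0), ("server-b", 17.5), ("server-c", 28.3)]
print(admit_anycast_flow(paths, deadline=20.0))   # server-b meets 20 ms
print(admit_anycast_flow(paths, deadline=10.0))   # None: no bound <= 10 ms
```

The parallel variant of the algorithm would evaluate the per-destination bounds concurrently before this final minimum is taken.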

Relevance: 70.00%

Abstract:

The flight range of migrating birds depends crucially on the amount of fuel stored by the bird prior to migration or taken up en route at stop-over sites. However, an increase in body mass is associated with an increase in energetic costs, counteracting the benefit of fuel stores. Water imbalance, occurring when water loss exceeds metabolic water production, may constitute another less well recognised problem limiting flight range. The main route of water loss during flight is via the lungs; the rate of loss depends on ambient temperature, relative humidity and ventilatory flow and increases with altitude. Metabolite production results in an increased plasma osmolality, also endangering the proper functioning of the organism during flight. Energetic constraints and water-balance problems may interact in determining several aspects of flight behaviour, such as altitude of flight, mode of flight, lap distance and stop-over duration. To circumvent energetic and water-balance problems, a bird could migrate in short hops instead of long leaps if crossing of large ecological barriers can be avoided. However, although necessitating larger fuel stores and being more expensive, migration by long leaps may sometimes be faster than by short hops. Time constraints are also an important factor in explaining why soaring, which conserves energy and water, occurs exclusively in very large species: small birds can soar at low speeds only. Good navigational skills involving accurate orientation and assessment of altitude and air and ground speed assist in avoiding physiological stress during migration.

Relevance: 70.00%

Abstract:

The new generation of multicore processors opens new perspectives for the design of embedded systems. Multiprocessing, however, poses new challenges to the scheduling of real-time applications, in which ever-increasing computational demands are constantly flanked by the need to meet critical time constraints. Many research works have contributed to this field by introducing new advanced scheduling algorithms. However, although many of these works have solidly demonstrated their effectiveness, the actual support for multiprocessor real-time scheduling offered by current operating systems is still very limited. This dissertation deals with the implementation aspects of real-time schedulers in modern embedded multiprocessor systems. The first contribution is an open-source scheduling framework capable of realizing complex multiprocessor scheduling policies, such as G-EDF, on conventional operating systems, exploiting only their native scheduler from user space. A set of experimental evaluations compares the proposed solution to other research projects that pursue the same goals by means of kernel modifications, highlighting comparable scheduling performance. The principles that underpin the operation of the framework, originally designed for symmetric multiprocessors, have been further extended, first to asymmetric multiprocessors, which are subject to major restrictions such as the lack of support for task migrations, and later to re-programmable hardware architectures (FPGAs). In the latter case, this work introduces a scheduling accelerator, which offloads most of the scheduling operations to the hardware and exhibits extremely low scheduling jitter. The realization of a portable scheduling framework presented many interesting software challenges, one of which was timekeeping. In this regard, a further contribution is a novel data structure called the addressable binary heap (ABH). The ABH, conceptually a pointer-based implementation of a binary heap, shows very interesting average- and worst-case performance when applied to the problem of tick-less timekeeping with high-resolution timers.
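The key property an addressable heap gives a timer subsystem is that an inserted entry can later be cancelled without a search. The dissertation's ABH is pointer-based; the sketch below uses an array-backed heap with handle objects purely to illustrate the addressability idea, and is not the ABH itself:

```python
# Sketch of an "addressable" min-heap: insertion returns a handle that
# stays valid as the heap reorders itself, so an entry (e.g. a pending
# timer) can later be cancelled in O(log n) without searching.
# The dissertation's ABH is pointer-based; this array-backed version
# only illustrates the addressability idea.

class Handle:
    __slots__ = ("key", "index")
    def __init__(self, key, index):
        self.key, self.index = key, index

class AddressableHeap:
    def __init__(self):
        self.items = []             # list of Handle, heap-ordered by key

    def _swap(self, i, j):
        self.items[i], self.items[j] = self.items[j], self.items[i]
        self.items[i].index, self.items[j].index = i, j

    def _sift_up(self, i):
        while i > 0 and self.items[i].key < self.items[(i - 1) // 2].key:
            self._swap(i, (i - 1) // 2)
            i = (i - 1) // 2

    def _sift_down(self, i):
        n = len(self.items)
        while True:
            small = i
            for c in (2 * i + 1, 2 * i + 2):
                if c < n and self.items[c].key < self.items[small].key:
                    small = c
            if small == i:
                return
            self._swap(i, small)
            i = small

    def push(self, key):
        h = Handle(key, len(self.items))
        self.items.append(h)
        self._sift_up(h.index)
        return h                    # caller keeps this to cancel later

    def pop_min(self):
        top = self.items[0]
        self.remove(top)
        return top.key

    def remove(self, h):
        i = h.index
        last = self.items.pop()
        if i < len(self.items):     # h was not the last element
            self.items[i] = last
            last.index = i
            self._sift_down(i)
            self._sift_up(i)

# Timers expiring at t=30, 10, 20; cancel the t=10 one via its handle.
heap = AddressableHeap()
h30, h10, h20 = heap.push(30), heap.push(10), heap.push(20)
heap.remove(h10)                    # O(log n) cancellation, no search
print(heap.pop_min())               # 20
```

Keeping the handle's index field synchronized inside `_swap` is what makes cancellation cheap; a pointer-based layout achieves the same without a backing array.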

Relevance: 70.00%

Abstract:

This thesis develops high-performance real-time signal-processing modules for direction-of-arrival (DOA) estimation in localization systems. It proposes highly parallel algorithms for performing subspace decomposition and polynomial rooting, which are traditionally implemented with sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna-array size increases, the complexity of the signal-processing algorithms increases, making it increasingly difficult to satisfy the real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer a considerable improvement over traditional algorithms, especially for systems with a larger number of antenna-array elements. Singular value decomposition (SVD) and polynomial rooting are two computationally complex steps and act as the bottlenecks to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single-instruction multiple-data (SIMD) hardware or application-specific integrated circuits (ASICs), which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable and easy to implement. First, the thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, bringing performance closer to real time. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable operating frequency, and the system was developed with the objective of achieving high throughput. The modern cores available in FPGAs were used to maximize performance, and these modules are described in detail. Finally, a parallel polynomial-rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the complex dynamics of the root-MUSIC polynomial were exploited to derive this rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time the complex dynamics of the root-MUSIC polynomial have been analyzed to derive such an algorithm. In all, the thesis addresses two major bottlenecks of a direction-of-arrival estimation system by providing simple, high-throughput, parallel algorithms.
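The parallel rooting idea can be sketched with plain Newton iteration run from several starting points. This is the generic method only, not the thesis's root-MUSIC-specific derivation; the example polynomial and starting points are illustrative:

```python
# Generic sketch of polynomial rooting with Newton's method from several
# starting points "in parallel" (a loop here; on FPGA/SIMD hardware each
# start maps to its own processing element). This is plain Newton
# iteration, not the thesis's root-MUSIC-specific variant.

def horner(coeffs, x):
    """Evaluate p(x) and p'(x) together (coeffs in descending powers)."""
    p, dp = 0j, 0j
    for c in coeffs:
        dp = dp * x + p     # derivative accumulates before p is updated
        p = p * x + c
    return p, dp

def newton_root(coeffs, x0, iters=50, tol=1e-12):
    """Iterate x <- x - p(x)/p'(x) from the starting point x0."""
    x = complex(x0)
    for _ in range(iters):
        p, dp = horner(coeffs, x)
        if abs(dp) < tol:   # nearly flat: this start is hopeless
            break
        step = p / dp
        x -= step
        if abs(step) < tol:
            break
    return x

# p(z) = (z - 1)(z - 2j) = z^2 - (1 + 2j)z + 2j, rooted from two starts.
coeffs = [1, -(1 + 2j), 2j]
for start in (0.5, 3j):
    print(newton_root(coeffs, start))   # converges to 1 and 2j respectively
```

Because each starting point iterates independently, the work maps naturally onto an array of identical processing elements, which is the property the thesis exploits.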

Relevance: 70.00%

Abstract:

Linear regression is a technique widely used in digital signal processing. It consists of finding the linear function that best fits a given set of samples. This paper proposes different hardware architectures for implementing the linear regression method on FPGAs, especially targeting area-restricted systems. Area is saved at the cost of constraining the lengths of the input signal to a set of fixed values. We have implemented the proposed scheme in an Automatic Modulation Classifier, meeting the hard real-time constraints that these systems have.
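The computation being mapped to hardware is the closed-form least-squares fit. A minimal software sketch is below; note that for a fixed sample length N the sums of x and x² and the divisor become constants, which is exactly what lets an FPGA implementation trade flexibility in N for area:

```python
# Closed-form least-squares line fit. For a fixed sample length N, the
# sums over x and x^2 and the denominator are compile-time constants,
# which is what an area-constrained FPGA design can exploit.

def fit_line(xs, ys):
    """Return (slope, intercept) minimizing the squared error."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Samples lying exactly on y = 2x + 1.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
print(fit_line(xs, ys))   # (2.0, 1.0)
```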

Relevance: 70.00%

Abstract:

Predicting the running time of programs statically has many applications, ranging from task scheduling in parallel execution to proving the ability of a program to meet strict time constraints. A starting point for attacking this problem is to infer the computational complexity of such programs (or fragments thereof). This is one of the reasons why the development of static analysis techniques for inferring cost-related properties of programs (usually upper and/or lower bounds on actual costs) has received considerable attention.

Relevance: 70.00%

Abstract:

A disruption predictor based on support vector machines (SVMs) has been developed for use at JET. The training process uses thousands of discharges, and high-performance computing has therefore been necessary to obtain the models. In this respect, several models have been generated with data from different JET campaigns. In addition, various kernels (mainly linear and RBF) and parameters have been tested. The main objective of this work has been the implementation of the predictor model under real-time constraints. A "C-code" software application has been developed to simulate the real-time behavior of the predictor. The application reads the signals from the JET database and simulates the real-time data processing, in particular the specific data-hold method to be applied when reading data from the JET ATM real-time network. The simulator is fully configurable by means of text files to select models, signal thresholds, sampling rates, etc. Results with data from campaigns C23 and C28 will be shown.
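The real-time evaluation step of such a predictor reduces to computing the SVM decision function on each incoming sample. A minimal sketch for the linear-kernel case follows; the weights, bias and feature values are illustrative stand-ins, not JET model values:

```python
# Minimal sketch of real-time evaluation of a linear-kernel SVM
# disruption predictor: at each sample, compute f(x) = w.x + b and
# raise an alarm when it crosses zero. Weights and inputs below are
# illustrative, not values from the JET models.

def svm_decision(w, b, x):
    """Decision-function score of sample x for a linear SVM."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def predict_disruption(w, b, samples):
    """Return the index of the first sample flagged as disruptive, or -1."""
    for i, x in enumerate(samples):
        if svm_decision(w, b, x) > 0.0:   # positive class = disruption
            return i
    return -1

# Two-feature toy model: alarm once the weighted features exceed the bias.
w, b = [1.0, 0.5], -2.0
stream = [[0.5, 1.0], [1.0, 1.0], [2.0, 1.0]]   # scores: -1.0, -0.5, 0.5
print(predict_disruption(w, b, stream))          # 2
```

An RBF-kernel model replaces the dot product with kernel evaluations against the support vectors, which is why the real-time cost depends on the kernel choice mentioned above.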

Relevance: 70.00%

Abstract:

This work describes a neural-network-based architecture that represents and estimates object motion in videos. The architecture addresses multiple computer vision tasks, such as image segmentation, object representation and characterization, motion analysis, and tracking. The use of a neural network architecture allows the simultaneous estimation of global and local motion and the representation of deformable objects, and it avoids the problem of finding corresponding features while tracking moving objects. Owing to the parallel nature of neural networks, the architecture has been implemented on GPUs, which allows the system to meet a set of requirements such as time-constraint management, robustness, high processing speed and re-configurability. Experiments are presented that demonstrate the validity of our architecture for solving problems of mobile-agent tracking and motion analysis.

Relevance: 70.00%

Abstract:

In this project, we propose the implementation of a 3D object recognition system optimized to operate under demanding time constraints. The system must be robust, so that objects can be recognized properly in poor lighting conditions and in cluttered scenes with significant levels of occlusion. An important requirement must be met: the system must exhibit reasonable performance running on a low-power mobile GPU computing platform (NVIDIA Jetson TK1), so that it can be integrated into mobile robotics systems, ambient intelligence or ambient assisted living applications. The acquisition system is based on the color and depth (RGB-D) data streams provided by low-cost 3D sensors such as the Microsoft Kinect or PrimeSense Carmine. The range of algorithms and applications to be implemented and integrated is quite broad, ranging from the acquisition, outlier removal and filtering of the input data, and the segmentation and characterization of regions of interest in the scene, to object recognition and pose estimation themselves. Furthermore, in order to validate the proposed system, we will create a 3D object dataset. It will be composed of a set of 3D models reconstructed from common household objects, as well as a handful of test scenes in which those objects appear. The scenes will be characterized by different levels of occlusion, diverse distances from the elements to the sensor, and variations in the pose of the target objects. The creation of this dataset implies the additional development of 3D data acquisition and 3D object reconstruction applications. The resulting system has many possible applications, ranging from mobile robot navigation and semantic scene labeling to human-computer interaction (HCI) systems based on visual information.
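The outlier-removal stage mentioned above is commonly done with a statistical filter over nearest-neighbour distances. The sketch below shows the idea on a toy point cloud; the neighbourhood size and threshold are illustrative, and this is a generic filter, not the project's specific implementation:

```python
# Sketch of the outlier-removal step in an RGB-D pipeline: drop points
# whose mean distance to their k nearest neighbours is far above the
# cloud-wide average (a classic statistical outlier filter; the k and
# std_ratio thresholds here are illustrative only).

def statistical_outlier_filter(points, k=2, std_ratio=1.0):
    """Keep points whose mean k-NN distance is within std_ratio std devs."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    mean_knn = []
    for p in points:
        ds = sorted(dist(p, q) for q in points if q is not p)
        mean_knn.append(sum(ds[:k]) / k)

    mu = sum(mean_knn) / len(mean_knn)
    var = sum((d - mu) ** 2 for d in mean_knn) / len(mean_knn)
    cutoff = mu + std_ratio * var ** 0.5
    return [p for p, d in zip(points, mean_knn) if d <= cutoff]

# A tight cluster near the origin plus one stray point far away.
cloud = [(0, 0, 0), (0.1, 0, 0), (0, 0.1, 0), (0.1, 0.1, 0), (5, 5, 5)]
print(len(statistical_outlier_filter(cloud)))   # 4: the stray point is gone
```

A production pipeline on the Jetson would use a spatial index (or a library filter) instead of the brute-force neighbour search shown here.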

Relevance: 70.00%

Abstract:

The real-time refinement calculus is a formal method for the systematic derivation of real-time programs from real-time specifications, in a style similar to the non-real-time refinement calculi of Back and Morgan. In this paper we extend the real-time refinement calculus with procedures and provide refinement rules for refining real-time specifications to procedure calls. A real-time specification can include constraints not only on what outputs are produced, but also on when they are produced. The derived programs can also include time constraints on when certain points in the program must be reached; these are expressed in the form of deadline commands. Such programs are machine independent. An important consequence of the approach taken is that not only the specifications but the whole refinement process is machine independent. To implement the machine-independent code on a target machine, one has the separate task of showing that the compiled machine code will reach all its deadlines before they expire. For real-time programs, externally observable input and output variables are essential. These differ from local variables in that their values are observable over the duration of the execution of the program. Hence procedures require input and output parameter mechanisms that are references to the actual parameters, so that changes to external inputs are observable within the procedure and changes to output parameters are externally observable. In addition, we allow value and result parameters. These may be auxiliary parameters, which are used for reasoning about the correctness of real-time programs as well as in the expression of timing deadlines, but do not lead to any code being generated for them by a compiler. (c) 2006 Elsevier B.V. All rights reserved.

Relevance: 70.00%

Abstract:

Business environments have become exceedingly dynamic and competitive in recent times. This dynamism is manifested in the form of changing process requirements and time constraints. Workflow technology is currently one of the most promising fields of research in business process automation. However, workflow systems to date do not provide the flexibility necessary to support the dynamic nature of business processes. In this paper we primarily discuss the issues and challenges related to managing change and time in workflows representing dynamic business processes. We also present an analysis of workflow modifications and provide feasibility considerations for the automation of this process.

Relevance: 70.00%

Abstract:

Modern distributed control systems comprise a set of processors interconnected by a suitable communication network. For use in real-time control environments, such systems must be deterministic and generate the specified responses within critical timing constraints. They should also be sufficiently robust to survive predictable events such as communication or processor faults. This thesis considers the problem of coordinating and synchronizing a distributed real-time control system under normal and abnormal conditions. Distributed control systems need to periodically coordinate the actions of several autonomous sites. Often the type of coordination required is the all-or-nothing property of an atomic action. Atomic commit protocols have been used to achieve this atomicity in distributed database systems, which are not subject to deadlines. This thesis addresses the problem of applying time constraints to atomic commit protocols so that decisions can be made within a deadline, and a modified protocol is proposed which is suitable for real-time applications. The thesis also addresses the problem of ensuring that atomicity is provided even if processor or communication failures occur. Previous work has considered the design of atomic commit protocols for use in non-time-critical distributed database systems. In a distributed real-time control system, however, a fault must not allow stringent timing constraints to be violated. This thesis proposes commit protocols using synchronous communications which can be made resilient to a single processor or communication failure and still satisfy deadlines. Previous formal models used to design commit protocols have had adequate state coverability but have omitted timing properties; they also assumed that sites communicated asynchronously and omitted the communications from the model. Timed Petri nets are used in this thesis to specify and design the proposed protocols, which are analysed for consistency and timeliness. The communication system is also modelled within the Petri net specifications so that communication failures can be included in the analysis. Analysis of the timed Petri net and the associated reachability tree is used to show that the proposed protocols always terminate consistently and satisfy their timing constraints. Finally, the applications of this work are described. Two different types of application are considered: real-time databases and real-time control systems. It is shown that it may be advantageous to use synchronous communications in distributed database systems, especially if predictable response times are required. Emphasis is given to the application of the developed commit protocols to real-time control systems. Using the same analysis techniques as those used for the design of the protocols, it can be shown that the overall system performs as expected, both functionally and temporally.
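The core idea of a time-constrained atomic commit can be sketched as a voting round with a deadline: any "no" vote, or any vote arriving after the deadline, forces an abort. This is a plain two-phase-commit decision rule with a timeout added for illustration, not the thesis's specific protocol:

```python
# Sketch of an atomic commit decision with a deadline: the coordinator
# collects participant votes and aborts if any vote is "no" OR a vote
# misses the deadline. A generic illustration of time-constrained
# atomic commitment, not the thesis's protocol.

def commit_decision(votes, arrival_times, deadline):
    """Decide COMMIT/ABORT from each participant's vote and arrival time.

    votes:         list of booleans (True = participant votes commit)
    arrival_times: when each vote reached the coordinator
    deadline:      latest time by which a decision must be made
    """
    for vote, t in zip(votes, arrival_times):
        if t > deadline:        # late vote: treat as a failure, abort
            return "ABORT"
        if not vote:            # any "no" vote aborts the atomic action
            return "ABORT"
    return "COMMIT"

print(commit_decision([True, True, True], [1, 2, 3], deadline=5))   # COMMIT
print(commit_decision([True, True, True], [1, 2, 9], deadline=5))   # ABORT
print(commit_decision([True, False, True], [1, 2, 3], deadline=5))  # ABORT
```

Treating a missed deadline exactly like a failure is what keeps the decision time bounded, which is the property the thesis's timed Petri net analysis verifies.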

Relevance: 70.00%

Abstract:

Full text: The title of the book gives us a major clue to the innovative approach developed by Anne Freadman in her analysis of a particular Colette corpus, the one devoted to autobiographical writing: Les Vrilles de la vigne, Mes apprentissages, La Maison de Claudine, Sido, L'Étoile Vesper and Le Fanal bleu. Freadman follows the powerful lure of the Rimbaldian vieilles vieilleries and its echoes with Colette's fondness for collecting objects, people and memories. To this must be added a technical aspect, that of the study of the genre of Colette's writing. Freadman argues that, by largely avoiding the autobiographical form, the writer achieves a new way of 'telling time', collecting anecdotes and detail taken from the quotidian and setting them within an all-encompassing preoccupation with time. This provides the second part of the title. The sonata form directs the sequence of the book, orchestrated into five parts, from 'exposition' to 'first subject' to 'bridge' to 'second subject' to 'recapitulation'. This has the advantage of enabling Freadman to move and progress between distinct themes (autobiography first, then alternative forms) with grace, whilst preserving within her own writing what she sees as the essence of Colette's relationship to time in her 'Livres-Souvenirs', the telling of time. This 'telling of time' is itself therefore cleverly subjected to the time constraints and freedoms of musical composition. Freadman's 'Exposition' takes us through a discussion of the autobiographical genre, analysing the texts against a number of theorists, from Lejeune to Benjamin and Ricoeur, before launching into 'Colette and Autobiography'. It argues pertinently that Colette did not write a 'sustained' autobiography, even in the most autobiographical of her writings, Mes apprentissages. Measured against Goodwin's three sources for autobiography (confession, apologia and memoirs), Colette's autobiographical writings appear to be at odds with all of them. Freadman then goes on, in Part II of her argument, to persuasively uncover a project that rejects self-scrutiny and has no autobiographical strategy. In 'Collecting Time', despite claims of continuity, narrative logic and causality are abandoned in favour of a collection of fragments, family stories that are built up generation after generation into family legends. A close and fruitful analysis of Sido leads us to a study of 'The Art of Ending', concentrating on L'Étoile Vesper and Le Fanal bleu. The closing chapter gives a fascinating reading of La Naissance du jour as an exemplar of the way in which the two subjects developed in Freadman's volume are cast together: Colette's own working through of the autobiographical genre, and her refusal to write memoirs in favour of collecting memories, and the strategies she uses for her purpose. In 'Recapitulation', her concluding chapter, Freadman adroitly encapsulates her analysis in a fetching title: 'Fables of Time'. Indeed, the whole premise of her book is to move away from the autobiographical genre, having acknowledged the links and the debt the corpus owes to it, and into a study of the multiple and fruitful ways in which Colette tells time. The rich and varied readings of the material, competently informed by theoretical input, together with acute sensitivity to the corpus, mark out this study as incontournable for Colette scholars.

Relevance: 60.00%

Abstract:

Numerous expert elicitation methods have been suggested for generalised linear models (GLMs). This paper compares three relatively new approaches to eliciting expert knowledge in a form suitable for Bayesian logistic regression. The methods were trialled with two experts in order to model the habitat suitability of the threatened Australian brush-tailed rock-wallaby (Petrogale penicillata). The first elicitation approach is a geographically assisted indirect predictive method with a geographic information system (GIS) interface. The second is a predictive indirect method that uses an interactive graphical tool. The third uses a questionnaire to elicit expert knowledge directly about the impact of a habitat variable on the response. Two variables (slope and aspect) are used to examine the prior and posterior distributions obtained with the three methods. The results indicate some similarities and dissimilarities between the expert-informed priors of the two experts formulated from the different approaches. The choice of elicitation method depends on the statistical knowledge of the expert, their mapping skills, time constraints, accessibility to experts and the funding available. This trial reveals that expert knowledge can be important when modelling rare-event data, such as threatened species, because experts can provide additional information that may not be represented in the dataset. However, care must be taken with the way in which this information is elicited and formulated.
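However the knowledge is elicited, it ultimately enters the model as an informative prior on a regression coefficient. The sketch below shows that mechanism in the simplest possible form: an elicited mean and standard deviation become a Gaussian prior, and a MAP estimate is found by gradient ascent. The data, prior values and optimizer settings are illustrative only and unrelated to the paper's rock-wallaby models:

```python
# Sketch of how an elicited prior enters Bayesian logistic regression:
# the expert's belief about a coefficient becomes a Gaussian prior
# (mean, sd), and the MAP estimate is found by gradient ascent on the
# log-posterior. All numbers below are illustrative.

import math

def map_logistic(xs, ys, prior_mean, prior_sd, steps=5000, lr=0.01):
    """MAP estimate of a single logistic-regression coefficient (no intercept)."""
    beta = prior_mean                      # start at the expert's prior mean
    for _ in range(steps):
        # Gradient of log-likelihood: sum_i (y_i - sigmoid(beta*x_i)) * x_i
        grad = sum((y - 1.0 / (1.0 + math.exp(-beta * x))) * x
                   for x, y in zip(xs, ys))
        # Gradient of Gaussian log-prior: -(beta - mean) / sd^2
        grad -= (beta - prior_mean) / prior_sd ** 2
        beta += lr * grad
    return beta

# Toy habitat data: presence (1) grows more likely with the covariate.
xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]
ys = [0, 0, 0, 1, 1, 1]

vague = map_logistic(xs, ys, prior_mean=0.0, prior_sd=10.0)
tight = map_logistic(xs, ys, prior_mean=0.5, prior_sd=0.2)
print(abs(tight - 0.5) < abs(vague - 0.5))   # tight prior keeps beta near 0.5: True
```

The contrast between the two fits is the practical point of elicitation: a confident expert prior restrains the coefficient, while a vague prior lets the (here perfectly separated) data dominate.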

Relevance: 60.00%

Abstract:

Many current chemistry programs privilege de-contextualised conceptual learning, often limited by a narrow selection of pedagogies that too often ignore the realities of students' own lives and interests (e.g., Tytler, 2007). One new approach that offers hope for improving students' engagement in learning chemistry, and its perceived relevance, is the context-based approach. This study investigated how teaching and learning occurred in one Year 11 context-based chemistry classroom. Through an interpretive methodology using a case-study design, the teaching and learning that occurred during one term (ten weeks) of a unit on Water Quality are described. The researcher was a participant observer in the study who co-designed the unit of work with the teacher. The research questions explored the structure and implementation of the context-based approach, the circumstances under which students connected concepts and context in the context-based classroom, and the outcomes of the approach for the students and the teacher. A dialectical sociocultural theoretical framework using the dialectics of structure | agency and agency | passivity was used as a lens to explore the interactions between learners in different fields, such as the field of the classroom and the field of the local community. The findings of this study highlight the difficulties teachers face when implementing a new pedagogical approach. Time constraints, and the need to give students opportunities to demonstrate a level of conceptual understanding that satisfied the teacher, hindered a full implementation of the approach. The study found that for high-achieving (above average) and sound (average) students, connections between the sanctioned science content of the school curriculum and the students' out-of-school worlds were realised when students actively engaged in fields that contextualised inquiry and gave them a purpose for learning. Fluid transitions, or the to-ing and fro-ing between concepts and contexts, occurred when structures in the classroom afforded students the agency to connect concepts and contexts. The implications for teaching by a context-based approach suggest that keeping the context central, by teaching content on a "need-to-know" basis, contextualises the chemistry for students. Also, if teachers provide opportunities for student-student interactions and written work, student learning can improve.