368 results for TIMED
Abstract:
Current ultra-wideband (UWB) communication systems use sequences of short, narrow, precisely timed pulses to transmit information. Disadvantages of UWB communication systems include their interference with other conventional wireless systems and their reliance on time-hopping schemes for multiple access. This paper presents a novel UWB data modulation scheme based on pulse shaping, which adds flexibility to data modulation in UWB communication systems. The scheme encodes data in both the timing and the frequency spectrum of the transmitted pulse, with the potential to improve data throughput rates and to lower interference between UWB and narrowband systems.
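The abstract does not give the pulse family or symbol mapping; the sketch below is only a minimal illustration of encoding data in both the timing and the spectrum of a UWB pulse, assuming Gaussian monocycle pulses whose position within a frame carries one bit and whose width (hence spectral content) carries another. The frame length, positions and widths are invented for illustration, not the authors' design.

```python
import numpy as np

def gaussian_monocycle(t, centre, width):
    """First derivative of a Gaussian: a common UWB pulse shape."""
    x = (t - centre) / width
    return -2.0 * x * np.exp(-x * x)

def modulate(bits, frame=2e-9, fs=50e9):
    """Map bit pairs to pulses: first bit -> pulse position (timing),
    second bit -> pulse width (spectrum). All values are illustrative."""
    t = np.arange(0, frame * (len(bits) // 2), 1 / fs)
    signal = np.zeros_like(t)
    for k in range(0, len(bits), 2):
        frame_start = (k // 2) * frame
        centre = frame_start + (0.5e-9 if bits[k] == 0 else 1.2e-9)   # timing
        width = 60e-12 if bits[k + 1] == 0 else 120e-12               # spectrum
        signal += gaussian_monocycle(t, centre, width)
    return t, signal

t, s = modulate([0, 1, 1, 0])   # two frames, two bits per frame
```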
Abstract:
Modern distributed control systems comprise a set of processors interconnected by a suitable communication network. For use in real-time control environments, such systems must be deterministic and generate specified responses within critical timing constraints. They should also be sufficiently robust to survive predictable events such as communication or processor faults. This thesis considers the problem of coordinating and synchronizing a distributed real-time control system under normal and abnormal conditions. Distributed control systems need to periodically coordinate the actions of several autonomous sites. Often the type of coordination required is the all-or-nothing property of an atomic action. Atomic commit protocols have been used to achieve this atomicity in distributed database systems which are not subject to deadlines. This thesis addresses the problem of applying time constraints to atomic commit protocols so that decisions can be made within a deadline. A modified protocol is proposed which is suitable for real-time applications. The thesis also addresses the problem of ensuring that atomicity is provided even if processor or communication failures occur. Previous work has considered the design of atomic commit protocols for use in non-time-critical distributed database systems. However, in a distributed real-time control system a fault must not allow stringent timing constraints to be violated. This thesis proposes commit protocols using synchronous communications which can be made resilient to a single processor or communication failure and still satisfy deadlines. Previous formal models used to design commit protocols have had adequate state coverability but have omitted timing properties. They also assumed that sites communicated asynchronously and omitted the communications from the model. Timed Petri nets are used in this thesis to specify and design the proposed protocols, which are analysed for consistency and timeliness. The communication system is also modelled within the Petri net specifications so that communication failures can be included in the analysis. Analysis of the Timed Petri net and the associated reachability tree is used to show that the proposed protocols always terminate consistently and satisfy timing constraints. Finally, the applications of this work are described. Two different types of application are considered: real-time databases and real-time control systems. It is shown that it may be advantageous to use synchronous communications in distributed database systems, especially if predictable response times are required. Emphasis is given to the application of the developed commit protocols to real-time control systems. Using the same analysis techniques as those used for the design of the protocols, it can be shown that the overall system performs as expected both functionally and temporally.
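The thesis specifies its protocols with Timed Petri nets rather than code; purely to illustrate the idea of an atomic commit decision bounded by a deadline, the sketch below shows a coordinator that aborts whenever all votes cannot be collected before its deadline. The vote interface, timing values and failure handling are illustrative assumptions, not the protocol developed in the thesis.

```python
import time
from enum import Enum

class Decision(Enum):
    COMMIT = "commit"
    ABORT = "abort"

def timed_commit(participants, deadline_s):
    """Collect votes from all participants; abort if the deadline expires first.

    `participants` is a list of callables returning True (vote commit) or
    False (vote abort). In a real system these would be synchronous messages."""
    start = time.monotonic()
    for vote in participants:
        if time.monotonic() - start > deadline_s:
            return Decision.ABORT          # deadline missed: fail safe
        if not vote():
            return Decision.ABORT          # any "no" vote aborts the action
    return Decision.COMMIT

# Example: two sites that always vote commit, with a 50 ms decision deadline.
print(timed_commit([lambda: True, lambda: True], deadline_s=0.05))
```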
Abstract:
Hard real-time systems are a class of computer control systems that must react to demands of their environment by providing `correct' and timely responses. Since these systems are increasingly being used in systems with safety implications, it is crucial that they are designed and developed to operate in a correct manner. This thesis is concerned with developing formal techniques that allow the specification, verification and design of hard real-time systems. Formal techniques for hard real-time systems must be capable of capturing the system's functional and performance requirements, and previous work has proposed a number of techniques which range from the mathematically intensive to those with some mathematical content. This thesis develops formal techniques that contain both an informal and a formal component because it is considered that the informality provides ease of understanding and the formality allows precise specification and verification. Specifically, the combination of Petri nets and temporal logic is considered for the specification and verification of hard real-time systems. Approaches that combine Petri nets and temporal logic by allowing a consistent translation between each formalism are examined. Previously, such techniques have been applied to the formal analysis of concurrent systems. This thesis adapts these techniques for use in the modelling, design and formal analysis of hard real-time systems. The techniques are applied to the problem of specifying a controller for a high-speed manufacturing system. It is shown that they can be used to prove liveness and safety properties, including qualitative aspects of system performance. The problem of verifying quantitative real-time properties is addressed by developing a further technique which combines the formalisms of timed Petri nets and real-time temporal logic. A unifying feature of these techniques is the common temporal description of the Petri net. A common problem with Petri net based techniques is the complexity associated with generating the reachability graph. This thesis addresses this problem by using concurrency sets to generate a partial reachability graph pertaining to a particular state. These sets also allow each state to be checked for the presence of inconsistencies and hazards. The problem of designing a controller for the high-speed manufacturing system is also considered. The approach adopted involves the use of a model-based controller: this type of controller uses the Petri net models developed, thus preserving the properties already proven of the controller. It also contains a model of the physical system which is synchronised to the real application to provide timely responses. The various ways of forming the synchronisation between these processes are considered and the resulting nets are analysed using concurrency sets.
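The thesis controls state explosion with concurrency sets and partial reachability graphs; the sketch below is only a generic illustration of the underlying reachability construction for a small place/transition net, enumerating reachable markings by breadth-first search (not the thesis's partial-graph or timed analysis).

```python
from collections import deque

def reachable_markings(initial, transitions):
    """Enumerate reachable markings of a place/transition net.

    `initial` is a tuple of token counts per place; each transition is a
    (consume, produce) pair of tuples giving per-place token counts."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        for consume, produce in transitions:
            if all(m[p] >= consume[p] for p in range(len(m))):      # enabled?
                m2 = tuple(m[p] - consume[p] + produce[p] for p in range(len(m)))
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return seen

# Toy net: two places, one transition moving a token from place 0 to place 1.
print(reachable_markings((1, 0), [((1, 0), (0, 1))]))
```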
Abstract:
Sensorimotor synchronization is hypothesized to arise through two different processes, associated with continuous or discontinuous rhythmic movements. This study investigated synchronization of continuous and discontinuous movements to different pacing signals (auditory or visual) and pacing intervals (500, 650, 800, 950 ms), and across effectors (dominant vs. non-dominant hand). The results showed that the mean and variability of asynchronization errors were consistently smaller for discontinuous movements than for continuous movements. Furthermore, both movement types were timed more accurately with auditory pacing than with visual pacing, and were more accurate with the dominant hand. Shortening the pacing interval also improved sensorimotor synchronization accuracy in both continuous and discontinuous movements. These results show the dependency of the temporal control of movements on the nature of the motor task, the type and rate of extrinsic sensory information, as well as the efficiency of the motor actuators for sensory integration.
Abstract:
Bromocriptine is an ergot alkaloid dopamine D2 receptor agonist that has been used extensively in the past to treat hyperprolactinaemia, galactorrhoea and Parkinsonism. It is known that hypothalamic hypodopaminergic states and disturbed circadian rhythm are associated with the development of insulin resistance, obesity and diabetes in animals and humans. When administered in the early morning at the start of the light phase, a new quick-release (QR) formulation of bromocriptine appears to act centrally to reset circadian rhythms of hypothalamic dopamine and serotonin and to improve insulin resistance and other metabolic abnormalities. Phase II and III clinical studies show that QR-bromocriptine lowers glycated haemoglobin by 0.6-1.2% (7-13 mmol/mol) either as monotherapy or in combination with other antidiabetes medications. Apart from nausea, the drug is well tolerated. The doses used to treat diabetes (up to 4.8 mg daily) are much lower than those used to treat Parkinson's disease and have not been associated with retroperitoneal fibrosis or heart valve abnormalities. QR-bromocriptine (Cycloset™) has recently been approved in the USA for the treatment of type 2 diabetes mellitus (T2DM). Thus, a QR formulation of bromocriptine timed for peak delivery in the early morning may provide a novel neurally mediated approach to the control of hyperglycaemia in T2DM. © 2010 Blackwell Publishing Ltd.
Abstract:
We analyze detailed monthly data on U.S. open market stock repurchases (OMRs) that recently became available following stricter disclosure requirements. We find evidence that OMRs are timed to benefit non-selling shareholders. We present evidence that the profits to companies from timing repurchases are significantly related to ownership structure. Institutional ownership reduces companies' opportunities to repurchase stock at bargain prices. At low levels, insider ownership increases timing profits and at high levels it reduces them. Stock liquidity increases profits from timing OMRs.
Abstract:
Purpose – The purpose of this paper is to investigate the "last mile" delivery link between a hub-and-spoke distribution system and its customers. The proportion of retail, as opposed to non-retail (trade), customers using this type of distribution system has been growing in the UK. The paper shows the applicability of simulation to demonstrate changes in overall delivery policy to these customers.
Design/methodology/approach – A case-based research method was chosen with the aim of providing an exemplar of practice and testing the proposition that simulation can be used as a tool to investigate changes in delivery policy.
Findings – The results indicate the potential improvement in delivery performance, specifically in meeting timed delivery performance, that could be made by having separate retail and non-retail delivery runs from the spoke terminal to the customer.
Research limitations/implications – The simulation study does not attempt to generate a vehicle routing schedule but demonstrates the effects of a change on delivery performance when comparing delivery policies.
Practical implications – Scheduling and spreadsheet software are widely used and provide useful assistance in the design of delivery runs and the allocation of staff to those delivery runs. This paper demonstrates to managers the usefulness of investigating the efficacy of current design rules and presents simulation as a suitable tool for this analysis.
Originality/value – A simulation model is used in a novel application to test a change in delivery policy in response to a changing delivery profile of increased retail deliveries.
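The paper's simulation model is not reproduced in the abstract; the sketch below is only a toy illustration of using simulation to compare delivery policies, estimating the share of timed deliveries met under a single combined run versus separate retail and non-retail runs. Stop counts, service times and the delivery window are entirely hypothetical.

```python
import random

def run_completion_times(stops, mean_service_min):
    """Cumulative arrival time at each stop for one delivery run (minutes)."""
    t, times = 0.0, []
    for _ in range(stops):
        t += random.expovariate(1.0 / mean_service_min)  # travel + service time
        times.append(t)
    return times

def on_time_rate(policy_runs, window_min=240, trials=2000):
    """Share of stops served within the timed-delivery window, averaged over trials."""
    hits = total = 0
    for _ in range(trials):
        for stops, service in policy_runs:
            times = run_completion_times(stops, service)
            hits += sum(t <= window_min for t in times)
            total += len(times)
    return hits / total

random.seed(1)
combined = [(30, 12)]                 # one mixed run: 30 stops, 12 min/stop
separate = [(18, 10), (12, 14)]       # retail run + trade run (hypothetical)
print(f"combined: {on_time_rate(combined):.2f}  separate: {on_time_rate(separate):.2f}")
```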
Abstract:
In this paper the key features of a two-layered model for describing the semantics of dynamic web resources are introduced. In the current Semantic Web proposal [Berners-Lee et al., 2001], web resources are classified into static ontologies which describe the semantic network of their inter-relationships [Kalianpur, 2001][Handschuh & Staab, 2002] and complex constraints described by quantified logical formulae [Boley et al., 2001][McGuinnes & van Harmelen, 2004][McGuinnes et al., 2004]; the basic idea is that software agents can use automatic reasoning techniques in order to relate resources and to support sophisticated web applications. On the other hand, web resources are also characterized by their dynamical aspects, which are not adequately addressed by current web models. Resources on the web are dynamic since, in the minimal case, they can appear on or disappear from the web and their content is updated. In addition, resources can traverse different states, which characterize the resource life-cycle, each state corresponding to different possible uses of the resource. Finally, most resources are timed, i.e. the information they provide makes sense only if contextualised with respect to time, and its validity and accuracy are strongly bounded by time. Temporal projection and deduction based on the dynamical and temporal constraints of the resources can be made and exploited by software agents [Hendler, 2001] in order to make predictions about the availability and the state of a resource, to decide when to consult the resource itself, or to deliberately induce a resource state change in order to reach some agent goal, as in the automated planning framework [Fikes & Nilsson, 1971][Bacchus & Kabanza, 1998].
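The two-layered model itself is not specified in the abstract; purely as an illustration of attaching a state and a time-bounded validity interval to a resource description, so that an agent can decide when to consult or re-consult it, consider the sketch below. The class, states and fields are hypothetical, not the paper's model.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class TimedResource:
    """Hypothetical descriptor: static identity plus dynamic, time-bounded state."""
    uri: str
    state: str                 # e.g. "draft", "published", "withdrawn"
    valid_from: datetime
    valid_until: datetime

    def is_valid(self, at: datetime) -> bool:
        return self.valid_from <= at <= self.valid_until

    def needs_refresh(self, at: datetime, margin: timedelta) -> bool:
        """An agent could schedule a new consultation before validity expires."""
        return at >= self.valid_until - margin

now = datetime.now()
r = TimedResource("http://example.org/timetable", "published",
                  now, now + timedelta(hours=6))
print(r.is_valid(now), r.needs_refresh(now, timedelta(hours=1)))
```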
Abstract:
Real-time systems are usually modelled with timed automata, and real-time requirements relating to the state durations of the system are often specifiable using Linear Duration Invariants, a decidable subclass of Duration Calculus formulas. Various algorithms have been developed to check timed automata or real-time automata for linear duration invariants, but each needs complicated preprocessing and exponential calculation. To the best of our knowledge, these algorithms have not been implemented. In this paper, we present an approximate model checking technique based on a genetic algorithm to check real-time automata for linear duration invariants in reasonable time. A genetic algorithm is a good optimization method when a problem requires massive computation, and it works particularly well in our case because the fitness function, which is derived from the linear duration invariant, is linear. ACM Computing Classification System (1998): D.2.4, C.3.
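The abstract does not describe how automaton runs are encoded; the sketch below is only a generic illustration of genetic search with a linear fitness function, here maximizing a linear combination of state durations within hypothetical bounds to look for a violation of a linear duration invariant. The coefficients, bounds and GA parameters are invented for illustration.

```python
import random

# A linear duration invariant has the form  sum_i c[i] * d[i] <= bound,
# where d[i] is the time spent in state i.  The genetic search below looks
# for durations (within hypothetical per-state ranges) that violate it.
C = [3.0, -1.0, 2.0]                    # coefficients (illustrative)
BOUNDS = [(0.0, 10.0)] * 3              # admissible duration ranges (illustrative)
INVARIANT_BOUND = 25.0

def fitness(d):                         # linear in d, as the abstract notes
    return sum(c * x for c, x in zip(C, d))

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def mutate(d, sigma=0.5):
    return [min(hi, max(lo, x + random.gauss(0, sigma)))
            for x, (lo, hi) in zip(d, BOUNDS)]

def genetic_search(pop_size=40, generations=100):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                  # truncation selection
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(pop, key=fitness)

best = genetic_search()
print("max value found:", round(fitness(best), 2),
      "violates invariant?", fitness(best) > INVARIANT_BOUND)
```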
Abstract:
As researchers and practitioners move towards a vision of software systems that configure, optimize, protect, and heal themselves, they must also consider the implications of such self-management activities on software reliability. Autonomic computing (AC) describes a new generation of software systems that are characterized by dynamically adaptive self-management features. During dynamic adaptation, autonomic systems modify their own structure and/or behavior in response to environmental changes. Adaptation can result in new system configurations and capabilities, which need to be validated at runtime to prevent costly system failures. However, although the pioneers of AC recognize that validating autonomic systems is critical to the success of the paradigm, the architectural blueprint for AC does not provide a workflow or supporting design models for runtime testing. This dissertation presents a novel approach for seamlessly integrating runtime testing into autonomic software. The approach introduces an implicit self-test feature into autonomic software by tailoring the existing self-management infrastructure to runtime testing. Autonomic self-testing facilitates activities such as test execution, code coverage analysis, timed test performance, and post-test evaluation. In addition, the approach is supported by automated testing tools and a detailed design methodology. A case study that incorporates self-testing into three autonomic applications is also presented. The findings of the study reveal that autonomic self-testing provides a flexible approach for building safe, reliable autonomic software, while limiting the development and performance overhead through software reuse.
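The dissertation's design models and tools are not described in the abstract; as a loose illustration of placing a self-test step inside a self-management loop so that a new configuration is validated at runtime before it is adopted, consider the sketch below. The monitor/adapt/self_test functions and the acceptance check are hypothetical.

```python
import random

def monitor():
    """Stand-in sensor: report a hypothetical environment reading."""
    return {"load": random.random()}

def adapt(config, reading):
    """Toy self-optimization: change the worker count with load."""
    return {**config, "workers": 8 if reading["load"] > 0.5 else 2}

def self_test(config):
    """Runtime validation of the new configuration before it goes live.
    Here only a sanity check; real self-tests would execute test cases and
    collect coverage and timing data, as the dissertation describes."""
    return 1 <= config["workers"] <= 16

config = {"workers": 2}
for _ in range(3):                       # a few iterations of the control loop
    new_config = adapt(config, monitor())
    if self_test(new_config):            # adopt only validated configurations
        config = new_config
print(config)
```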
Abstract:
The design of real-time embedded systems requires precise control of the passage of time in the computation performed by the modules and in the communication between them. Generally, these systems consist of several modules, each designed for a specific task, with restricted communication with the other modules in order to obtain the required timing. This strategy, called federated architecture, is becoming unviable in the face of current demands on the cost, performance and quality of embedded systems. To address this problem, the use of integrated architectures has been proposed, consisting of one or a few circuits performing multiple tasks in parallel in a more efficient manner and at reduced cost. However, one has to ensure that the integrated architecture has temporal composability, i.e. the ability to design each task temporally isolated from the others in order to maintain the individual characteristics of each task. Precision Timed Machines are an integrated architecture approach that makes use of multithreaded processors to ensure temporal composability. This work presents the implementation of a Precision Timed Machine named Hivek-RT. This processor, a VLIW supporting Simultaneous Multithreading, is capable of executing real-time tasks efficiently when compared to a traditional processor. In addition to the efficient implementation, the proposed architecture facilitates the implementation of real-time tasks from a programming point of view.
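The abstract does not detail Hivek-RT's pipeline; the sketch below is only a schematic illustration of why issuing instructions in a fixed round-robin over hardware-thread slots yields temporal composability: each thread's completion cycle depends only on its own instruction count and slot, never on the other threads' workloads. Slot count and workloads are hypothetical.

```python
def interleaved_schedule(thread_lengths, n_slots=4):
    """Issue instructions in a fixed round-robin over hardware-thread slots.

    Returns, per thread, the cycle at which its last instruction issues."""
    remaining = list(thread_lengths)
    finish = [None] * len(remaining)
    cycle = 0
    while any(r > 0 for r in remaining):
        slot = cycle % n_slots                 # fixed rotation over slots
        if slot < len(remaining) and remaining[slot] > 0:
            remaining[slot] -= 1
            if remaining[slot] == 0:
                finish[slot] = cycle
        cycle += 1
    return finish

print(interleaved_schedule([10, 3, 7, 10]))    # finish cycles per thread
print(interleaved_schedule([10, 100, 7, 10]))  # threads 0, 2, 3 finish unchanged
```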
Abstract:
Introduction: Gait after stroke is characterized by a significant asymmetry between the lower limbs, with predominant use of the non-paretic lower limb (NPLL) over the paretic lower limb. Accordingly, it has been suggested that adding load/weight to the NPLL, as a form of restricting the movement of this limb, may favor the use of the paretic limb and reduce interlimb asymmetry. However, few studies have been conducted to date, and these only investigated the immediate effects of this practice.
Objectives: 1) To investigate whether adding load to the NPLL during treadmill training influences cardiovascular parameters and gait performance of individuals with stroke, compared to treadmill training without load addition; 2) to analyze the effects of treadmill training with and without load added to the NPLL on kinematic parameters of each lower limb during gait; 3) to analyze the effects of treadmill training with and without load added to the NPLL on measurements of functional mobility and postural balance of these patients.
Materials and Methods: This is a randomized single-blinded clinical trial involving 38 subjects, with a mean age of 56.5 years, in the subacute post-stroke phase (mean time since stroke of 4.5 months). Participants were randomly assigned to an experimental group (EG) or a control group (CG). The EG (n=19) underwent gait training on a treadmill with load added to the NPLL by ankle weights equivalent to 5% of body weight. The CG (n=19) underwent gait training on a treadmill only. Behavioral strategies, which included home exercises, were also applied to both groups. The interventions occurred daily for two consecutive weeks (Day 1 to Day 9), each lasting 30 minutes.
Outcome measures: Postural balance (Berg Functional Balance Scale – BBS), functional mobility (Timed Up and Go – TUG; kinematic variables of 180° turning) and kinematic gait variables were assessed at baseline (Day 0), after four training sessions (Day 4), after nine training sessions (Day 9), and 40 days after completion of training (follow-up). Cardiovascular parameters (mean arterial pressure and heart rate) were evaluated at four moments within each training session. Analysis of variance (ANOVA) was used to compare outcomes between the EG and CG over the course of the study (Day 0, Day 4, Day 9 and follow-up). Unpaired t-tests allowed for intergroup comparison at each training session. A 5% significance level was used for all tests.
Results: 1) Cardiovascular parameters (systemic arterial pressure, heart rate and derived variables) did not change after the interventions and there were no differences between groups within each training session. There was an improvement in gait performance, with increased speed and distance covered, with no statistically significant difference between groups. 2) After the interventions, patients had increased paretic and non-paretic step lengths, in addition to exhibiting greater hip and knee joint excursion in both lower limbs. The gains were observed in both the EG and CG, with no statistical difference between the groups, and were (mostly) maintained at follow-up. 3) After the interventions, patients showed better postural balance (higher scores on the BBS) and functional mobility (reduced time on the TUG test and better performance in the 180° turning). All gains were observed in both the EG and CG, with no statistically significant difference between groups, and were maintained at follow-up.
Conclusions: The addition of load to the NPLL did not affect cardiovascular parameters in patients with subacute stroke, similarly to treadmill training without load, and thus it appears to be a safe form of training for these patients. However, the use of the load did not bring any additional benefits to gait training. The gait training program (nine training sessions on a treadmill plus strategies and exercises for paretic limb stimulation) was useful for improving gait performance and kinematics, functional mobility and postural balance, and its use is suggested to promote the optimization of these outcomes in the subacute phase after stroke.
Abstract:
The objective was to evaluate the effect of lactation order, racial composition and milk production on the body condition score (BCS) at prepartum and its variation at postpartum, and furthermore to evaluate the effect of the prepartum BCS and its postpartum variation on the reproductive performance of dairy cows. Data were collected on 470 parturitions over two years at three farms in Gurinhatã-MG. Milk production was measured monthly and the BCS was evaluated by a single individual at prepartum and postpartum (scale of 1.0 to 5.0). Conventional artificial insemination, timed artificial insemination and controlled natural mating were used. Pregnancy diagnosis was performed by rectal palpation from 40 days after the service. The variables were analyzed using the SAS GLIMMIX procedure. Racial composition affected the BCS at prepartum (P=0.0003). Milk production tended to affect the BCS at prepartum (P=0.0957) and its variation at postpartum (P=0.1179). The overall conception rate was 57.3% and was affected (P<0.0001) by the type of service. There was no effect of prepartum BCS (P=0.1544) or of BCS variation (P=0.3127) on conception rate. There was no interaction of prepartum BCS (P=0.9516) or of BCS variation (P=0.9506) with the type of service on conception rate. The BCS at prepartum affected the service period (P<0.0001): cows with BCS lower than 3.25 became pregnant earlier. The variation of BCS also affected the service period (P<0.0001): cows that lost BCS became pregnant earlier than cows without loss. The average BCS loss at postpartum was -0.692 points, not enough to impair the reproductive performance of dairy cows.
Abstract:
This project proposes two possible solutions for the phasing plan of the intersection near the City Hall of Leiria and presents the calculations of the cycle length and the intersection delay for both of them. The main goal of these solutions is to optimize the global functioning of the intersection. Since the number of cars using an intersection fluctuates over time, adjustments to the settings of pre-timed traffic signals are needed to ensure that the current traffic flows are accommodated at the intersection under acceptable conditions for drivers.
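The abstract does not state which cycle-length and delay formulas the project applies; one common choice for pre-timed signals, shown here purely as an illustration, is Webster's optimum cycle length together with the uniform-delay term for a lane group, where L is the total lost time per cycle (s), y_i = q_i/s_i are the critical flow ratios, lambda = g/C is the effective green ratio and X the degree of saturation.

```latex
% Webster's optimum cycle length for a pre-timed signal (illustrative):
\[
  C_o = \frac{1.5\,L + 5}{1 - Y}, \qquad Y = \sum_i y_i = \sum_i \frac{q_i}{s_i}
\]
% Uniform-delay term per vehicle for a lane group with green ratio
% \lambda = g/C and degree of saturation X:
\[
  d_1 = \frac{C\,(1-\lambda)^2}{2\,\bigl(1 - \lambda X\bigr)}
\]
```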
Abstract:
The current study builds upon a previous study, which examined the degree to which the lexical properties of students' essays could predict their vocabulary scores. We expand on this previous research by incorporating new natural language processing indices related to both the surface and discourse levels of students' essays. Additionally, we investigate the degree to which these NLP indices can be used to account for variance in students' reading comprehension skills. We calculated linguistic essay features using our framework, ReaderBench, an automated text analysis tool that calculates indices related to linguistic and rhetorical features of text. University students (n = 108) produced timed (25-minute) argumentative essays, which were then analyzed by ReaderBench. Additionally, they completed the Gates-MacGinitie Vocabulary and Reading comprehension tests. The results of this study indicated that two indices were able to account for 32.4% of the variance in vocabulary scores and 31.6% of the variance in reading comprehension scores. Follow-up analyses revealed that these models further improved when only considering essays that contained multiple paragraphs (R2 values = .61 and .49, respectively). Overall, the results of the current study suggest that natural language processing techniques can help to inform models of individual differences among student writers.
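The study's regression models are not specified in the abstract beyond the variance explained; as a minimal sketch of the general approach, assuming scikit-learn and synthetic data in place of the ReaderBench indices and the actual test scores, the code below fits a linear regression of two essay-level indices on a score and reports R².

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical essay-level indices (stand-ins for ReaderBench features):
n_essays = 108
X = rng.normal(size=(n_essays, 2))
# Hypothetical scores loosely driven by the two indices plus noise.
y = 0.6 * X[:, 0] + 0.4 * X[:, 1] + rng.normal(scale=0.8, size=n_essays)

model = LinearRegression().fit(X, y)
print("R^2 on the training data:", round(model.score(X, y), 3))
```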