Abstract:
Open access reforms to railway regulations allow multiple train operators to provide rail services on a common infrastructure. As railway operations are now independently managed by different stakeholders, conflicts in operations may arise, and there have been attempts to derive an effective access charge regime so that these conflicts may be resolved. One approach is direct negotiation between the infrastructure manager and the train service providers. Despite the substantial literature on the topic, few studies consider the benefits of employing computer simulation as an evaluation tool for railway operational activities such as access pricing. This article proposes a multi-agent system (MAS) framework for the railway open market and demonstrates its feasibility by modelling the negotiation between an infrastructure provider and a train service operator. Empirical results show that the model is capable of resolving operational conflicts according to market demand.
Abstract:
The MPEG-21 Multimedia Framework provides for controlled distribution of multimedia works through its Intellectual Property Management and Protection ("IPMP") Components and Rights Expression Language ("MPEG REL"). The IPMP Components provide a framework by which the components of an MPEG-21 digital item can be protected from undesired access, while MPEG REL provides a mechanism for describing the conditions under which a component of a digital item may be used and distributed. This chapter describes how the IPMP Components and MPEG REL were used to implement a series of digital rights management applications at the Cooperative Research Centre for Smart Internet Technology in Australia. While the IPMP Components and MPEG REL were initially designed to facilitate the protection of copyright, the applications also show how the technology can be adapted to the protection of private personal information and sensitive corporate information.
Abstract:
This paper addresses the problem of constructing consolidated business process models out of collections of process models that share common fragments. The paper considers the construction of unions of multiple models (called merged models) as well as intersections (called digests). Merged models are intended for analysts who wish to create a model that subsumes a collection of process models - typically representing variants of the same underlying process - with the aim of replacing the variants with the merged model. Digests, on the other hand, are intended for analysts who wish to identify the most recurring fragments across a collection of process models, so that they can focus their efforts on optimizing these fragments. The paper presents an algorithm for computing merged models and an algorithm for extracting digests from a merged model. The merging and digest extraction algorithms have been implemented and tested against collections of process models taken from multiple application domains. The tests show that the merging algorithm produces compact models and scales up to process models containing hundreds of nodes. Furthermore, a case study conducted in a large insurance company has demonstrated the usefulness of the merging and digest extraction operators in a practical setting.
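A minimal sketch of the union/intersection intuition behind merged models and digests, with process model variants represented simply as sets of labelled edges (this is an illustration only; the paper's actual algorithms also handle connectors, provenance annotations and configurability):

```python
# Hedged sketch: each process model variant is a set of labelled edges
# (source activity, target activity). Names and structure are illustrative.

def merge_models(variants):
    """Merged model: the union of the fragments found in all variants."""
    merged = set()
    for edges in variants:
        merged |= edges
    return merged

def extract_digest(variants):
    """Digest: the fragments shared by every variant (intersection)."""
    digest = set(variants[0])
    for edges in variants[1:]:
        digest &= edges
    return digest

variant_a = {("Receive claim", "Check policy"), ("Check policy", "Assess damage")}
variant_b = {("Receive claim", "Check policy"), ("Check policy", "Reject claim")}

print(merge_models([variant_a, variant_b]))
print(extract_digest([variant_a, variant_b]))
```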
Abstract:
Survival probability prediction using a covariate-based hazard approach is an established statistical methodology in engineering asset health management. We have previously reported the semi-parametric Explicit Hazard Model (EHM), which incorporates three types of information for hazard prediction: population characteristics, condition indicators, and operating environment indicators. This model assumes that the baseline hazard has the form of the Weibull distribution. To avoid this assumption, this paper presents the non-parametric EHM, which is a distribution-free covariate-based hazard model. An application of the non-parametric EHM is demonstrated via a case study in which the survival probabilities of a set of resistance elements predicted by the non-parametric EHM are compared with those of the Weibull proportional hazard model and the traditional Weibull model. The results show that the non-parametric EHM can effectively predict asset life using the condition indicator, operating environment indicator, and failure history.
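For context, a minimal sketch of the Weibull proportional hazard model used as one of the benchmarks, with illustrative parameter values (the non-parametric EHM itself estimates the baseline hazard from data rather than assuming a Weibull form):

```python
import math

def weibull_phm_survival(t, shape, scale, betas, covariates):
    """Survival S(t) = exp(-H0(t) * exp(beta . z)) with a Weibull cumulative
    baseline hazard H0(t) = (t / scale) ** shape. Values are illustrative."""
    cumulative_baseline = (t / scale) ** shape
    risk = math.exp(sum(b * z for b, z in zip(betas, covariates)))
    return math.exp(-cumulative_baseline * risk)

# Example: one condition indicator and one operating-environment indicator
print(weibull_phm_survival(t=500.0, shape=2.1, scale=1200.0,
                           betas=[0.8, 0.3], covariates=[0.4, 1.0]))
```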
Abstract:
Robustness of the track allocation problem is rarely addressed in the literature, and the track allocation schemes (TAS) obtained often contain bottlenecks. An approach to detecting these bottlenecks is therefore needed to support local optimization. First, a TAS is transformed into an executable Petri net model. Disturbance analysis is then performed on the model: each train is subjected to a disturbance in turn, and the total departure delay of all trains is collected as the bottleneck indicator. Finally, tests based on a rail hub linking six lines and a TAS spanning about thirty minutes show that the minimum buffer time is 21 seconds, while the two detected bottlenecks have buffer times of 57 and 44 seconds respectively, indicating that bottlenecks are not necessarily located where the buffer time is smallest. The proposed approach can further support the comparison of multiple schemes and robustness optimization.
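A hedged sketch of the bottleneck-detection idea: perturb each train in turn, propagate the delay through the scheduled buffer times, and record the total departure delay as the indicator (the paper does this on an executable Petri net model; the simple chain propagation below only illustrates the indicator, with made-up buffer values):

```python
def total_delay(buffers, disturbed_train, disturbance):
    """Propagate a primary disturbance along a sequence of trains separated
    by buffer times (seconds) and return the sum of departure delays."""
    delay, total = disturbance, disturbance
    for buffer in buffers[disturbed_train:]:
        delay = max(0.0, delay - buffer)   # buffer absorbs part of the delay
        total += delay
    return total

buffers = [57.0, 21.0, 44.0, 90.0]         # illustrative buffer times (s)
for train in range(len(buffers) + 1):
    print("train", train, "-> total delay", total_delay(buffers, train, 120.0))
```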
Abstract:
As the use of renewable energy sources (RESs) increases worldwide, there is rising interest in their impacts on power system operation and control. An overview of the key issues and new challenges in frequency regulation arising from the integration of renewable energy units into power systems is presented. Following a brief survey of the existing challenges and recent developments, the impact of the power fluctuation produced by variable renewable sources (such as wind and solar units) on system frequency performance is also presented. An updated LFC model is introduced, and the power system frequency response in the presence of RESs, together with the associated issues, is analysed. The need to revise frequency performance standards is emphasised. Finally, non-linear time-domain simulations on the standard 39-bus and 24-bus test systems show that the simulated results agree with those predicted analytically.
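A minimal single-area sketch of how a renewable power fluctuation maps to a frequency deviation through the swing equation with primary (droop) control; the inertia, damping, droop and governor values are illustrative and are not taken from the 39-bus or 24-bus studies:

```python
# Single-area frequency response: 2H d(df)/dt = dPm + dP_res - D * df,
# with primary control dPm -> -df / R through a first-order governor.
H, D, R, Tg = 5.0, 1.0, 0.05, 0.5   # inertia (s), damping, droop, governor (s)
dt, steps = 0.01, 2000              # 20 s of simulation, all values in pu
df, dPm = 0.0, 0.0
dP_res = -0.1                       # sudden 0.1 pu drop in renewable output

for _ in range(steps):
    dPm += dt * ((-df / R - dPm) / Tg)            # governor response
    df += dt * ((dPm + dP_res - D * df) / (2 * H))  # swing equation
print(f"quasi-steady-state frequency deviation: {df:.4f} pu")
```

At steady state the deviation settles near dP_res / (1/R + D), about -0.0048 pu for these illustrative numbers.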
Abstract:
This paper presents a continuous isotropic spherical omnidirectional drive mechanism that is mechanically simple and makes efficient use of volume. Spherical omnidirectional mechanisms allow isotropic motion, although practical mechanical design considerations prevent many of them from achieving true isotropic motion. The mechanism presented in this paper uses a single motor to drive a point on the great circle of the sphere parallel to the ground plane, and does not require a gearbox. Three mechanisms located 120 degrees apart provide a stable drive platform for a mobile robot. Results show the omnidirectional ability of the robot and demonstrate the performance of the spherical mechanism compared to a popular commercial omnidirectional wheel over edges of varying heights and gaps of varying widths.
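A hedged sketch of the inverse kinematics for a platform with three drive units spaced 120 degrees apart, showing how a desired planar body velocity maps to individual drive speeds (this is a generic omnidirectional-platform formulation, not the paper's specific spherical mechanism, and the radius value is illustrative):

```python
import math

def drive_speeds(vx, vy, omega, radius=0.15, angles_deg=(0.0, 120.0, 240.0)):
    """Map a body velocity (vx, vy, omega) to the tangential speed each of
    three drive units must produce; radius is the distance from the platform
    centre to each unit."""
    speeds = []
    for a in angles_deg:
        th = math.radians(a)
        # each unit drives tangentially to its mounting angle
        speeds.append(-math.sin(th) * vx + math.cos(th) * vy + radius * omega)
    return speeds

print(drive_speeds(vx=0.5, vy=0.0, omega=0.0))  # pure translation along x
```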
Abstract:
Keizer, Lindenberg and Steg (2008) conduct six interesting field experiments and report that their results provide evidence for the broken windows theory. Such an analysis is highly relevant, as the (broken windows) theory is both controversial and lacking in empirical support. Keizer et al.'s key aim was to conceptualize a disorderly setting in such a way that it is linked to a process of spreading norm violation. The strength of the study is the exploration of cross-norm inhibition effects in a controlled field experimental environment. Their results show that if norm-violating behavior becomes more common, it negatively affects compliance in other areas. Nevertheless, this comment paper discusses several shortcomings and limitations and provides new empirical evidence that deals with these problems.
Abstract:
Fibre Bragg Grating (FBG) sensors have been installed along an existing line for the purposes of train detection and weight measurement. The results show fair accuracy and high resolution in measuring the vertical force acting on the track as the train wheels roll over. Since the sensors are already in place and data are available, further applications beyond train detection are explored. This study presents an analysis of the unique signatures in the collected data to characterise wheel-rail interaction for rail defect detection. The focus of this first stage of work is on the repeatability of signals from the same wheel-rail interactions while the rail is in a healthy state. Discussions of the preliminary results, the feasibility of this condition monitoring application, and the technical issues to be addressed in practice are given.
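One simple way to quantify the repeatability referred to above is to correlate the sensor signature of one wheel passage against a reference passage; the normalised cross-correlation sketch below is an illustrative assumption on synthetic data, not the analysis method used in the study:

```python
import numpy as np

def repeatability(reference, candidate):
    """Peak normalised cross-correlation between two wheel-passage
    signatures (1.0 = identical shape). Purely illustrative."""
    ref = (reference - reference.mean()) / (reference.std() * len(reference))
    cand = (candidate - candidate.mean()) / candidate.std()
    return float(np.max(np.correlate(ref, cand, mode="full")))

t = np.linspace(0, 1, 500)
pass_1 = np.exp(-((t - 0.50) ** 2) / 0.002)          # synthetic strain peak
pass_2 = np.exp(-((t - 0.52) ** 2) / 0.002) * 0.95   # slightly shifted repeat
print(repeatability(pass_1, pass_2))
```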
Abstract:
A schedule coordination problem involving two train services provided by different operators is modeled as an optimization of revenue intake. Coordination is achieved by negotiating adjustments to the commencement times of the train services. The problem is subject to constraints regarding passenger demand and the rolling-stock idle costs of both operators. This paper models the operators as software agents, each with the flexibility to adopt one of the two (and potentially more) proposed negotiation strategies. Empirical results show that the combination of strategies employed by the agents has a significant impact on solution quality and negotiation time.
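A hedged sketch of the alternating-offer coordination idea: each operator proposes a commencement time, scores offers with its own utility (for example, revenue minus idle cost), and the responder accepts once a proposal meets its gradually decaying aspiration level. The utility functions and concession rule are illustrative, not the strategies proposed in the paper:

```python
def negotiate(utility_a, utility_b, offers, concession=0.05, rounds=50):
    """Operators alternate proposals from a shared set of candidate
    commencement times; the responder accepts when the proposal meets its
    current aspiration level, which decays after each rejection."""
    aspirations = [1.0, 1.0]
    utilities = [utility_a, utility_b]
    for r in range(rounds):
        proposer, responder = r % 2, (r + 1) % 2
        offer = max(offers, key=utilities[proposer])   # proposer's best time
        if utilities[responder](offer) >= aspirations[responder]:
            return offer, r
        aspirations[responder] -= concession
    return None, rounds

# Illustrative utilities over candidate commencement times (minutes past hour)
offers = list(range(0, 60, 5))
u_a = lambda t: 1.0 - abs(t - 20) / 60.0   # operator A prefers about :20
u_b = lambda t: 1.0 - abs(t - 40) / 60.0   # operator B prefers about :40
print(negotiate(u_a, u_b, offers))
```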
Abstract:
The thermal decomposition of the halloysite-potassium acetate intercalation compound was investigated by thermogravimetric analysis and infrared emission spectroscopy. The X-ray diffraction patterns indicated that intercalation of potassium acetate into halloysite increased the basal spacing from 1.00 to 1.41 nm. The thermogravimetry results show that the mass loss of the intercalation compound occurs in three main steps, which correspond to (a) the loss of adsorbed water, (b) the loss of coordination water and (c) the loss of potassium acetate and dehydroxylation. The temperature of dehydroxylation and dehydration of halloysite is decreased by about 100 °C. The infrared emission spectra clearly show the decomposition and dehydroxylation of the halloysite intercalation compound as the temperature is raised. Dehydration of the intercalation compound is followed by the loss of intensity of the stretching vibration bands in the 3600-3200 cm-1 region. Dehydroxylation is followed by the decrease in intensity of the bands between 3695 and 3620 cm-1. Dehydration was completed by 300 °C and partial dehydroxylation by 350 °C. The inner hydroxyl group remained until around 500 °C.
Abstract:
With the recent regulatory reforms in a number of countries, railway resources are no longer managed by a single party but are distributed among different stakeholders. To facilitate the operation of train services, a train service provider (SP) has to negotiate with the infrastructure provider (IP) for a train schedule and the associated track access charge. This paper models the SP and IP as software agents and the negotiation as a prioritized fuzzy constraint satisfaction (PFCS) problem. Computer simulations have been conducted to demonstrate the effects on the train schedule when the SP has different optimization criteria. The results show that, by assigning different priorities to the fuzzy constraints, agents can represent SPs with different operational objectives.
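A hedged sketch of prioritized fuzzy constraint evaluation: each fuzzy constraint returns a satisfaction degree in [0, 1], higher-priority constraints are allowed to pull the aggregate down further, and the agent picks the candidate schedule with the best aggregate. The relaxation rule max(1 - priority, degree) with a min aggregate is one common PFCS formulation and is used here only for illustration; the constraints themselves are made up:

```python
def pfcs_score(schedule, constraints):
    """constraints: list of (priority in [0, 1], satisfaction_fn).
    A low-priority constraint can be violated with little penalty because
    max(1 - priority, degree) stays high; the aggregate is the minimum."""
    return min(max(1.0 - priority, degree_fn(schedule))
               for priority, degree_fn in constraints)

# Illustrative SP constraints over a departure time t (minutes past the hour)
constraints = [
    (1.0, lambda t: max(0.0, 1.0 - abs(t - 30) / 30.0)),  # preferred slot
    (0.4, lambda t: 1.0 if t % 10 == 0 else 0.5),          # clock-face pattern
]
candidates = range(0, 60, 5)
best = max(candidates, key=lambda t: pfcs_score(t, constraints))
print(best, pfcs_score(best, constraints))
```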
Abstract:
This paper argues, somewhat along a Simmelian line, that political theory may produce practical and universal theories like those developed in theoretical physics. The reasoning behind this paper is to show that the theory of basic democracy may be true by comparing it to Einstein's Special Relativity, specifically concerning the parameters of symmetry, unification, simplicity, and utility. These parameters are what validate a theory in physics, as meeting them not only fits with current knowledge but also produces paths towards testing (application). As the theory of basic democracy may meet these same parameters, it could settle the debate concerning the definition of democracy. This will be argued firstly by discussing what the theory of basic democracy is and why it differs from previous work; secondly by explaining the parameters chosen (why these and not others confirm or scuttle theories); and thirdly by comparing how Special Relativity and the theory of basic democracy may match the parameters.
Abstract:
Purpose: To compare subjective blur limits for cylinder and defocus. ---------- Method: Blur was induced with a deformable, adaptive-optics mirror when either the subjects' own astigmatisms were corrected or when both astigmatisms and higher-order aberrations were corrected. Subjects were cyclopleged and had 5 mm artificial pupils. Black letter targets (0.1, 0.35 and 0.6 logMAR) were presented on white backgrounds. ---------- Results: For ten subjects, blur limits were approximately 50% greater for cylinder than for defocus (in diopters). While there were considerable effects of axis for individuals, overall the axis effect was not strong, with the 0° (or 180°) axis having about 20% greater limits than oblique axes. In a second experiment with text (equivalent in angle to N10 print at a 40 cm distance), cylinder blur limits for 6 subjects were approximately 30% greater than those for defocus; this percentage was slightly smaller than that found for the three letter sizes. Blur limits for the text were intermediate between those of the 0.35 logMAR and 0.6 logMAR letters. Extensive blur limit measurements for one subject with single letters did not show the expected interactions between target detail orientation and cylinder axis. ---------- Conclusion: Subjective blur limits for cylinder are 30%-50% greater than those for defocus, with the overall influence of cylinder axis being about 20%.
Abstract:
Purpose: Flickering stimuli increase the metabolic demand of the retina, making them a sensitive perimetric stimulus for the early onset of retinal disease. We determine whether flickering stimuli are a sensitive indicator of vision deficits resulting from acute, mild systemic hypoxia when compared to standard static perimetry. Methods: Static and flicker visual perimetry were performed in 14 healthy young participants while breathing 12% oxygen (hypoxia) under photopic illumination. The hypoxia visual field data were compared with the field data measured during normoxia. Absolute sensitivities (in dB) were analysed in seven concentric rings at eccentricities of 1°, 3°, 6°, 10°, 15°, 22° and 30°, and the mean defect (MD) and pattern defect (PD) were calculated. Preliminary data are reported for mesopic light levels. Results: Under photopic illumination, flicker and static visual field sensitivities at all eccentricities were not significantly different between the hypoxia and normoxia conditions. The mean defect and pattern defect were not significantly different for either test between the two oxygenation conditions. Conclusion: Although flicker stimulation increases cellular metabolism, flicker photopic visual field impairment is not detected during mild hypoxia. These findings contrast with electrophysiological flicker tests in young participants that show impairment under photopic illumination during the same levels of mild hypoxia. Potential mechanisms contributing to the difference between the visual field and electrophysiological flicker tests, including variability in perimetric data, neuronal adaptation and vascular autoregulation, are considered. The data have implications for the use of visual perimetry in the detection of ischaemic/hypoxic retinal disorders under photopic and mesopic light levels.