983 results for Real ratio (ratio rei)
Abstract:
A systematic literature review and a comprehensive meta-analysis combining the findings of existing studies were conducted in this thesis to analyse the impact of traffic characteristics on crash occurrence. Sensitivity analyses were conducted to investigate the quality, publication bias, and outlier bias of the various studies, and the time intervals used to measure traffic characteristics were also considered. Based on this comprehensive and systematic review and the results of the subsequent meta-analysis, major issues in study design, traffic and crash data, and model development and evaluation are discussed.
Abstract:
In current bridge management systems (BMSs), load and speed restrictions are applied to unhealthy bridges to keep the structure safe and serviceable for as long as possible. The question, however, is whether applying these restrictions always decreases the internal forces in the critical components of the bridge and thus enhances the safety of unhealthy bridges. To answer this question, this paper, for the first time in the literature, examines the design aspects by studying changes in the demand-to-capacity ratios of the critical components of a bridge under train loads. For this purpose, a structural model of a simply supported bridge, whose dynamic behaviour is similar to that of a group of real railway bridges, is developed. Demand-to-capacity ratios of the critical components of the bridge are calculated to identify their sensitivity to increases in speed and in the magnitude of the live load. The outcomes of this study are significant because they show that, contrary to expectations, applying a speed restriction may increase the demand-to-capacity ratio of some components and make the bridge unsafe for carrying live load. Suggestions are made to address this problem.
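To illustrate why a speed restriction need not reduce a demand-to-capacity ratio, the sketch below uses a single-degree-of-freedom dynamic amplification factor that peaks near a resonant crossing speed. The resonant speed, damping, demand, and capacity values are illustrative assumptions, not the paper's bridge model.

```python
import numpy as np

def demand_to_capacity(static_demand, capacity, speed_kmh,
                       resonant_speed_kmh=80.0, damping=0.10):
    """Demand-to-capacity ratio with a single-degree-of-freedom dynamic
    amplification factor (DAF) that peaks near a resonant crossing speed.
    All numbers here are illustrative assumptions, not the studied bridge."""
    r = speed_kmh / resonant_speed_kmh
    daf = 1.0 / np.sqrt((1.0 - r**2) ** 2 + (2.0 * damping * r) ** 2)
    return static_demand * daf / capacity

for v in (40, 60, 80, 100, 120):
    print(f"{v:3d} km/h -> D/C = {demand_to_capacity(500.0, 900.0, v):.2f}")
```

With these assumed numbers, slowing a crossing from 120 km/h to 80 km/h moves it closer to resonance and pushes the ratio above 1, mirroring the counter-intuitive effect reported above.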
Abstract:
Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguities of phase measurements before they are used for positioning computation. Most existing investigations on ambiguity validation focus on the test statistic. How to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous probability-theory basis, but it requires a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function, is proposed for the FF-difference test. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that, with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold validation method, and it makes the FF approach applicable to real-time GNSS positioning applications.
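As background, the two test statistics compared above can be sketched as follows: both use the squared, covariance-weighted distances from the float ambiguity vector to the best and second-best integer candidates (e.g. from a LAMBDA-style ILS search). The thresholds shown are placeholders; the paper's FF thresholds depend on the ILS success rate and the tolerated failure rate.

```python
import numpy as np

def sq_norm(x, Q_inv):
    """Squared covariance-weighted norm x^T Q^{-1} x."""
    return float(x @ Q_inv @ x)

def validation_statistics(a_float, a_best, a_second, Q):
    """Ratio-test and difference-test statistics for ambiguity validation.
    a_best / a_second are the best and second-best integer candidates."""
    Q_inv = np.linalg.inv(Q)
    r_best = sq_norm(a_float - a_best, Q_inv)
    r_second = sq_norm(a_float - a_second, Q_inv)
    return r_second / r_best, r_second - r_best   # (ratio, difference)

# Illustrative numbers; the thresholds below are placeholders, not FF values.
a_float = np.array([3.2, -1.9, 5.1])
Q = np.diag([0.04, 0.05, 0.03])
ratio, diff = validation_statistics(a_float, np.array([3.0, -2.0, 5.0]),
                                    np.array([4.0, -2.0, 5.0]), Q)
print(f"ratio = {ratio:.1f}, difference = {diff:.1f}")
print("accept fix (ratio test)     :", ratio >= 2.0)   # placeholder threshold
print("accept fix (difference test):", diff >= 10.0)   # placeholder threshold
```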
Abstract:
The Distributed Network Protocol v3.0 (DNP3) is one of the most widely used protocols for controlling national infrastructure. Widely used interactive packet manipulation tools, such as Scapy, have not yet been augmented to parse and create DNP3 frames (Biondi 2014). In this paper we extend Scapy to include DNP3, thus allowing us to perform attacks on DNP3 in real time. Our contribution builds on East et al. (2009), who proposed a range of possible attacks on DNP3. We implement several of these attacks to validate our DNP3 extension to Scapy, and then execute the attacks on real-world equipment. We present our results, showing that many of these theoretical attacks would be unsuccessful in an Ethernet-based network.
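For readers unfamiliar with how Scapy is extended, the sketch below declares a simplified DNP3 data link layer header as a custom Scapy layer and binds it to TCP port 20000, which DNP3 commonly uses. The field layout is a minimal illustration and is not the paper's actual extension; in particular, the CRC is not computed.

```python
from scapy.packet import Packet, bind_layers
from scapy.fields import XShortField, ByteField, LEShortField
from scapy.layers.inet import TCP

class DNP3LinkLayer(Packet):
    """Simplified DNP3 data link layer header (illustrative only)."""
    name = "DNP3"
    fields_desc = [
        XShortField("start", 0x0564),     # fixed start bytes 0x05 0x64
        ByteField("length", 5),           # frame length field
        ByteField("control", 0xC4),       # link control byte
        LEShortField("destination", 1),   # destination address (little-endian)
        LEShortField("source", 2),        # source address (little-endian)
        XShortField("crc", 0),            # header CRC (not computed here)
    ]

# DNP3 is commonly carried over TCP port 20000
bind_layers(TCP, DNP3LinkLayer, dport=20000)
bind_layers(TCP, DNP3LinkLayer, sport=20000)

pkt = DNP3LinkLayer(destination=10, source=3)
pkt.show()
```

Once such a layer is registered, Scapy can dissect captured DNP3 traffic and craft frames with modified fields, which is the capability the attacks above rely on.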
Abstract:
Corner detection has shown its great importance in many computer vision tasks. However, in real-world applications, noise in the image strongly affects the performance of corner detectors. To date, few corner detectors have been designed to be robust to heavy noise, partly because noise can be reduced by a denoising procedure. In this paper, we present a corner detector that can find discriminative corners in images contaminated by noise of different levels, without any denoising procedure. Candidate corners (i.e., features) are first detected by a modified SUSAN approach, and false corners caused by noise are then rejected based on their local characteristics. Features in flat regions are removed based on their intensity centroid, and features on edge structures are removed using the Harris response. The detector is self-adaptive to noise, since the image signal-to-noise ratio (SNR) is automatically estimated to choose an appropriate threshold for refining features. Experimental results show that our detector locates discriminative corners in images with strong noise better than other widely used corner or keypoint detectors.
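The refinement stage described above can be sketched roughly as follows. The flat-region test based on the intensity centroid and the SNR-scaled thresholds are plausible readings of the abstract rather than the paper's exact formulas; the Harris response is the standard one computed from patch gradients.

```python
import numpy as np

def intensity_centroid_offset(patch):
    """Offset between the patch centre and its intensity centroid; a near-zero
    offset suggests a flat, feature-less region (one plausible reading of the
    flat-region test)."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    m00 = patch.sum() + 1e-9
    cx, cy = (patch * xs).sum() / m00, (patch * ys).sum() / m00
    return np.hypot(cx - (w - 1) / 2.0, cy - (h - 1) / 2.0)

def harris_response(patch, k=0.04):
    """Standard Harris corner response from patch gradients; low responses
    indicate edge-like or flat structures to be rejected."""
    gy, gx = np.gradient(patch.astype(float))
    sxx, syy, sxy = (gx * gx).sum(), (gy * gy).sum(), (gx * gy).sum()
    return sxx * syy - sxy ** 2 - k * (sxx + syy) ** 2

def refine_candidates(image, candidates, snr, win=7):
    """Keep candidate corners whose local statistics exceed SNR-scaled
    thresholds (placeholder scaling; the paper's adaptive rule is not given)."""
    half = win // 2
    flat_thr = 0.5 / max(snr, 1e-3)       # placeholder thresholds
    harris_thr = 1e4 / max(snr, 1e-3)
    kept = []
    for (x, y) in candidates:
        patch = image[y - half:y + half + 1, x - half:x + half + 1]
        if patch.shape != (win, win):
            continue
        if intensity_centroid_offset(patch) < flat_thr:
            continue                      # flat region -> reject
        if harris_response(patch) < harris_thr:
            continue                      # edge structure -> reject
        kept.append((x, y))
    return kept
```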
Abstract:
The SBS/Blackfella Films production First Contact – which takes six non-Indigenous people and immerses them in Aboriginal Australia for the first time – captured the nation’s attention this week, amassing a television audience nearing 1 million viewers, while the program’s Twitter hashtag #FirstContactSBS trended worldwide. Over the three episodes, we saw the participants get their “first contact” with Aboriginal Australia as they were welcomed into the homes of Aboriginal people in the city and in the bush...
Abstract:
Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on an empirical stochastic model that may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single-receiver data stream. The methods involve three steps: forming linear combinations, handling the ionospheric and ambiguity biases, and estimating the variance. Two distinct approaches are applied to overcome the ionospheric and ambiguity biases: the time-differenced method and the polynomial prediction method, respectively. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed methods require only single-receiver observations and are thus applicable to both differenced and undifferenced data processing modes. However, the methods may be limited to normal ionospheric conditions and to GNSS receivers with low noise autocorrelation. Experimental results also indicate that the proposed method can yield more realistic parameter precision.
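A simplified reading of the time-differenced idea is sketched below: epoch differencing a single-receiver series suppresses slowly varying terms (geometry, ionosphere, constant ambiguity), and the variance of the residual series, divided by the known inflation factor for white noise, estimates the observation variance. This is an illustration of the general principle, not the paper's exact algorithm.

```python
import numpy as np
from math import comb

def time_differenced_noise_std(obs, order=2):
    """Estimate observation noise from a single-receiver time series by
    k-th order epoch differencing.  For white noise, differencing inflates
    the variance by comb(2k, k), which is divided back out."""
    d = np.asarray(obs, dtype=float)
    for _ in range(order):
        d = np.diff(d)
    return float(np.sqrt(np.var(d) / comb(2 * order, order)))

# Synthetic check: slow drift plus 3 mm white noise (values in metres)
rng = np.random.default_rng(0)
t = np.arange(3000, dtype=float)
series = 0.02 * t + 1e-5 * t**2 + rng.normal(0.0, 0.003, t.size)
print(time_differenced_noise_std(series))   # should be close to 0.003
```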
Abstract:
Moral vitalism refers to a tendency to view good and evil as actual forces that can influence people and events. We introduce a scale designed to assess the belief in moral vitalism. High scorers on the scale endorse items such as “There are underlying forces of good and evil in this world”. After establishing the reliability and criterion validity of the scale (Studies 1, 2a, 2b), we examined the predictive validity of the moral vitalism scale, showing that “moral vitalists” worry about being possessed by evil (Study 3), being contaminated through contact with evil people (Study 4), and forfeiting their own mental purity (Study 5). We discuss the nature of moral vitalism and the implications of the construct for understanding the role of metaphysical lay theories about the nature of good and evil in moral reasoning.
Abstract:
The development of methods for real-time crash prediction as a function of current or recent traffic and roadway conditions is gaining increasing attention in the literature. Numerous studies have modeled the relationships between traffic characteristics and crash occurrence, and significant progress has been made. Given the accumulated evidence on this topic and the lack of a clear summary of research status, challenges, and opportunities, there is an urgent need to scientifically review these studies and to synthesize the existing state-of-the-art knowledge. This paper addresses this need by undertaking a systematic literature review to identify current knowledge, challenges, and opportunities, and then conducting a meta-analysis of existing studies to provide a summary estimate of the impact of traffic characteristics on crash occurrence. Sensitivity analyses were conducted to assess the quality, publication bias, and outlier bias of the various studies, and the time intervals used to measure traffic characteristics were also considered. As a result of this comprehensive and systematic review, issues in study design, traffic and crash data, and model development and validation are discussed. Outcomes of this study are intended to provide researchers focused on real-time crash prediction with greater insight into the modeling of this important but extremely challenging safety issue.
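As an illustration of the pooling step in such a meta-analysis, the sketch below applies the standard DerSimonian-Laird random-effects estimator to hypothetical per-study effect sizes (e.g. log odds ratios relating a traffic characteristic to crash occurrence). The numbers are invented for illustration, and the study's actual procedure may differ.

```python
import numpy as np

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes.
    Returns the pooled effect, its standard error, and the between-study
    variance tau^2."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - fixed) ** 2)              # heterogeneity statistic
    df = len(y) - 1
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / C)                 # between-study variance
    w_re = 1.0 / (v + tau2)                       # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical log odds ratios and variances from three studies
pooled, se, tau2 = random_effects_meta([0.35, 0.12, 0.48], [0.02, 0.05, 0.03])
print(f"pooled effect = {pooled:.2f} +/- {1.96 * se:.2f} (tau^2 = {tau2:.3f})")
```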
Abstract:
This paper describes a software architecture for real-world robotic applications. We discuss issues of software reliability, testing, and realistic off-line simulation that allows the majority of the automation system to be tested off-line in the laboratory before deployment in the field. A recent project, the automation of a very large mining machine, is used to illustrate the discussion.
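One common way to realise the kind of off-line testing described above is to place a hardware abstraction between the control logic and the machine, so the same code drives either the real equipment or a laboratory simulator. The sketch below is a generic illustration of that pattern; the names and the simple simulator model are assumptions, not the paper's architecture.

```python
from abc import ABC, abstractmethod

class MachineInterface(ABC):
    """Abstraction over the machine so that control logic is identical
    whether it drives real equipment or an off-line simulator."""

    @abstractmethod
    def read_sensors(self) -> dict: ...

    @abstractmethod
    def send_command(self, command: dict) -> None: ...

class SimulatedMachine(MachineInterface):
    """Laboratory stand-in that evolves a very simple internal model."""
    def __init__(self):
        self.state = {"boom_angle": 0.0}

    def read_sensors(self):
        return dict(self.state)

    def send_command(self, command):
        self.state["boom_angle"] += command.get("boom_rate", 0.0) * 0.1

def control_step(machine: MachineInterface, target_angle: float):
    """Control logic under test; unaware of whether the machine is real."""
    angle = machine.read_sensors()["boom_angle"]
    machine.send_command({"boom_rate": 0.5 * (target_angle - angle)})

# Off-line test loop using the simulator in place of the mining machine
sim = SimulatedMachine()
for _ in range(50):
    control_step(sim, target_angle=1.0)
print(sim.read_sensors())
```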