996 results for real impedance generator
Abstract:
Popular wireless network standards, such as IEEE 802.11/15/16, are increasingly adopted in real-time control systems. However, they are not designed for real-time applications. Therefore, the performance of such wireless networks needs to be carefully evaluated before the systems are implemented and deployed. While efforts have been made to model general wireless networks with completely random traffic generation, there is a lack of theoretical investigations into the modelling of wireless networks with periodic real-time traffic. Considering the widely used IEEE 802.11 standard, with a focus on its distributed coordination function (DCF), for soft real-time control applications, this paper develops an analytical Markov model to quantitatively evaluate the network quality-of-service (QoS) performance in periodic real-time traffic environments. Performance indices to be evaluated include throughput capacity, transmission delay and packet loss ratio, which are crucial for real-time QoS guarantees in real-time control applications. They are derived under the critical real-time traffic condition, which is formally defined in this paper to characterize the marginal satisfaction of real-time performance constraints.
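The core computation behind analytical Markov models of this kind is finding the chain's steady-state probabilities. A minimal sketch, assuming an illustrative three-state idle/backoff/transmit chain (the transition probabilities below are invented for illustration, not taken from the paper):

```python
# Sketch: steady-state probabilities of a small discrete-time Markov chain,
# the kind of computation underlying DCF-style analytical models.
# The 3-state transition matrix below is illustrative, not from the paper.

def stationary_distribution(P, iters=1000):
    """Approximate the stationary distribution pi (with pi = pi * P)
    by repeated multiplication (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical idle -> backoff -> transmit cycle.
P = [
    [0.5, 0.5, 0.0],   # idle: stay idle or enter backoff
    [0.2, 0.3, 0.5],   # backoff: return to idle, keep waiting, or transmit
    [1.0, 0.0, 0.0],   # transmit: always return to idle
]
pi = stationary_distribution(P)
```

From such a stationary distribution, throughput and delay indices follow as weighted sums over the states; the paper derives them under the critical real-time traffic condition rather than for an arbitrary chain like this one.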
Abstract:
Computer simulation is a versatile and commonly used tool for the design and evaluation of systems with different degrees of complexity. Power distribution systems and electric railway networks are areas in which computer simulations are heavily applied. A dominant factor in evaluating the performance of a software simulator is its processing time, especially in the case of real-time simulation. Parallel processing provides a viable means to reduce computing time and is therefore suitable for building real-time simulators. In this paper, we present different issues related to solving the power distribution system with parallel computing based on a multiple-CPU server, concentrating in particular on the speedup performance of such an approach.
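The speedup attainable from such a multiple-CPU approach is bounded by the serial fraction of the solver. A minimal sketch using Amdahl's law (the parallel fraction and CPU count below are illustrative; the paper's measured speedups depend on the actual solver and hardware):

```python
# Sketch: Amdahl's-law estimate of the speedup when a fraction p of a
# simulation is parallelisable across n CPUs. Illustrative only.

def amdahl_speedup(p, n):
    """Ideal speedup for parallel fraction p (0..1) on n processors."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelised, 8 CPUs give well under 8x.
s = amdahl_speedup(0.9, 8)
```

This is why the abstract treats speedup performance as a question worth studying in its own right rather than assuming linear scaling with CPU count.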
Abstract:
Purpose: The purpose of this paper is to examine the impact of globalisation on corporate real estate strategies. Specifically, it seeks to identify corporate real estate capabilities that are important in a hypercompetitive business climate. ---------- Design/methodology/approach: This paper utilises a qualitative approach to analyse secondary data in order to identify the corporate real estate capabilities for a hypercompetitive business environment. ---------- Findings: Globalisation today is an undeniable phenomenon that is fundamentally changing the way business is conducted. In the light of global hypercompetition, corporate real estate needs to develop new capabilities to support global business strategies. These include flexibility, network organisation and managerial learning capabilities. ---------- Research limitations/implications: This is a conceptual paper and future empirical research needs to be conducted to verify the propositions made in this paper. ---------- Practical implications: Given the new level of uncertainty in the business climate, that is, hypercompetition, businesses need to develop dynamic capabilities that are harder for competitors to imitate in order to maintain what is considered a “momentary” competitive advantage. The findings of this paper are useful to guide corporate real estate managers in this regard. ---------- Originality/value: This paper is original in two ways. First, it applies the strategic management concept of capabilities to corporate real estate. Second, it links the key challenge that businesses face today, i.e. globalisation, to the concept of capabilities as a means to maintain competitive advantage.
Abstract:
The authors currently engage in two projects to improve human-computer interaction (HCI) designs that can help conserve resources. The projects explore motivation and persuasion strategies relevant to ubiquitous computing systems that bring real-time consumption data into the homes and hands of residents in Brisbane, Australia. The first project seeks to increase understanding among university staff of the tangible and negative effects that excessive printing has on the workplace and local environment. The second project seeks to shift attitudes toward domestic energy conservation through software and hardware that monitor real-time, in situ electricity consumption in homes across Queensland. The insights drawn from these projects will help develop resource consumption user archetypes, providing a framework linking people to differing interface design requirements.
Abstract:
Nonlinear filter generators are common components used in the keystream generators for stream ciphers and, more recently, for authentication mechanisms. They consist of a Linear Feedback Shift Register (LFSR) and a nonlinear Boolean function to mask the linearity of the LFSR output. Properties of the output of a nonlinear filter are not well studied. Anderson noted that the m-tuple output of a nonlinear filter with consecutive taps to the filter function is unevenly distributed. Current designs use taps which are not consecutive. We examine m-tuple outputs from nonlinear filter generators constructed using various LFSRs and Boolean functions for both consecutive and uneven (full positive difference sets where possible) tap positions. The investigation reveals that in both cases, the m-tuple output is not uniform. However, consecutive tap positions result in a more biased distribution than uneven tap positions, with some m-tuples not occurring at all. These biased distributions indicate a potential flaw that could be exploited for cryptanalysis.
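The kind of experiment the abstract describes can be sketched in a few lines: run a small LFSR, filter its state through a nonlinear Boolean function at chosen tap positions, and tabulate the overlapping m-tuples of the keystream. The 5-bit register, feedback taps, and filter function below are illustrative choices, not those studied in the paper:

```python
from collections import Counter

def lfsr_stream(state, length, fb_taps=(0, 2)):
    """Fibonacci LFSR over GF(2); state is a list of bits, oldest first.
    Returns the successive register states."""
    out_states = []
    state = list(state)
    for _ in range(length):
        out_states.append(tuple(state))
        fb = 0
        for t in fb_taps:
            fb ^= state[t]
        state = state[1:] + [fb]
    return out_states

def filtered_keystream(states, taps, f):
    """Apply the Boolean filter f to the chosen tap positions of each state."""
    return [f(*(s[t] for t in taps)) for s in states]

def m_tuple_counts(bits, m):
    """Count overlapping m-tuples in the keystream."""
    return Counter(tuple(bits[i:i + m]) for i in range(len(bits) - m + 1))

f = lambda a, b, c: a ^ (b & c)            # a simple nonlinear filter
states = lfsr_stream([1, 0, 0, 1, 1], 200)
consec = m_tuple_counts(filtered_keystream(states, (0, 1, 2), f), 3)
spread = m_tuple_counts(filtered_keystream(states, (0, 1, 4), f), 3)
# Comparing `consec` and `spread` shows how tap placement skews the
# m-tuple distribution; with consecutive taps some tuples may never occur.
```

A uniform keystream would put roughly equal counts on all 2^m tuples, so any systematic gap between the two counters is the kind of bias the paper reports.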
Abstract:
This paper anatomises emerging developments in online community engagement in a major global industry: real estate. Economists argue that we are entering a ‘social network economy’ in which ‘complex social networks’ govern consumer choice and product value. In the light of this, organisations are shifting from thinking and behaving in the conventional ‘value chain’ model, in which exchanges between firms and customers flow one way only, from the firm to the consumer, to the ‘value ecology’ model, in which consumers and their networks become co-creators of the value of the product. This paper studies the way in which the global real estate industry is responding to this environment. It identifies three key areas in which online real estate ‘value ecology’ work is occurring: real estate social networks, games, and locative media / augmented reality applications. Uptake of real estate applications is, of course, user-driven: the paper not only highlights emerging innovations; it also identifies which of these innovations are actually being taken up by users, and the content contributed as a result. The paper thus provides a case study of one major industry’s shift into a web 2.0 communication model, focusing on emerging trends and issues.
Abstract:
The emergence of mobile and ubiquitous computing technology has created what is often referred to as the hybrid space – a virtual layer of digital information and interaction opportunities that sits on top of and augments the physical environment. Embodied media materialise digital information as observable and sometimes interactive parts of the physical environment. The aim of this work is to explore ways to enhance people’s situated real-world experience, and to find out what role embodied media can play, and what impact they can have, in achieving this goal. The Edge, an initiative of the State Library of Queensland in Brisbane, Australia, and the case study of this thesis, is envisioned as a physical place for people to meet, explore, experience, learn and teach each other creative practices in various areas related to digital technology and the arts. Guided by an Action Research approach, this work applies Lefebvre’s triad of space (1991) to investigate the Edge as a social space from a conceived, perceived and lived point of view. Based on its creators’ vision and goals on the conceived level, different embodied media are iteratively designed, implemented and evaluated towards shaping and amplifying the Edge’s visitor experience on the perceived and lived level.
Abstract:
A number of advanced driver assistance systems (ADAS) are currently being released on the market, providing safety functions to drivers such as collision avoidance, adaptive cruise control or enhanced night vision. These systems, however, are inherently limited by their sensory range: they cannot gather information from outside this range, also called their “perceptive horizon”. Cooperative systems are a developing research avenue that aims at providing extended safety and comfort functionalities by introducing vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) wireless communications to the road actors. This paper presents the challenges posed by cooperative systems, their advantages and contributions to road safety, and exposes some limitations related to market penetration, sensor accuracy and communications scalability. It explains the issues involved in implementing extended perception, a central contribution of cooperative systems. The initial steps of an evaluation of data fusion architectures for extended perception are presented.
Abstract:
Impedance cardiography is an application of bioimpedance analysis primarily used in a research setting to determine cardiac output. It is a non-invasive technique that measures the change in the impedance of the thorax which is attributed to the ejection of a volume of blood from the heart. The cardiac output is calculated from the measured impedance using the parallel conductor theory and a constant value for the resistivity of blood. However, the resistivity of blood has been shown to be velocity dependent due to changes in the orientation of red blood cells induced by changing shear forces during flow. The overall goal of this thesis was to study the effect that flow deviations have on the electrical impedance of blood, both experimentally and theoretically, and to apply the results to a clinical setting. The resistivity of stationary blood is isotropic as the red blood cells are randomly orientated due to Brownian motion. In the case of blood flowing through rigid tubes, the resistivity is anisotropic due to the biconcave discoidal shape and orientation of the cells. The generation of shear forces across the width of the tube during flow causes the cells to align with the minimal cross-sectional area facing the direction of flow, in order to minimise the shear stress experienced by the cells. This in turn results in a larger cross-sectional area of plasma and a reduction in the resistivity of the blood as the flow increases. Understanding the contribution of this effect on the thoracic impedance change is a vital step in achieving clinical acceptance of impedance cardiography. Published literature investigates the resistivity variations for constant blood flow. In this case, the shear forces are constant and the impedance remains constant during flow at a magnitude which is less than that for stationary blood.
The research presented in this thesis, however, investigates the variations in resistivity of blood during pulsatile flow through rigid tubes and the relationship between impedance, velocity and acceleration. Using rigid tubes isolates the impedance change to variations associated with changes in cell orientation only. The implications of red blood cell orientation changes for clinical impedance cardiography were also explored. This was achieved through measurement and analysis of the experimental impedance of pulsatile blood flowing through rigid tubes in a mock circulatory system. A novel theoretical model including cell orientation dynamics was developed for the impedance of pulsatile blood through rigid tubes. The impedance of flowing blood was theoretically calculated using analytical methods for flow through straight tubes and the numerical Lattice Boltzmann method for flow through complex geometries such as aortic valve stenosis. The result of the analytical theoretical model was compared to the experimental impedance measurements through rigid tubes. The impedance calculated for flow through a stenosis using the Lattice Boltzmann method provides results for comparison with impedance cardiography measurements collected as part of a pilot clinical trial to assess the suitability of using bioimpedance techniques to assess the presence of aortic stenosis. The experimental and theoretical impedance of blood was shown to inversely follow the blood velocity during pulsatile flow, with correlations of -0.72 and -0.74 respectively. The results for both the experimental and theoretical investigations demonstrate that the acceleration of the blood is an important factor in determining the impedance, in addition to the velocity. During acceleration, the relationship between impedance and velocity is linear (r² = 0.98 experimental; r² = 0.94 theoretical).
The relationship between the impedance and velocity during the deceleration phase is characterised by a time decay constant, τ, ranging from 10 to 50 s. The high level of agreement between the experimental and theoretically modelled impedance demonstrates the accuracy of the model developed here. An increase in the haematocrit of the blood resulted in an increase in the magnitude of the impedance change due to changes in the orientation of red blood cells. The time decay constant was shown to decrease linearly with the haematocrit for both experimental and theoretical results, although the slope of this decrease was larger in the experimental case. The radius of the tube influences the experimental and theoretical impedance given the same velocity of flow. However, when the velocity was divided by the radius of the tube (labelled the reduced average velocity) the impedance response was the same for two experimental tubes with equivalent reduced average velocity but with different radii. The temperature of the blood was also shown to affect the impedance, with the impedance decreasing as the temperature increased. These results are the first published for the impedance of pulsatile blood. The experimental impedance change measured orthogonal to the direction of flow is in the opposite direction to that measured in the direction of flow. These results indicate that the impedance of blood flowing through rigid cylindrical tubes is axisymmetric along the radius. This has not previously been verified experimentally. Time-frequency analysis of the experimental results demonstrated that the measured impedance contains the same frequency components, occurring at the same time point in the cycle, as the velocity signal. This suggests that the impedance contains many of the fluctuations of the velocity signal.
Application of a theoretical steady flow model to pulsatile flow presented here has verified that the steady flow model is not adequate for calculating the impedance of pulsatile blood flow. The success of the new theoretical model over the steady flow model demonstrates that the velocity profile is important in determining the impedance of pulsatile blood. The clinical application of the impedance of blood flow through a stenosis was theoretically modelled using the Lattice Boltzmann method (LBM) for fluid flow through complex geometries. The impedance of blood exiting a narrow orifice was calculated for varying degrees of stenosis. Clinical impedance cardiography measurements were also recorded for both aortic valvular stenosis patients (n = 4) and control subjects (n = 4) with structurally normal hearts. This pilot trial was used to corroborate the results of the LBM. Results from both investigations showed that the decay time constant for impedance has potential in the assessment of aortic valve stenosis. In the theoretically modelled case (LBM results), the decay time constant increased with an increase in the degree of stenosis. The clinical results also showed a statistically significant difference in time decay constant between control and test subjects (P = 0.03). The time decay constant calculated for test subjects (τ = 180-250 s) is consistently larger than that determined for control subjects (τ = 50-130 s). This difference is thought to be due to differences in the orientation response of the cells as blood flows through the stenosis. Such a non-invasive technique using the time decay constant for screening of aortic stenosis provides additional information to that currently given by impedance cardiography techniques and improves the value of the device to practitioners. However, the results still need to be verified in a larger study.
While impedance cardiography has not been widely adopted clinically, it is research such as this that will enable future acceptance of the method.
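The time decay constant τ used throughout this abstract can be estimated from an impedance decay curve by a log-linear least-squares fit, assuming an exponential model Z(t) = Z0·exp(-t/τ). A minimal sketch with synthetic samples (the data below are illustrative, not thesis measurements):

```python
import math

def fit_decay_constant(ts, zs):
    """Fit ln(z) = ln(Z0) - t/tau by ordinary least squares; return tau."""
    n = len(ts)
    ys = [math.log(z) for z in zs]
    t_mean = sum(ts) / n
    y_mean = sum(ys) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, ys))
             / sum((t - t_mean) ** 2 for t in ts))
    return -1.0 / slope

# Synthetic decay with tau = 30 s, inside the 10-50 s range reported above.
ts = [i * 5.0 for i in range(10)]
zs = [2.0 * math.exp(-t / 30.0) for t in ts]
tau = fit_decay_constant(ts, zs)
```

With noisy clinical data the same fit applies; only the residuals, and hence the confidence in τ, change.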
Abstract:
The main objective of this paper is to detail the development of a feasible hardware design based on Evolutionary Algorithms (EAs) to determine flight path planning for Unmanned Aerial Vehicles (UAVs) navigating terrain with obstacle boundaries. The design architecture includes the hardware implementation of Light Detection And Ranging (LiDAR) terrain and EA population memories within the hardware, as well as the EA search and evaluation algorithms used in the optimizing stage of path planning. A synthesisable Very-high-speed integrated circuit Hardware Description Language (VHDL) implementation of the design was developed, for realisation on a Field Programmable Gate Array (FPGA) platform. Simulation results show significant speedup compared with an equivalent software implementation written in C++, suggesting that the present approach is well suited for UAV real-time path planning applications.
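The EA search-and-evaluation loop that the paper moves into hardware can be sketched in software as a simple genetic algorithm: paths are fixed-length waypoint lists on a grid, and fitness penalises obstacle hits and path length. The grid, obstacle wall, operators, and parameters below are illustrative, not the paper's FPGA design:

```python
import random

random.seed(1)
SIZE, N_WAY = 10, 6
OBSTACLES = {(4, y) for y in range(1, 9)}       # an illustrative wall with gaps
START, GOAL = (0, 5), (9, 5)

def random_path():
    return [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(N_WAY)]

def cost(path):
    """Manhattan path length plus a heavy penalty per obstacle waypoint."""
    pts = [START] + path + [GOAL]
    length = sum(abs(a[0] - b[0]) + abs(a[1] - b[1])
                 for a, b in zip(pts, pts[1:]))
    hits = sum(p in OBSTACLES for p in path)
    return length + 100 * hits

def evolve(pop_size=40, gens=200):
    pop = [random_path() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        survivors = pop[:pop_size // 2]          # elitist truncation selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_WAY)
            child = a[:cut] + b[cut:]            # one-point crossover
            if random.random() < 0.3:            # waypoint mutation
                child[random.randrange(N_WAY)] = (
                    random.randrange(SIZE), random.randrange(SIZE))
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

The fitness evaluation is the part the paper accelerates: with terrain and population held in on-chip memories, many `cost` evaluations can run concurrently, which is where the reported speedup over C++ comes from.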
Abstract:
The Queensland University of Technology badges itself as “a university for the real world”. For the last decade the Law Faculty has aimed to provide its students with a ‘real world’ degree, that is, a practical law degree. This has seen skills such as research, advocacy and negotiation incorporated into the undergraduate degree under a University Teaching & Learning grant, a project that gained international recognition and praise. In 2007–2008 the Law Faculty undertook another curriculum review of its undergraduate law degree. As a result of the two-year review, QUT’s undergraduate law degree has fewer core units, a focus on first-year student transition, scaffolding of law graduate capabilities throughout the degree, work-integrated learning and transition to the workplace. The revised degree commenced implementation in 2009. This paper focuses on the “real world” approach to the degree achieved through the first-year programme, embedding and scaffolding law graduate capabilities through authentic and valid assessment and work-integrated learning.
Abstract:
Drink driving causes more fatal crashes than any other single factor on Australian roads, with a third of crashes having alcohol as a contributing factor. In recent years there has been a plateau in the numbers of drink drivers apprehended through random breath testing (RBT), and around 12% of the general population in self-report surveys admit to drinking and driving. There is limited information about the first offender group, particularly the subgroup of these offenders who admit to prior drink driving, the offence therefore being the “first time caught”. This research focuses on the differences between those who report drink driving prior to apprehension for the offence and those who don’t. Methods: 201 first-time drink driving offenders were interviewed at the time of their court appearance. Information was collected on socio-demographic variables, driving behaviour, method of apprehension, offence information, alcohol use and self-reported previous drink driving. Results: 78% of respondents reported that they had driven over the legal alcohol limit in the 6 months prior to the offence. Analyses revealed that those offenders who had driven over the limit previously without being caught were more likely to be younger and to have an issue with risky drinking. When all variables were taken into account in a multivariate model using logistic regression, only risky drinking emerged as significantly related to past drink driving. High-risk drinkers were 4.8 times more likely to report having driven over the limit without being apprehended in the previous 6 months. Conclusion: The majority of first offenders are those who are “first time apprehended” rather than “first time drink drivers”. Having an understanding of the differences between these groups may alter the focus of educational or rehabilitation countermeasures. This research is part of a larger project aiming to target first-time apprehended offenders for tailored intervention.
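The reported "4.8 times more likely" is an odds ratio from the logistic regression. For a single binary predictor, the odds ratio reduces to the cross-product ratio of a 2x2 table, and its logarithm is the corresponding logistic coefficient. A minimal sketch with invented counts (chosen only to reproduce an odds ratio of 4.8; they are not the study's data):

```python
import math

def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Cross-product odds ratio of the outcome given exposure, plus the
    equivalent logistic-regression coefficient beta = ln(OR)."""
    oratio = (exposed_yes * unexposed_no) / (exposed_no * unexposed_yes)
    beta = math.log(oratio)
    return oratio, beta

# Illustrative counts: prior-drink-driving yes/no among risky vs
# non-risky drinkers (hypothetical, not the study's table).
oratio, beta = odds_ratio(60, 10, 80, 64)
```

In the multivariate model of the abstract, the odds ratio for risky drinking is adjusted for the other variables, but its interpretation is the same: exp(beta) for the risky-drinking coefficient.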
Abstract:
The Internet presents a constantly evolving frontier for criminology and policing, especially in relation to online predators – paedophiles operating within the Internet for safer access to children, child pornography and networking opportunities with other online predators. The goals of this qualitative study are to undertake behavioural research – to identify personality types and archetypes of online predators and compare and contrast them with behavioural profiles and other psychological research on offline paedophiles and sex offenders. It is also an endeavour to gather intelligence on the technological utilisation of online predators and conduct observational research on the social structures of online predator communities. These goals were achieved through the covert monitoring and logging of public activity within four Internet Relay Chat (IRC) chatrooms themed around child sexual abuse and located on the Undernet network. Five days of monitoring was conducted on these four chatrooms, from Wednesday 1 to Sunday 5 April 2009; this raw data was collated and analysed. The analysis identified four personality types – the gentleman predator, the sadist, the businessman and the pretender – and eight archetypes consisting of the groomers, dealers, negotiators, roleplayers, networkers, chat requestors, posters and travellers. The characteristics and traits of these personality types and archetypes, which were extracted from the literature dealing with offline paedophiles and sex offenders, are detailed and contrasted against the online sexual predators identified within the chatrooms, revealing many similarities and interesting differences, particularly with the businessman and pretender personality types.
These personality types and archetypes were illustrated by selecting users who displayed the appropriate characteristics and tracking them through the four chatrooms, revealing intelligence data on the use of proxy servers (especially via the Tor software) and other security strategies such as Undernet’s host masking service. Name and age changes, which are used as a potential sexual grooming tactic, were also revealed through the use of Analyst’s Notebook software, and ISP information revealed the likelihood that many online predators were not using any safety mechanism, relying instead on the anonymity of the Internet. The activities of these online predators were analysed, especially in regard to child sexual grooming and the ‘posting’ of child pornography, which revealed some of the methods by which online predators utilised new Internet technologies to sexually groom and abuse children (using technologies such as instant messengers, webcams and microphones) as well as store and disseminate illegal materials on image sharing websites and peer-to-peer software such as Gigatribe. Analysis of the social structures of the chatrooms was also carried out, and the community functions and characteristics of each chatroom explored. The findings of this research have indicated several opportunities for further research. As a result of this research, recommendations are given on policy, prevention and response strategies with regard to online predators.
Abstract:
Real-time kinematic (RTK) GPS techniques have been extensively developed for applications including surveying, structural monitoring, and machine automation. Limitations of the existing RTK techniques that hinder their applications for geodynamics purposes are twofold: (1) the achievable RTK accuracy is on the level of a few centimeters, and the uncertainty of the vertical component is 1.5–2 times worse than those of the horizontal components, and (2) the RTK position uncertainty grows in proportion to the base-to-rover distance. The key limiting factor behind these problems is the significant effect of residual tropospheric errors on the positioning solutions, especially on the highly correlated height component. This paper develops a geometry-specified troposphere decorrelation strategy to achieve subcentimeter kinematic positioning accuracy in all three components. The key is to set up a relative zenith tropospheric delay (RZTD) parameter to absorb the residual tropospheric effects and to solve the established model as an ill-posed problem using the regularization method. In order to compute a reasonable regularization parameter to obtain an optimal regularized solution, the covariance matrix of positional parameters estimated without the RZTD parameter, which is characterized by observation geometry, is used to replace the quadratic matrix of their “true” values. As a result, the regularization parameter is adaptively computed with variation of observation geometry. The experiment results show that the new method can efficiently alleviate the model’s ill condition and stabilize the solution from a single data epoch. Compared to the results from the conventional least squares method, the new method can improve the long-range RTK solution precision from several centimeters to the subcentimeter level in all components. More significantly, the precision of the height component is even higher.
Several geosciences applications that require subcentimeter real-time solutions can largely benefit from the proposed approach, such as monitoring of earthquakes and large dams in real time, high-precision GPS leveling and refinement of the vertical datum. In addition, the high-resolution RZTD solutions can contribute to effective recovery of tropospheric slant path delays in order to establish a 4-D troposphere tomography.
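The regularization step described in this abstract can be illustrated in its simplest Tikhonov form: instead of solving the ill-conditioned normal equations AᵀA x = Aᵀb directly, solve (AᵀA + λI) x = Aᵀb. A minimal sketch for a 2-parameter model with a nearly collinear design matrix (the system and λ below are illustrative; the paper derives λ adaptively from observation geometry rather than fixing it):

```python
def solve2(M, v):
    """Cramer's rule for a 2x2 linear system M x = v."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(v[0] * M[1][1] - v[1] * M[0][1]) / det,
            (M[0][0] * v[1] - M[1][0] * v[0]) / det]

def ridge_solve(A, b, lam):
    """Tikhonov-regularized least squares for a 2-parameter model."""
    m = len(A)
    # Normal-equation matrix A^T A + lam * I and right-hand side A^T b
    N = [[sum(A[k][i] * A[k][j] for k in range(m)) + (lam if i == j else 0.0)
          for j in range(2)] for i in range(2)]
    rhs = [sum(A[k][i] * b[k] for k in range(m)) for i in range(2)]
    return solve2(N, rhs)

# Nearly collinear columns make the problem ill-posed; a small lambda
# stabilises the solution without noticeably biasing it.
A = [[1.0, 1.0], [1.0, 1.001], [1.0, 0.999]]
b = [2.0, 2.001, 1.999]
x = ridge_solve(A, b, 1e-6)
```

In the paper's setting the parameter vector additionally contains the RZTD term, and λ is chosen from the covariance matrix of the position-only solution, but the structure of the regularized solve is the same.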