136 results for 100602 Input Output and Data Devices
Abstract:
We address the problem of finite horizon optimal control of discrete-time linear systems with input constraints and uncertainty. The uncertainty for the problem analysed is related to incomplete state information (output feedback) and stochastic disturbances. We analyse the complexities associated with finding optimal solutions. We also consider two suboptimal strategies that could be employed for larger optimization horizons.
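To make the setting concrete, the sketch below (not the authors' formulation) poses a small finite-horizon optimal control problem for a hypothetical two-state discrete-time linear system with a box constraint on the input, solved as a convex program with CVXPY; the system matrices, cost weights, horizon and input bound are all assumed.

```python
# Hypothetical sketch: finite-horizon LQ control of a discrete-time linear
# system with an input constraint, posed as a convex program (CVXPY).
# All matrices, weights and bounds below are illustrative assumptions.
import cvxpy as cp
import numpy as np

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # assumed system dynamics
B = np.array([[0.0], [0.1]])
Q, R = np.eye(2), 0.1 * np.eye(1)        # assumed stage-cost weights
N = 20                                   # optimization horizon
x0 = np.array([1.0, 0.0])                # assumed initial state
u_max = 0.5                              # input constraint |u| <= u_max

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))
cost, constraints = 0, [x[:, 0] == x0]
for k in range(N):
    cost += cp.quad_form(x[:, k], Q) + cp.quad_form(u[:, k], R)
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    cp.abs(u[:, k]) <= u_max]
cp.Problem(cp.Minimize(cost), constraints).solve()
print("optimal cost over the horizon:", cost.value)
```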
Abstract:
A bioeconomic model was developed to evaluate the potential performance of brown tiger prawn stock enhancement in Exmouth Gulf, Australia. This paper presents the framework for the bioeconomic model and a risk assessment for all components of a stock enhancement operation, i.e. hatchery, grow-out, releasing, population dynamics, fishery, and monitoring, for a commercial-scale enhancement of about 100 metric tonnes, a 25% increase in average annual catch in Exmouth Gulf. The model incorporates uncertainty in parameter estimates by assigning each parameter a distribution over a plausible range, based on experiments, published data, or similar studies. Monte Carlo simulation was then used to quantify the effects of these uncertainties on the model output and on the economic potential of a particular production target. The model incorporates density-dependent effects in the nursery grounds of brown tiger prawns. The results predict that a release of 21 million 1 g prawns would produce an estimated enhanced prawn catch of about 100 t. This scale of enhancement has a 66.5% chance of making a profit. The largest contributor to the overall uncertainty of the enhanced prawn catch was post-release mortality, followed by the density-dependent mortality caused by released prawns. These two mortality rates are the most difficult to estimate in practice and remain much under-researched in stock enhancement.
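As a rough illustration of the Monte Carlo step described above, the following sketch samples two uncertain mortality parameters and propagates them to an enhanced catch and a probability of profit; all distributions, biological constants, prices and costs are invented placeholders, not the paper's calibrated values.

```python
# Hypothetical Monte Carlo sketch of an enhancement scenario: sample uncertain
# mortality parameters and propagate them to enhanced catch and profit.
import numpy as np

rng = np.random.default_rng(0)
n_sims = 10_000
released = 21e6                                    # 1 g prawns released

post_release_m = rng.uniform(0.3, 0.7, n_sims)     # assumed monthly mortality range
density_dep_m = rng.uniform(0.0, 0.2, n_sims)      # assumed density-dependent mortality
months_to_recruit = 4                              # assumed months until recruitment
survival = np.exp(-(post_release_m + density_dep_m) * months_to_recruit)

catch_weight_g = 25.0                              # assumed mean weight at capture
harvest_rate = 0.6                                 # assumed exploitation rate
catch_t = released * survival * harvest_rate * catch_weight_g / 1e6  # tonnes

price_per_t, cost = 15_000.0, 1.2e6                # assumed price and operating cost
profit = catch_t * price_per_t - cost
print(f"median enhanced catch: {np.median(catch_t):.0f} t, "
      f"P(profit > 0): {np.mean(profit > 0):.2f}")
```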
Abstract:
This paper describes a safety data recording and analysis system that has been developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors which condition, influence or potentially contribute to an occurrence are captured for both safety occurrences and precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning around accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus in this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrences.
Abstract:
A high-frequency-link (HFL) micro-inverter with a front-end diode-clamped multi-level inverter and a grid-connected half-wave cycloconverter is proposed. The diode-clamped multi-level inverter with an auxiliary capacitor is used to generate a high-frequency (HF) three-level quasi-square-wave output, which is fed into a series resonant tank to obtain a continuous HF sinusoidal current. This continuous sinusoidal current is modulated by the grid-connected half-wave cycloconverter to obtain a grid-synchronized output current in phase with the grid voltage. Phase-shift power modulation is used with the auxiliary capacitor at the front-end multi-level inverter to achieve soft switching. The phase shift between the HFL resonant current and the half-wave cycloconverter input voltage is modulated to obtain the grid-synchronized output current.
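Purely as an illustration of the waveform the front-end stage is described as producing, the sketch below generates a three-level quasi-square wave and reads off its fundamental component via an FFT; the switching frequency, DC-link voltage and notch angle are assumed values and no converter control or modulation scheme is modelled.

```python
# Illustrative only: a three-level quasi-square wave over one HF switching
# period, with its fundamental amplitude estimated from the FFT.
import numpy as np

f_sw, V_dc = 50e3, 400.0                 # assumed switching frequency and DC link
alpha = np.pi / 6                        # assumed notch (zero-level) angle
t = np.linspace(0, 1 / f_sw, 1000, endpoint=False)
theta = 2 * np.pi * f_sw * t

v = np.zeros_like(theta)
v[(theta > alpha) & (theta < np.pi - alpha)] = V_dc / 2            # +level
v[(theta > np.pi + alpha) & (theta < 2 * np.pi - alpha)] = -V_dc / 2  # -level

spectrum = np.fft.rfft(v) / len(v)
fundamental = 2 * np.abs(spectrum[1])    # peak amplitude of the fundamental
print(f"fundamental amplitude: {fundamental:.1f} V")
```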
Abstract:
The design and implementation of environmental policy often involve more than one pollutant, and must consider pollution as a byproduct of the production of marketable output. In this paper, we test the implicit assumption in the empirical literature that (1) production of marketable output, pollution and abatement are separable, and (2) different pollutants can be abated separately. Using unique plant-level data in India, we reject the null hypotheses of separability between marketable output and pollutants, and between different pollutants. Firms must incur abatement costs for reducing pollution levels. In addition, complement and substitute relationships between water pollutants are demonstrated with statistical significance.
Abstract:
This paper translates the concept of sustainable production into three dimensions of economic, environmental and ecological sustainability to analyze optimal production scales by solving optimization problems. Economic optimization seeks input-output combinations that maximize profits. Environmental optimization searches for input-output combinations that minimize the polluting effects of materials balance on the surrounding environment. Ecological optimization looks for input-output combinations that minimize the cumulative destruction of the entire ecosystem. Using an aggregate space, the framework illustrates that these optimal scales are often not identical because markets fail to account for all negative externalities. Profit-maximizing firms normally operate at scales larger than those that are optimal from the viewpoints of environmental and ecological sustainability; hence policy interventions are favoured. The framework offers a useful tool for efficiency studies and the analysis of policy implications. The paper provides an empirical investigation using a data set of rice farms in South Korea.
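A stylised numerical example can make the central claim concrete: with an assumed production function, prices and pollution damage term (none taken from the paper), the profit-maximizing input scale exceeds the scale that is optimal once environmental damage is internalized.

```python
# Stylised sketch (assumptions throughout): a firm with production y = x**0.7,
# where pollution rises with input use x. The private optimum ignores the
# damage term; the "environmental" optimum internalizes it, giving a smaller
# optimal scale.
from scipy.optimize import minimize_scalar

p, w, d = 10.0, 2.0, 1.5             # assumed output price, input price, damage cost

def profit(x):                       # private objective (externality ignored)
    return p * x**0.7 - w * x

def social(x):                       # objective with assumed pollution damage d*x
    return profit(x) - d * x

x_private = minimize_scalar(lambda x: -profit(x), bounds=(1e-6, 100), method="bounded").x
x_social = minimize_scalar(lambda x: -social(x), bounds=(1e-6, 100), method="bounded").x
print(f"profit-maximizing scale: {x_private:.1f}, socially optimal scale: {x_social:.1f}")
```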
Abstract:
Supervisory Control and Data Acquisition systems (SCADA) are widely used to control critical infrastructure automatically. Capturing and analyzing packet-level traffic flowing through such a network is an essential requirement for problems such as legacy network mapping and fault detection. Within the framework of captured network traffic, we present a simple modeling technique, which supports the mapping of the SCADA network topology via traffic monitoring. By characterizing atomic network components in terms of their input-output topology and the relationship between their data traffic logs, we show that these modeling primitives have good compositional behaviour, which allows complex networks to be modeled. Finally, the predictions generated by our model are found to be in good agreement with experimentally obtained traffic.
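As a minimal sketch of topology inference from traffic captures, under the assumption that (source, destination) address pairs can be extracted from packet headers, one can accumulate them into a weighted directed graph; the node names and packet list below are hypothetical.

```python
# Minimal sketch (assumed log format): infer a SCADA network topology from
# captured (source, destination) pairs, as a directed graph whose edges are
# weighted by observed packet counts.
from collections import Counter
import networkx as nx

# Hypothetical capture: address pairs extracted from packet headers.
packets = [("rtu-01", "master"), ("rtu-02", "master"),
           ("master", "historian"), ("rtu-01", "master")]

edge_counts = Counter(packets)
G = nx.DiGraph()
for (src, dst), n in edge_counts.items():
    G.add_edge(src, dst, packets=n)   # edge weight = observed traffic volume

print(sorted(G.nodes()))
print(G.edges(data=True))
```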
Abstract:
This paper addresses the issue of output feedback model predictive control for linear systems with input constraints and stochastic disturbances. We show that the optimal policy uses the Kalman filter for state estimation, but that the resulting state estimates are not used in a certainty equivalence control law.
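For reference, the state-estimation component alone can be written as a standard Kalman filter predict/update step, as in the sketch below with assumed system matrices; note that the paper's point is precisely that the optimal policy does not simply feed this estimate into a certainty equivalence control law.

```python
# Standard Kalman filter predict/update for a discrete-time linear system with
# Gaussian process and measurement noise. All matrices are assumed inputs.
import numpy as np

def kalman_step(x_hat, P, u, y, A, B, C, Q, R):
    """One filter step: prior estimate (x_hat, P), input u, new measurement y."""
    # Predict
    x_pred = A @ x_hat + B @ u
    P_pred = A @ P @ A.T + Q
    # Update with the new measurement
    S = C @ P_pred @ C.T + R
    K = P_pred @ C.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x_hat)) - K @ C) @ P_pred
    return x_new, P_new
```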
Abstract:
This research developed a novel synthetic structural health monitoring system model that is cost-effective and flexible in sensing and data acquisition, and robust in structural safety evaluation, for the long-term and frequent monitoring of large-scale civil infrastructure over its service life. It not only established a real-world structural monitoring test-bed at the heart of the QUT Gardens Point Campus, but can also facilitate reliable and prompt protection for any built infrastructure system and the user community involved.
Abstract:
PURPOSE
The purposes of this study were to:
1) establish inter-instrument reliability between left and right hip accelerometer placement;
2) examine procedural reliability of a walking protocol used to measure physical activity (PA); and
3) confirm concurrent validity of accelerometers in measuring PA intensity as compared to the gold standard of oxygen consumption measured by indirect calorimetry.
METHODS
Eight children (mean age: 11.9 years; SD: 3.2; 75% male) with CP (GMFCS levels I-III) wore ActiGraph GT3X accelerometers on each hip and the Cosmed K4b²
Abstract:
Identifying product families has been considered an effective way to accommodate the increasing product varieties across diverse market niches. In this paper, we propose a novel framework for identifying product families by using a similarity measure for a common product design data structure, the BOM (Bill of Materials), based on data mining techniques such as frequent mining and clustering. For calculating the similarity between BOMs, a novel Extended Augmented Adjacency Matrix (EAAM) representation is introduced that captures not only the content and topology but also the frequent structural dependencies among the various parts of a product design. These EAAM representations of BOMs are compared to calculate the similarity between products and used as a clustering input to group the product families. When applied to real-life manufacturing data, the proposed framework outperforms a current baseline that uses orthogonal Procrustes for grouping product families.
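The abstract does not spell out how the EAAM is built, so the sketch below uses a plain adjacency matrix over an assumed shared part vocabulary as a simplified stand-in, compares products by cosine similarity of the flattened matrices, and groups them with hierarchical clustering.

```python
# Hypothetical sketch: each BOM as an adjacency matrix of parent->child "uses"
# relations over a shared part list, compared by cosine similarity of the
# flattened matrices and grouped by hierarchical clustering. This is a
# simplified stand-in for the EAAM representation described in the paper.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

parts = ["frame", "motor", "gearbox", "panel"]     # assumed shared part vocabulary

def bom_matrix(edges):
    """Flattened adjacency matrix of parent->child relations in one BOM."""
    idx = {p: i for i, p in enumerate(parts)}
    M = np.zeros((len(parts), len(parts)))
    for parent, child in edges:
        M[idx[parent], idx[child]] = 1.0
    return M.ravel()

boms = np.vstack([
    bom_matrix([("frame", "motor"), ("motor", "gearbox")]),   # product A
    bom_matrix([("frame", "motor"), ("motor", "gearbox")]),   # product B (same family)
    bom_matrix([("frame", "panel")]),                         # product C
])

dist = pdist(boms, metric="cosine")                # 1 - cosine similarity
families = fcluster(linkage(dist, method="average"), t=0.5, criterion="distance")
print(families)   # e.g. [1 1 2]: A and B grouped into one family, C separate
```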
Abstract:
In this chapter, we draw out the relevant themes from a range of critical scholarship within the small body of digital media and software studies work that has focused on the politics of Twitter data and the sociotechnical means by which access is regulated. We highlight in particular the contested relationships between social media research (in both academic and non-academic contexts) and the data wholesale, retail, and analytics industries that feed on them. In the second major section of the chapter we discuss in detail the pragmatic edge of these politics in terms of what kinds of scientific research are and are not possible in the current political economy of Twitter data access. Finally, at the end of the chapter we return to the much broader implications of these issues for the politics of knowledge, demonstrating how the apparently microscopic level at which the Twitter API mediates access to Twitter data actually inscribes and influences the macro level of the global political economy of science itself, by re-inscribing institutional and traditional disciplinary privilege. We conclude with some speculations about future developments in data rights and data philanthropy that may at least mitigate some of these negative impacts.
Abstract:
Cryptographic hash functions are an important tool of cryptography and play a fundamental role in efficient and secure information processing. A hash function processes an arbitrary finite-length input message to a fixed-length output, referred to as the hash value. As a security requirement, a hash value should not serve as an image for two distinct input messages, and it should be difficult to find the input message from a given hash value. Secure hash functions serve data integrity, non-repudiation and authenticity of the source in conjunction with digital signature schemes. Keyed hash functions, also called message authentication codes (MACs), serve data integrity and data origin authentication in the secret key setting. The building blocks of hash functions can be designed using block ciphers, modular arithmetic or from scratch. The design principles of the popular Merkle–Damgård construction are followed in almost all widely used standard hash functions such as MD5 and SHA-1.
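The two roles described above can be illustrated with Python's standard library: an unkeyed hash (SHA-256) for producing a fixed-length fingerprint, and a keyed hash (HMAC) for data origin authentication in the secret-key setting; the message and key below are placeholders.

```python
# Illustration with the standard library: unkeyed hashing vs. keyed hashing.
import hashlib
import hmac

message = b"transfer 100 to account 42"            # placeholder message

digest = hashlib.sha256(message).hexdigest()       # fixed-length hash value
print("SHA-256:", digest)

key = b"shared-secret-key"                         # placeholder shared secret
tag = hmac.new(key, message, hashlib.sha256).hexdigest()
print("HMAC-SHA256:", tag)

# Verification on the receiving side (constant-time comparison):
assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).hexdigest())
```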
Abstract:
Improving the performance of the health sector is one of the most prominent policy issues in Australia. This paper contributes to this important policy debate by examining the efficiency of health facilities in Queensland using the Malmquist Productivity Index (MPI). This method is selected because it is suitable for the multi-input, multi-output, not-for-profit nature of public health services. In addition, with the availability of panel data we can decompose productivity growth into useful components, including technical efficiency change, technological change and scale change. The results revealed average growth of 1.6 per cent in total factor productivity (TFP) among Queensland public hospitals over the study period. The main component contributing to this modest improvement in TFP was catching-up, at an average of 1.0 per cent. SFA estimates suggest that the number of nurses is the most influential determinant of output.
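For readers unfamiliar with the MPI, the sketch below shows the standard output-oriented index and its decomposition into efficiency change (catching-up) and technical change, computed from distance-function values; the numerical inputs are assumed, not the study's estimates.

```python
# Standard output-oriented Malmquist Productivity Index and its decomposition,
# given distance-function values D_s(x_r, y_r) for technology period s and
# data period r. The sample values are assumed, not the study's DEA results.
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """d_s_r = distance function with period-s technology evaluated at period-r data."""
    mpi = math.sqrt((d_t_t1 / d_t_t) * (d_t1_t1 / d_t1_t))
    eff_change = d_t1_t1 / d_t_t                               # catching-up
    tech_change = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return mpi, eff_change, tech_change                        # mpi == eff * tech

mpi, effch, techch = malmquist(0.90, 0.95, 0.88, 0.92)         # assumed values
print(f"MPI={mpi:.3f}, efficiency change={effch:.3f}, technical change={techch:.3f}")
```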