690 results for Computer Controlled Signals.
Abstract:
Networked control systems (NCSs) offer many advantages over conventional control; however, they also present challenging problems such as network-induced delay and packet loss. This paper proposes a predictive compensation approach for simultaneous network-induced delays and packet losses. Unlike the majority of existing NCS control methods, the proposed approach addresses the co-design of both the network and the controller. It also relaxes the requirements for precise process models and a full understanding of NCS network dynamics. For a series of possible sensor-to-actuator delays, the controller computes a series of corresponding redundant control values, then sends those control values in a single packet to the actuator. Upon receiving the control packet, the actuator measures the actual sensor-to-actuator delay and computes the control signal from the control packet. When packet dropout occurs, the actuator uses past control packets to generate an appropriate control signal. The effectiveness of the approach is demonstrated through examples.
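The packet-based scheme in this abstract can be sketched in a few lines. The sketch below is only illustrative: the toy control law, the function names, and the dictionary packet format are assumptions for exposition, not the paper's actual design.

```python
# Illustrative sketch of the redundant-control-packet idea described above.
# The decaying "predictive" control law is a made-up placeholder, not the
# paper's method.

def build_control_packet(state, candidate_delays, gain=0.5):
    """Controller side: one redundant control value per candidate delay."""
    # Toy predictive law: attenuate the correction over the delay horizon.
    return {d: -gain * state * (0.9 ** d) for d in candidate_delays}

def actuate(packet_history, measured_delay):
    """Actuator side: pick the value matching the measured delay.

    If the latest packet was dropped (None), fall back to the most recent
    packet that did arrive, as the abstract describes.
    """
    for packet in reversed(packet_history):
        if packet is not None:  # skip dropped packets
            # Use the entry for the measured delay; otherwise fall back to
            # the largest delay the packet covers.
            key = measured_delay if measured_delay in packet else max(packet)
            return packet[key]
    return 0.0  # nothing ever arrived: apply a safe default

delays = [1, 2, 3]
history = [build_control_packet(10.0, delays), None]  # second packet lost
u = actuate(history, measured_delay=2)
```

Note how the dropout case costs nothing extra at the actuator: it simply reuses the newest surviving packet, which already carries values for several delay hypotheses.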
Abstract:
Many modern business environments employ software to automate the delivery of workflows, yet workflow design and generation remain laborious technical tasks for domain specialists. Several different approaches have been proposed for deriving workflow models. Some rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge, or workflow model compositions from knowledge bases. Many approaches draw on principles from automatic planning, but remain conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we prove the associativity of a composition operator that permits crisp hierarchical task compositions for workflow models through a set of mathematical deduction rules. The result is a logical framework that can be used to prove tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
Abstract:
Internet and computer addiction has been a popular research area since the 1990s. Studies on Internet and computer addiction have usually been conducted in the US, so investigating computer and Internet addiction in other countries is an interesting area of research. This study investigates computer and Internet addiction among teenagers and Internet cafe visitors in Turkey. We surveyed 983 visitors in Internet cafes. The results show that Internet cafe visitors are usually teenagers, mostly middle- and high-school students, who typically occupy themselves with computer and Internet applications such as chat, e-mail, browsing and games. The teenagers come to the Internet cafe to spend time with friends and the computers. In addition, about 30% of cafe visitors admit to having an Internet addiction, and about 20% specifically mention problems they are having with the Internet. The types of activities the teenagers perform in an Internet cafe, their reasons for being there, the degree of self-awareness about Internet addiction, and the lack of control over applications in the cafe are all rather alarming.
Abstract:
To the action researcher, who laboriously spends his or her hours working within the local contexts of communities or organisations to co-generate meaningful research, and whose theories are hardened on the anvil of creating meaningful social change, futures studies might seem the discipline most peripheral to those interests, and the most ill-equipped to deal with the local and intimate domain of community existence. To the futurist, who laboriously spends his or her hours understanding the nuances of history and social change, and who through persistent work begins to make sense of weak signals and subtle shifts, action research might seem simply an auxiliary field, inappropriate for understanding the greater scheme. I invite the reader, however, whether they belong to one camp or the other, to let go of their respective disciplinary perspectives and see each as belonging to the other. [Introduction]
Abstract:
A central topic in economics is the existence of social preferences. Behavioural economics has approached the issue from several angles. Controlled experimental settings, surveys, and field experiments show that in a number of economic environments, people usually care about immaterial things such as fairness or the equity of allocations. Findings from experimental economics in particular have led to a large increase in theories addressing social preferences. Most (pro)social phenomena are well understood in experimental settings but very difficult to observe 'in the wild'. One criticism in this regard is that many findings are bound to the artificial environment of the computer lab or the survey method used. A further criticism is that traditional methods also fail to directly attribute the observed behaviour to the mental constructs expected to stand behind it. This thesis first examines the usefulness of sports data for testing social preference models in a field environment, thus overcoming the lab's limitations with regard to applicability to other, non-artificial, environments. The second major contribution of this research is to establish a new neuroscientific tool, the measurement of heart rate variability, for observing participants' emotional reactions in a traditional experimental setup.
Abstract:
This paper presents techniques which can lead to the diagnosis of faults in a small multi-cylinder diesel engine. Preliminary analysis of the acoustic emission (AE) signals is outlined, including time-frequency analysis and selection of the optimum frequency band. The results of applying mean field independent component analysis (MFICA) to separate the AE root mean square (RMS) signals, and the effects of changing parameter values, are also outlined. The results on separation of RMS signals show this technique has the potential to increase the probability of successfully identifying the AE events associated with the various mechanical events within the combustion process of multi-cylinder diesel engines.
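The RMS demodulation step mentioned above (turning a raw AE waveform into an RMS envelope before source separation) is simple enough to sketch. The window length and the synthetic burst signal below are assumptions for illustration; they do not come from the paper.

```python
# Minimal sketch of RMS envelope extraction from an AE-like waveform,
# using non-overlapping windows. MFICA itself is not shown; this is only
# the RMS pre-processing stage the abstract refers to.
import math

def rms_envelope(signal, window):
    """Root-mean-square value of each non-overlapping window of samples."""
    out = []
    for start in range(0, len(signal) - window + 1, window):
        chunk = signal[start:start + window]
        out.append(math.sqrt(sum(x * x for x in chunk) / window))
    return out

# Synthetic example: quiet background with one short high-energy burst,
# loosely mimicking an AE event from a combustion cycle.
sig = [0.01] * 100 + [1.0] * 20 + [0.01] * 100
env = rms_envelope(sig, window=20)  # the burst stands out as one high RMS bin
```

The envelope compresses the high-rate AE signal into a slowly varying energy profile, which is the form the separation algorithm then works on.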
Abstract:
Aims: To determine whether incorporation of patient peer supporters in a Cardiac-Diabetes Self-Management Program (Peer-CDSMP) led to greater improvement in self-efficacy, knowledge and self-management behaviour in the intervention group compared to a control group. Background: Promoting improved self-management for those with diabetes and a cardiac condition is enhanced by raising motivation and providing a model. Peer support from former patients who are able to successfully manage similar conditions could enhance patient motivation to achieve better health outcomes and provide a model of how such management can be achieved. While studies on peer support have demonstrated the potential of peers in promoting self-management, none have examined the impact on patients with two comorbidities. Methods: A randomised controlled trial was used to develop and evaluate the effectiveness of the Peer-CDSMP from August 2009 to December 2010. Thirty cardiac patients with type 2 diabetes were recruited. The study commenced in an acute hospital, with follow-up at participants’ homes in Brisbane, Australia. Results: While both the control and intervention groups had improved self-care behaviour, self-efficacy and knowledge, the improvement in knowledge was significantly greater for the intervention group. Conclusions: Significant improvement in knowledge was achieved for the intervention group. The absence of significant improvements in self-efficacy and self-care behaviour represents an inconclusive effect; further studies with larger sample sizes are recommended.
Abstract:
Carbon nanotubes (CNTs), experimentally observed for the first time twenty years ago, have triggered an unprecedented research effort on account of their astonishing structural, mechanical and electronic properties. Unfortunately, the current inability to predict the CNTs’ properties and the difficulty of controlling their position on a substrate are often limiting factors for the application of this material in actual devices. This research aims at the creation of specific methodologies for the controlled synthesis of CNTs, so that they can be effectively employed in various fields of electronics, e.g. photovoltaics. Focused Ion Beam (FIB) patterning of Si surfaces is proposed here as a means for ordering the assembly of vertically aligned CNTs. With this technique, substrates with specific nano-structured morphologies have been prepared, enabling a high degree of control over the CNTs’ position and size. On these nano-structured substrates, the growth of CNTs has been realized by chemical vapor deposition (CVD), i.e. thermal decomposition of hydrocarbon gases over a heated catalyst. The most common catalyst materials used in CVD are transition metals like Fe and Ni; however, their presence in the CNT products often causes shortcomings for electronic applications, especially those based on silicon, as the metallic impurities are incompatible with very-large-scale integration (VLSI) technology. In the present work the role of Ge dots as an alternative catalyst for CNT synthesis on Si substrates has been thoroughly assessed, revealing a close connection between the catalytic activity of this material and the CVD conditions, which can affect both the size and the morphology of the dots. Successful CNT growth from Ge dots has been obtained by CVD at temperatures ranging from 750 to 1000°C, with mixtures of acetylene and hydrogen in an argon carrier gas. The morphology of the Si surface is observed to play a crucial role in the outcome of the CNT synthesis: natural (i.e. chemical etching) and artificial (i.e. FIB patterning, nanoindentation) means of altering this morphology in a controlled way have then been explored to optimize the CNT yield. All the knowledge acquired in this study has finally been applied to synthesize CNTs on transparent conductive electrodes (indium-tin oxide, ITO, coated glasses) for the creation of a new class of anodes for organic photovoltaics. An accurate procedure has been established which guarantees a controlled inclusion of CNTs in ITO films while preserving their optical and electrical properties. Using this set of conditions, a CNT-enhanced electrode has been built, contributing to an improved power conversion efficiency of polymeric solar cells.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or on issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
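The flavour of such a design-level metric can be sketched as a simple ratio over class attributes. The data model and the metric below are assumptions invented for illustration; the thesis's actual metric definitions are not given in this abstract.

```python
# Illustrative ratio-style metric in the spirit described above: the
# fraction of high-security attributes that a design exposes through
# public accessors. Lower is better. The whole data model is hypothetical.

def insecure_exposure_ratio(classes):
    """classes: {name: {"high": set of high-security attributes,
                        "accessors": set of publicly accessible attributes}}

    A high-security attribute reachable through a public accessor counts
    as a potential flow from high- to low-security context.
    """
    high_total, exposed = 0, 0
    for cls in classes.values():
        high_total += len(cls["high"])
        exposed += len(cls["high"] & cls["accessors"])
    return exposed / high_total if high_total else 0.0

design_v1 = {
    "Account": {"high": {"pin", "balance"}, "accessors": {"balance", "owner"}},
    "Session": {"high": {"token"}, "accessors": {"token"}},
}
ratio = insecure_exposure_ratio(design_v1)  # 2 of 3 high attrs exposed
```

Because the metric is a pure function of design artifacts, two revisions of the same system can be scored and compared directly, which is exactly the comparison capability the abstract emphasises.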
Abstract:
We report a method for controlling the exposed facets, and hence the dimensionality and shape, of ZnO nanocrystals using a non-hydrolytic aminolysis synthesis route. The effects of changes in reaction conditions on ZnO formation were investigated and possible self-assembly mechanisms proposed. The crystal facet growth, and hence the morphologies of the ZnO nanocrystals, were controlled by varying the reaction temperature and the reactant ratio. Four distinct ZnO nanocrystal types were produced (nanocones, nanobullets, nanorods and nanoplates). The relative photocatalytic activities of the exposed facets of these ZnO nanostructures were also examined, showing that the activities clearly depended on the reactivity of the exposed crystal facets, in the order {101̄1} >> {0001}, {101̄0}.
Abstract:
Medical industries have brought Information Technology (IT) into their systems for both patients and medical staff, owing to the numerous benefits of IT we experience at present. Moreover, the mobile healthcare (M-health) system has been developed as the first step towards a Ubiquitous Health Environment (UHE). With mobility and multiple functions, M-health systems will be able to provide more efficient and varied services for both doctors and patients. However, because mobile signals are invisible, hackers have easier access to hospital networks than in wired network systems. This may result in security incidents unless security protocols are well implemented. In this paper, user authentication and authorization procedures are applied as a featured component at each level of M-health systems in the hospital environment. Accordingly, the M-health system in the hospital will meet the optimal requirements as a countermeasure to its vulnerabilities.
Abstract:
Sequence data often contain competing signals that are detected by network programs or Lento plots. Such data can be formed by generating sequences on more than one tree and combining the results (a mixture model). We report that with such mixture models, the estimates of edge (branch) lengths from maximum likelihood (ML) methods that assume a single tree are biased. Based on the observed number of competing signals in real data, such a bias of ML is expected to occur frequently. Because network methods can recover competing signals more accurately, there is a need for ML methods that allow a network. A fundamental problem is that mixture models can have more parameters than can be recovered from the data, so some mixtures are not, in principle, identifiable. We recommend that network programs be incorporated into best-practice analysis, along with ML and Bayesian trees.
Abstract:
This paper discusses commonly encountered diesel engine problems and the underlying combustion related faults. Also discussed are the methods used in previous studies to simulate diesel engine faults and the initial results of an experimental simulation of a common combustion related diesel engine fault, namely diesel engine misfire. This experimental fault simulation represents the first step towards a comprehensive investigation and analysis into the characteristics of acoustic emission signals arising from combustion related diesel engine faults. Data corresponding to different engine running conditions was captured using in-cylinder pressure, vibration and acoustic emission transducers along with both crank-angle encoder and top-dead centre signals. Using these signals, it was possible to characterise the diesel engine in-cylinder pressure profiles and the effect of different combustion conditions on both vibration and acoustic emission signals.
Abstract:
The ability to perform autonomous emergency (forced) landings is one of the key technology enablers identified for UAS. This paper presents the flight test results of forced landings involving a UAS in a controlled environment, conducted to ascertain the performance of previously developed (and published) path planning and guidance algorithms. These novel 3-D nonlinear algorithms have been designed to control the vehicle in both the lateral and longitudinal planes of motion, and had hitherto been verified only in simulation. A modified Boomerang 60 RC aircraft was used as the flight test platform, with associated onboard and ground support equipment sourced off-the-shelf or developed in-house at the Australian Research Centre for Aerospace Automation (ARCAA). HITL simulations conducted prior to the flight tests displayed good landing performance; however, due to certain identified interfacing errors, the flight results differed from those obtained in simulation. This paper details the lessons learnt and presents a plausible solution for the way forward.