982 results for Reliability (Engineering)
Abstract:
Engineering design processes are necessary to attain the requisite standards of integrity for high-assurance safety-related systems. Additionally, human factors design initiatives can provide critical insights that parameterise their development. Unfortunately, the popular perception of human factors as a "forced marriage" between engineering and psychology often provokes views where the 'human factor' is perceived as a threat to systems design. Some popular performance-based standards for developing safety-related systems advocate identifying and managing human factors throughout the system lifecycle. However, they also tend to fall short in their guidance on the application of human factors methods and tools, let alone how the outputs generated can be integrated into various stages of the design process. This case study describes a project that converged engineering with human factors to develop a safety argument for new low-cost railway level crossing technology for system-wide implementation in Australia. The paper joins the perspectives of a software engineer and a cognitive psychologist and their involvement in the project over two years of collaborative work to develop a safety argument for low-cost level crossing technology. Safety and reliability requirements were informed by applying human factors analytical tools that supported the evaluation and quantification of human reliability where users interfaced with the technology. The project team was confronted with significant challenges in cross-disciplinary engagement, particularly the complexities of dealing with incongruences in disciplinary language. They were also encouraged to think 'outside the box' about how users of a system interpreted system states and behaviour. Importantly, some of these states, while considered safe within the boundary of the constituent systems that implemented safety-related functions, could actually lead users to engage in deviant behaviour. Psychology explained how user compliance could be eroded to levels that effectively undermined the risk reduction afforded by the systems. By intuitively linking the engineering and psychology disciplines, overall safety performance was improved through technical requirements and design decisions that minimised the system states and behaviours that led to user deviancy. As a commentary on the utility of transdisciplinary collaboration for technical specification, the processes used to bridge the two disciplines are conceptualised in a graphical model.
Abstract:
Social engineering (SE) is now considered the greatest security threat to people and organizations. For as long as human beings have existed, fraudulent and deceptive people have used social engineering tricks and tactics to manipulate victims into obeying them. A number of social engineering techniques are used in information technology to compromise security defences and attack people or organizations, such as phishing, identity theft, spamming, impersonation, and spying. Recently, researchers have suggested that social networking sites (SNSs) are the most common source of, and the best breeding grounds for, exploiting the vulnerabilities of people and launching a variety of social engineering based attacks. However, the literature shows a lack of information about what types of social engineering threats exist on SNSs. This study is part of a project that attempts to predict a person's vulnerability to SE based on demographic factors. In this paper, we demonstrate the different types of social engineering based attacks that exist on SNSs, the purposes of these attacks, and the reasons why people fell (or did not fall) for these attacks, based on users' opinions. A qualitative questionnaire-based survey was conducted to collect and analyse people's experiences with social engineering tricks, deceptions, or attacks on SNSs.
Abstract:
Social networking sites (SNSs), with their large number of users and large information base, seem to be the perfect breeding ground for exploiting the vulnerabilities of people, who are considered the weakest link in security. Deceiving, persuading, or influencing people to provide information or to perform an action that will benefit the attacker is known as "social engineering." Fraudulent and deceptive people use social engineering traps and tactics through SNSs to trick users into obeying them, accepting threats, and falling victim to various crimes such as phishing, sexual abuse, financial abuse, identity theft, and physical crime. Although organizations, researchers, and practitioners recognize the serious risks of social engineering, there is a severe lack of understanding and control of such threats. This may be partly due to the complexity of human behaviors in approaching, accepting, and failing to recognize social engineering tricks. This research aims to investigate the impact of source characteristics on users' susceptibility to social engineering victimization on SNSs, particularly Facebook. Using the grounded theory method, we develop a model that explains which source characteristics influence Facebook users to judge an attacker as credible, and how.
Abstract:
A secure protocol for electronic, sealed-bid, single-item auctions is presented. The protocol caters to both first-price and second-price (Vickrey) auctions and provides full price flexibility. Both computational and communication costs are linear in the number of bidders, and the protocol uses only standard cryptographic primitives. The protocol strictly divides knowledge of the bidders' identities and their actual bids between, respectively, a registration authority and an auctioneer, who are assumed not to collude but may be separately corrupt. This assures strong bidder anonymity, though only weak bid privacy. The protocol is structured in two phases, each involving only off-line communication. Registration, which requires the use of the public key infrastructure, is simultaneous with hash-sealed bid commitment and generates a receipt for the bidder containing a pseudonym. This phase is followed by encrypted bid submission. Both phases involve the registration authority acting as a communication conduit, but the actual message size is quite small. It is argued that this structure guarantees non-repudiation by both the winner and the auctioneer. Second-price correctness is enforced either by observing the absence of registration of the claimed second-price bid or, where that bid is registered but lower than the actual second price, by relying on the cooperation of the second-price bidder, presumably motivated through self-interest. The use of the registration authority in other contexts is also considered, with a view to developing an architecture for efficient secure multiparty transactions.
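To make the hash-sealed bid commitment idea concrete, the following is a minimal Python sketch assuming SHA-256 over a random nonce concatenated with the bid; the paper's exact construction, encodings and parameters are not specified here, so treat this purely as an illustration.

    import hashlib
    import secrets

    def commit_bid(bid_cents):
        """Seal a bid as SHA-256(nonce || bid); only the digest is registered."""
        nonce = secrets.token_bytes(16)
        digest = hashlib.sha256(nonce + bid_cents.to_bytes(8, "big")).hexdigest()
        return digest, nonce

    def open_bid(digest, nonce, bid_cents):
        """Check that an opened bid matches the previously sealed commitment."""
        return hashlib.sha256(nonce + bid_cents.to_bytes(8, "big")).hexdigest() == digest

    digest, nonce = commit_bid(15_000)        # digest lodged with the registration authority
    assert open_bid(digest, nonce, 15_000)    # bid revealed and verified at submission time

The binding property of the hash is what supports the non-repudiation argument: once the digest is registered, the bidder cannot later claim a different bid value.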
Abstract:
This article presents field applications and validations of the controlled Monte Carlo data generation scheme. This scheme was previously derived to help the Mahalanobis squared distance–based damage identification method cope with data-shortage problems, which often cause inadequate data multinormality and unreliable identification outcomes. To do so, real vibration datasets from two actual civil engineering structures with such data (and identification) problems are selected as the test objects, which are then shown to be in need of enhancement to consolidate their conditions. By utilizing the robust probability measures of the data condition indices in controlled Monte Carlo data generation and a statistical sensitivity analysis of the Mahalanobis squared distance computational system, well-conditioned synthetic data generated by an optimal controlled Monte Carlo data generation configuration can be evaluated without bias against data generated by other set-ups and against the original data. The analysis results reconfirm that controlled Monte Carlo data generation is able to overcome the shortage of observations, improve the data multinormality and enhance the reliability of the Mahalanobis squared distance–based damage identification method, particularly with respect to false-positive errors. The results also highlight the dynamic structure of controlled Monte Carlo data generation, which makes this scheme well adapted to any type of input data with any (original) distributional condition.
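For readers unfamiliar with the underlying statistic, the sketch below computes the Mahalanobis squared distance of a new observation from a baseline (healthy-state) dataset, which is the quantity the damage identification method thresholds. The controlled Monte Carlo data generation scheme itself is not reproduced; the baseline matrix and test vector are synthetic placeholders.

    import numpy as np

    def mahalanobis_sq(x, baseline):
        """Mahalanobis squared distance of observation x from a baseline data
        matrix (rows = observations, columns = damage-sensitive features)."""
        mu = baseline.mean(axis=0)
        cov = np.cov(baseline, rowvar=False)
        diff = x - mu
        return float(diff @ np.linalg.solve(cov, diff))

    rng = np.random.default_rng(0)
    baseline = rng.normal(size=(200, 5))      # healthy-state feature vectors
    observation = rng.normal(size=5) + 0.8    # possibly damaged-state observation
    print(mahalanobis_sq(observation, baseline))

When the baseline is short or poorly conditioned, the covariance estimate above becomes unstable, which is exactly the situation the controlled Monte Carlo data generation scheme is designed to remedy.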
Abstract:
Atmospheric-pressure plasma processing techniques emerge as efficient and convenient tools to engineer a variety of nanomaterials for advanced applications in nanoscience and nanotechnology. This work presents different methods, including using a quasi-sinusoidal high-voltage generator, a radio-frequency power supply, and a uni-polar pulse generator, to generate atmospheric-pressure plasmas in the jet or dielectric barrier discharge configurations. The applicability of the atmospheric-pressure plasma is exemplified by the surface modification of nanoparticles for polymeric nanocomposites. Dielectric measurements reveal that representative nanocomposites with plasma modified nanoparticles exhibit notably higher dielectric breakdown strength and a significantly extended lifetime.
Abstract:
The primary goal in hard tissue engineering is to combine high-performance scaffold materials with living cells to develop biologically active substitutes that can restore tissue functions. This requires relevant knowledge in multidisciplinary fields encompassing chemical engineering, material science, chemistry, biology and nanotechnology. Here we present an overview on the recent progress of how two representative carbon nanostructures, namely, carbon nanotubes and graphene, aid and advance the research in hard tissue engineering. The article focuses on the advantages and challenges of integrating these carbon nanostructures into functional scaffolds for repairing and regenerative purposes. It includes, but is not limited to, the critical physico-chemical properties of carbon nanomaterials for enhanced cell interactions such as adhesion, morphogenesis, proliferation and differentiation; the novel designs of two- and three-dimensional nanostructured scaffolds; multifunctional hybrid materials; and the biocompatible aspects of carbon nanotubes and graphene. Perspectives on the future research directions are also given, in an attempt to shed light on the innovative and rational design of more effective biomedical devices in hard tissue engineering.
Abstract:
Composite polymer insulators provide many advantages over traditional porcelain insulators and are increasingly being used at both transmission and distribution levels. In the present paper, an epoxy resin/silica nanocomposite dielectric material (NDM) structure is proposed and fabricated. Hydrophobic fumed silica is incorporated into the epoxy resin matrix, and acetone is adopted as a media agent to effectively achieve homogeneous dispersion of the nano-scale silica filler. The acetone also acts as a diluent to reduce viscosity before the curing phase of the epoxy resin and allows bubbles to escape rather than becoming trapped. Partial discharge (PD) and surface aging tests show that the surface discharge inception of the proposed NDM is higher than that of the unfilled counterpart, and better PD resistivity was observed in the negative half cycle of the applied AC voltage. Results of the surface aging test indicate that surface discharge activity is retarded over the duration of the test, whereas surface discharge on the unfilled sample developed in the opposite direction. Therefore, the proposed NDM could provide better safety and reliability and lower maintenance costs in industrial applications compared with conventional unfilled epoxy resin.
Abstract:
This project identified the lack of data analysis and travel time prediction on arterials as the main gap in the current literature. To address this, it first investigated the reliability of data gathered by Bluetooth technology as a new cost-effective method for data collection on arterial roads. Then, by considering the similarity among the variety of daily travel times on different arterial routes, it created a SARIMA model to predict future travel time values. Based on the research outcomes, the created model can be applied to online short-term travel time prediction in the future.
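As a rough illustration of the modelling step, the sketch below fits a SARIMA model to a synthetic 15-minute travel-time series with statsmodels. The seasonal period of 96 intervals per day and the (1,0,1)(1,1,1,96) orders are assumptions for illustration only, not the orders reported by the project, and the data are fabricated rather than Bluetooth-derived.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Synthetic 15-minute travel-time series (seconds) over two weeks; real data
    # would come from Bluetooth MAC matching on the arterial route.
    idx = pd.date_range("2024-01-01", periods=96 * 14, freq="15min")
    rng = np.random.default_rng(1)
    series = pd.Series(
        300 + 60 * np.sin(2 * np.pi * idx.hour / 24) + rng.normal(scale=10, size=len(idx)),
        index=idx,
    )

    # Seasonal period of 96 = one day of 15-minute intervals.
    model = SARIMAX(series, order=(1, 0, 1), seasonal_order=(1, 1, 1, 96))
    fit = model.fit(disp=False)
    print(fit.forecast(steps=8))   # next two hours of predicted travel times

In an online setting, the fitted model would be re-estimated or updated as new Bluetooth travel-time observations arrive.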
Abstract:
This contribution provides arguments for why, and in which cases, low-temperature plasmas should be used for nanoscale surface and interface engineering, and discusses several advantages offered by plasma-based processes and tools compared to neutral-gas fabrication routes. Relevant processes involve nanotexturing (etching, sputtering, nanostructuring, pre-patterning, etc.) and composition/structure control at the nanoscale (phases, layering, elemental presence, doping, functionalization, etc.), as well as complex combinations thereof. A case study on a p-Si/n-Si solar cell junction exemplifies the successful use of inductively coupled plasma-assisted RF magnetron sputtering for nanoscale fabrication of a bi-layered stack of unconventionally doped highly-crystalline silicon nanofilms with engineered high-quality interfaces.
Abstract:
A new source of low-frequency (0.46 MHz) inductively coupled plasmas, sustained by an internal planar "unidirectional" RF current driven through a specially designed internal antenna configuration, has been developed. Experimental results of the investigation of the optical and global argon plasma parameters, obtained with optical and Langmuir probe diagnostics, are presented. It is shown that the spatial profiles of the electron density, the effective electron temperature and the plasma potential feature a high degree of radial and axial uniformity compared with conventional sources of inductively coupled plasmas with external flat coil configurations. The measurements also reveal a weak azimuthal dependence of the global plasma parameters at low values of the input RF power, which was earlier predicted theoretically. This azimuthal dependence vanishes at high input RF powers. Moreover, under certain conditions, the plasma becomes unstable due to spontaneous transitions between the low-density (electrostatic, E) and high-density (electromagnetic, H) operating modes. The excellent uniformity of the high-density plasmas makes the plasma reactor promising for various plasma processing applications and surface engineering.
Abstract:
Operators of hydroelectric power stations sometimes call upon engineers to modify existing hydroelectric turbines, usually several decades old, for improved maintainability and reliability. One common modification is the hybridisation of plain thrust pads to allow hydrostatic operation, reducing the risk of bearing wipe at low speed (virtually all new installations benefit from this feature). Such a modification is not a difficult undertaking; however, numerous factors need to be considered in order to maximize bearing performance. One factor that stands out above the others is whether the thrust bearing should be designed to lift the turbine immediately from the standing condition, which presents an interesting challenge: the recess has to have a sufficiently large area for the supply pressure to be able to overcome the dead weight of the turbine. If the combination of groove area and pressure is insufficient, then lifting is neither immediate nor guaranteed. This need not be a significant problem, as such bearings have exhibited adequate performance even in the absence of a hydrostatic lubricant supply. A case study is presented whereby relatively large hydrostatic recesses are added to the pads of a thrust bearing. It is demonstrated with the aid of simple numerical modelling that the impact of the recess relative to the original pad is small under normal operating conditions. Most surprising, however, is that significant reductions in average oil film temperature and power dissipation are predicted.
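The immediate-lift condition amounts to a simple static force balance: the supply pressure acting over the total recess area must exceed the dead weight of the turbine. A hedged Python sketch of that check is given below; the numbers are purely illustrative and are not taken from the case study, and the check ignores land flow, film build-up and the pad pressure profile.

    def immediate_lift_possible(supply_pressure_pa, recess_area_m2, dead_weight_n, n_pads):
        """Crude static check: hydrostatic force over all recesses vs. dead weight.
        Neglects land flow restriction and the pressure profile across each pad."""
        lift_force_n = supply_pressure_pa * recess_area_m2 * n_pads
        return lift_force_n >= dead_weight_n

    # Illustrative figures only (not from the case study).
    print(immediate_lift_possible(
        supply_pressure_pa=10e6,   # 10 MPa jacking-oil supply
        recess_area_m2=0.02,       # recess area per pad
        dead_weight_n=1.5e6,       # turbine and runner dead weight
        n_pads=8,
    ))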
Abstract:
The preventive maintenance of traction equipment for Very High Speed Trains (VHST) is nowadays becoming very expensive owing to the high complexity and quality of these components, which require high reliability. An efficient maintenance approach such as Condition-Based Maintenance (CBM) should be implemented to reduce costs. For this purpose, an experimental full-scale test rig for the CBM of VHST traction equipment has been designed to investigate in detail failures in the main mechanical components of the system, i.e. the motor, bearings and gearbox. The paper describes the main characteristics of this unique test rig, which is able to reproduce accurately the train operating conditions, including the relative movements of the motor, the gearbox and the wheel axle. The gearbox, bearing seats and motor are equipped with accelerometers, thermocouples, a torque meter and other sensors in different positions. The testing results give important information about the most suitable sensor position and type to install for each component and show the effectiveness of the signal analysis techniques used to identify faults of the gearbox and motor bearings.
Abstract:
Vehicular Ad-hoc Networks (VANETs) can make roads safer, cleaner, and smarter. They can offer a wide range of services, both safety and non-safety related. Many safety-related VANET applications are real-time and mission critical, and would require strict guarantees of security and reliability. Even non-safety-related multimedia applications, which will play an important role in the future, will require security support. The lack of such security and privacy in VANETs is one of the key hindrances to their widespread implementation. An insecure and unreliable VANET can be more dangerous than a system without VANET support. It is therefore essential to make sure that "life-critical safety" information is secure enough to rely on. Securing VANETs while appropriately protecting the privacy of drivers or vehicle owners is a very challenging task. In this work we summarize the attacks, the corresponding security requirements and the challenges in VANETs. We also present the most popular generic security policies, which are based on prevention as well as detection methods. Many VANET applications require system-wide security support rather than security at individual layers of the VANET protocol stack. In this work we review the existing literature from the perspective of a holistic approach to security. Finally, we provide some possible future directions to achieve system-wide as well as privacy-friendly security in VANETs.