881 results for "data generation"
Abstract:
The IEEE Subcommittee on the Application of Probability Methods (APM) published the IEEE Reliability Test System (RTS) [1] in 1979. This system provides a consistent and generally acceptable set of data that can be used both in generation capacity and in composite system reliability evaluation [2,3]. The test system provides a basis for comparing results obtained by different people using different methods. Prior to its publication, there was no general agreement on either the system or the data that should be used to demonstrate or test the various techniques developed for reliability studies. The development of reliability assessment techniques and programs is highly dependent on the intent behind the development, as the experience of one power utility with its system may be quite different from that of another utility. The development and utilization of a reliability program are therefore greatly influenced by the experience of a utility and the intent of the system manager, planner and designer conducting the reliability studies. The IEEE-RTS has proved extremely valuable in highlighting and comparing the capabilities (or limitations) of programs used in reliability studies, the differences in the perceptions of various power utilities and the differences in solution techniques. The IEEE-RTS contains a reasonably large power network, which can make it difficult to use for initial studies in an educational environment.
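Generation capacity reliability evaluation of the kind the RTS supports typically starts from a capacity outage probability table (COPT) built by convolving the outage probabilities of independent two-state units, from which indices such as the loss-of-load probability follow. The sketch below illustrates that construction; the unit capacities, forced outage rates and load are illustrative toy values, not IEEE-RTS data.

```python
def build_copt(units):
    """Convolve independent two-state units into a capacity outage
    probability table: dict mapping outage capacity (MW) -> probability.

    units: list of (capacity_mw, forced_outage_rate) tuples.
    """
    table = {0: 1.0}
    for cap, q in units:
        new = {}
        for out, p in table.items():
            new[out] = new.get(out, 0.0) + p * (1 - q)            # unit available
            new[out + cap] = new.get(out + cap, 0.0) + p * q      # unit on outage
        table = new
    return table

def lolp(units, load_mw):
    """Loss-of-load probability: P(available capacity < load)."""
    total = sum(cap for cap, _ in units)
    table = build_copt(units)
    return sum(p for out, p in table.items() if total - out < load_mw)

# Three hypothetical units, not RTS data.
units = [(50, 0.02), (50, 0.02), (100, 0.04)]
print(lolp(units, 150))  # roughly 0.0404 for these toy units
```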
Abstract:
The IEEE Reliability Test System (RTS) developed by the Application of Probability Methods Subcommittee has been used to compare and test a wide range of generating capacity and composite system evaluation techniques and subsequent digital computer programs. A basic reliability test system is presented, which has evolved from the reliability education and research programs conducted by the Power System Research Group at the University of Saskatchewan. The basic system data necessary for adequacy evaluation at the generation and composite generation and transmission system levels are presented, together with the fundamental data required to conduct reliability-cost/reliability-worth evaluation.
Abstract:
Buildings are key mediators between human activity and the environment around them, but details of energy usage and activity in buildings are often poorly communicated and understood. ECOS is an Eco-Visualization project that aims to contextualize the energy generation and consumption of a green building in a variety of different climates. The ECOS project is being developed for a large public interactive space installed in the new Science and Engineering Centre of the Queensland University of Technology that is dedicated to delivering interactive science education content to the public. This paper focuses on how design can develop ICT solutions from large data sets to create meaningful engagement with environmental data.
Abstract:
This item provides supplementary materials for the paper mentioned in the title, specifically a range of organisms used in the study. The full abstract for the main paper is as follows: Next Generation Sequencing (NGS) technologies have revolutionised molecular biology, allowing clinical sequencing to become a matter of routine. NGS data sets consist of short sequence reads obtained from the machine, given context and meaning through downstream assembly and annotation. For these techniques to operate successfully, the collected reads must be consistent with the assumed species or species group, and not corrupted in some way. The common bacterium Staphylococcus aureus may cause severe and life-threatening infections in humans, with some strains exhibiting antibiotic resistance. In this paper, we apply an SVM classifier to the important problem of distinguishing S. aureus sequencing projects from alternative pathogens, including closely related Staphylococci. Using a sequence k-mer representation, we achieve precision and recall above 95%, implicating features with important functional associations.
Abstract:
An online survey was conducted to investigate the views and experiences of Australian traffic and transport professionals about practical problems and issues in terms of trip generation and trip chaining for use in Transport Impact Assessment (TIA). Findings from this survey revealed that there is a shortage of appropriate data related to trip generation estimation for use in TIAs in Australia. Establishing a National Trip Generation Database (NTGD), with a centralised organisation responsible for collecting and publishing trip generation data based on federal and state governments' contributions, was found to be the most accepted solution for resolving this shortage, as well as for providing national standards and guidelines associated with trip generation definitions, data collection methodology, and the TIA preparation process based on updated research. Finally, the study recognised the importance of trip chaining effects on trip generation estimation and identified the most prevalent land uses subject to trip chaining for TIA purposes.
Abstract:
Next Generation Sequencing (NGS) has revolutionised molecular biology, allowing routine clinical sequencing. NGS data consist of short sequence reads, given context through downstream assembly and annotation, a process requiring reads consistent with the assumed species or species group. The common bacterium Staphylococcus aureus may cause severe and life-threatening infections in humans, with some strains exhibiting antibiotic resistance. Here we apply an SVM classifier to the important problem of distinguishing S. aureus sequencing projects from other pathogens, including closely related Staphylococci. Using a sequence k-mer representation, we achieve precision and recall above 95%, implicating features with important functional associations.
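The k-mer representation named in this abstract can be sketched as a normalised frequency vector over all DNA k-mers; in the paper such vectors feed an SVM classifier, which is not reproduced here. The sequence and the choice of k below are purely illustrative.

```python
from collections import Counter
from itertools import product

def kmer_vector(seq, k=3):
    """Normalised k-mer frequency vector over the DNA alphabet ACGT.

    Returns a list of length 4**k giving the fraction of windows in
    `seq` matching each k-mer, in lexicographic k-mer order.
    """
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(1, len(seq) - k + 1)
    return [counts[m] / total for m in kmers]

# Toy read; real inputs would be NGS reads and these vectors
# would be passed to an SVM (e.g. an RBF- or linear-kernel classifier).
vec = kmer_vector("ACGTACGTACGT", k=2)
```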
Abstract:
Whilst alcohol is a common feature of many social gatherings, there are numerous immediate and long-term health and social harms associated with its abuse. Alcohol consumption is the world's third largest risk factor for disease and disability, with almost 4% of all deaths worldwide attributed to alcohol. Not surprisingly, alcohol use and binge drinking by young people are of particular concern, with Australian data reporting that 39% of young people (18-19 yrs) admitted drinking at least weekly and 32% drank to levels that put them at risk of alcohol-related harm. The growing market penetration and connectivity of smartphones may present an opportunity for innovation in promoting health-related self-management of substance use. However, little is known about how best to harness and optimise this technology for health-related intervention and behaviour change. This paper explores the utility and interface of smartphone technology as a health intervention tool to monitor and moderate alcohol use. A review of the psychological health applications of this technology will be presented, along with the findings of a series of focus groups, surveys and behavioural field trials of several drink-monitoring applications. Qualitative and quantitative data will be presented on the perceptions, preferences and utility of the design, usability and functionality of smartphone apps to monitor and moderate alcohol use. How these findings have shaped the development and evolution of the OnTrack app will be specifically discussed, along with future directions and applications of this technology in health intervention, prevention and promotion.
Abstract:
The cyclic voltammetry behaviour of gold in aqueous media is often regarded in very simple terms as a combination of two distinct processes, double layer charging/discharging and monolayer oxide formation/removal. This view is questioned here on the basis of both the present results and earlier independent data by other authors. It was demonstrated in the present case that either severe cathodization or thermal pretreatment of polycrystalline gold in acid solution resulted in the appearance of substantial Faradaic responses in the double layer region. Such anomalous behaviour, as outlined recently also for other metals, is rationalized in terms of the presence of active metal atoms (which undergo premonolayer oxidation) at the electrode surface. Such behaviour, which is also assumed to correspond to that of active sites on conventional gold surfaces, is assumed to be of vital importance in electrocatalysis; in many instances the latter process is also quite marked in the double layer region.
Abstract:
This paper presents a model for the generation of a MAC tag using a stream cipher. The input message is used indirectly to control segments of the keystream that form the MAC tag. Several recent proposals can be considered as instances of this general model, as they all perform message accumulation in this way. However, they use slightly different processes in the message preparation and finalisation phases. We examine the security of this model for different options and against different types of attack, and conclude that the indirect injection model can be used to generate MAC tags securely for certain combinations of options. Careful consideration is required at the design stage to avoid combinations of options that result in susceptibility to forgery attacks. Additionally, some implementations may be vulnerable to side-channel attacks if used in Authenticated Encryption (AE) algorithms. We give design recommendations to provide resistance to these attacks for proposals following this model.
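The indirect-injection idea described in this abstract can be illustrated with a deliberately insecure toy: each message bit selects one of two per-position keystream segments, and the selected segments are accumulated (here by XOR) into the tag. The SHA-256-based keystream, the segment layout and all parameters below are stand-in assumptions for illustration only, not any cipher or proposal analysed in the paper.

```python
import hashlib

def keystream(key, nbytes):
    """Toy deterministic keystream via iterated SHA-256.
    Illustration only: NOT a real stream cipher."""
    out, block = b"", key
    while len(out) < nbytes:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:nbytes]

def indirect_mac(key, message, tag_len=16):
    """Toy 'indirect injection' MAC: message bit j selects one of two
    keystream segments at position j; selected segments are XOR-accumulated
    into the tag. Shows the accumulation structure only; it is NOT secure."""
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    ks = keystream(key, 2 * len(bits) * tag_len)
    tag = bytearray(tag_len)
    for j, b in enumerate(bits):
        seg = ks[(2 * j + b) * tag_len:(2 * j + b + 1) * tag_len]
        for i in range(tag_len):
            tag[i] ^= seg[i]
    return bytes(tag)
```

A real design following this model would also need the message preparation and finalisation phases the paper discusses; omitting them is exactly the kind of choice the authors warn can open forgery attacks.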
Abstract:
Distributed generation (DG) resources are commonly used in electric power systems to obtain, among other DG benefits, minimum line losses in radial distribution systems. Studies have shown the importance of appropriate selection of the location and size of DGs. This paper proposes an analytical method for solving the optimal distributed generation placement (ODGP) problem to minimize line losses in radial distribution systems using a loss sensitivity factor (LSF) based on the bus-injection to branch-current (BIBC) matrix. The proposed method is formulated and tested on 12- and 34-bus radial distribution systems. The classical grid search algorithm based on successive load flows is employed to validate the results. The main advantages of the proposed method over conventional methods are its robustness and the fact that it does not require calculating and inverting large admittance or Jacobian matrices. Consequently, the simulation time and the amount of computer memory required for processing data, especially for large systems, decrease.
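The grid-search validation mentioned in this abstract can be sketched on a toy radial feeder: try the DG at each candidate bus and keep the placement with the lowest total I²R losses. The LSF/BIBC analytical method itself is not reproduced; the bus loads, DG size, uniform section resistance and flat-voltage loss model below are illustrative assumptions.

```python
def feeder_losses(loads, dg_bus, dg_out, r=1.0):
    """Relative I^2*R losses on a toy radial feeder (arbitrary units).

    Section i carries the net power of buses i..end; with voltage assumed
    flat, losses are proportional to the sum of squared section flows.
    """
    net = list(loads)
    net[dg_bus] -= dg_out          # DG injection reduces net load at its bus
    return sum(r * sum(net[i:]) ** 2 for i in range(len(net)))

def best_placement(loads, dg_out):
    """Classical grid search over candidate buses: evaluate every
    placement and return the bus giving minimum losses."""
    return min(range(len(loads)), key=lambda b: feeder_losses(loads, b, dg_out))

# Hypothetical 4-bus feeder (loads in arbitrary power units).
loads = [40.0, 30.0, 20.0, 10.0]
bus = best_placement(loads, 30.0)
```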
Abstract:
Next Generation Sequencing (NGS) has revolutionised molecular biology, resulting in an explosion of data sets and an increasing role in clinical practice. Such applications necessarily require rapid identification of the organism as a prelude to annotation and further analysis. NGS data consist of a substantial number of short sequence reads, given context through downstream assembly and annotation, a process requiring reads consistent with the assumed species or species group. Highly accurate results have been obtained for restricted sets using SVM classifiers, but such methods are difficult to parallelise and success depends on careful attention to feature selection. This work examines the problem at very large scale, using a mix of synthetic and real data with a view to determining the overall structure of the problem and the effectiveness of parallel ensembles of simpler classifiers (principally random forests) in addressing the challenges of large scale genomics.
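The parallel-ensemble approach this abstract describes ultimately reduces to combining the labels of many simpler classifiers, typically by majority vote. A minimal sketch of that aggregation step follows; the classifier names and votes are hypothetical, and the random forests themselves are not reproduced.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the labels emitted by an ensemble of classifiers for one
    read by simple majority vote (ties resolved by first-seen order)."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical per-read votes from five classifiers run in parallel:
votes = ["S_aureus", "S_aureus", "other", "S_aureus", "other"]
label = majority_vote(votes)
```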
Abstract:
This special issue of Networking Science focuses on the Next Generation Network (NGN), which enables the deployment of access-independent services over converged fixed and mobile networks. NGN is a packet-based network and uses the Internet Protocol (IP) to transport the various types of traffic (voice, video, data and signalling). NGN facilitates easy adoption of distributed computing applications by providing high-speed connectivity in a converged networked environment. It also makes end user devices and applications highly intelligent and efficient by empowering them with programmability and remote configuration options. However, there are a number of important challenges in provisioning next generation network technologies in a converged communication environment. Some preliminary challenges relate to QoS, switching and routing, management and control, and security, which must be addressed urgently. The consideration of architectural issues in the design and provision of secure services for NGN deserves special attention and hence is the main theme of this special issue.
Abstract:
The dicoordinated borinium ion, dihydroxyborinium, B(OH)(2)(+) is generated from methyl boronic acid CH3B(OH)(2) by dissociative electron ionization and its connectivity confirmed by collisional activation. Neutralization-reionization (NR) experiments on this ion indicate that the neutral B(OH)(2) radical is a viable species in the gas phase. Both vertical neutralization of B(OH)(2)(+) and reionization of B(OH)(2) in the NR experiment are, however, associated with particularly unfavorable Franck-Condon factors. The differences in adiabatic and vertical electron transfer behavior can be traced back to a particular pi stabilization of the cationic species compared to the sp(2)-type neutral radical. Thermochemical data on several neutral and cationic boron compounds are presented based on calculations performed at the G2 level of theory.
Abstract:
The results of comprehensive experimental studies of the operation, stability, and plasma parameters of the low-frequency (0.46 MHz) inductively coupled plasmas sustained by the internal oscillating rf current are reported. The rf plasma is generated by using a custom-designed configuration of the internal rf coil that comprises two perpendicular sets of eight currents in each direction. Various diagnostic tools, such as magnetic probes, optical emission spectroscopy, and an rf-compensated Langmuir probe, were used to investigate the electromagnetic, optical, and global properties of the argon plasma over wide ranges of the applied rf power and gas feedstock pressure. It is found that the uniformity of the electromagnetic field inside the plasma reactor is improved as compared to the conventional sources of inductively coupled plasmas with the external flat coil configuration. A reasonable agreement between the experimental data and the computed electromagnetic field topography inside the chamber is reported. The Langmuir probe measurements reveal that the spatial profiles of the electron density, the effective electron temperature, the plasma potential, and the electron energy distribution/probability functions feature a high degree of radial and axial uniformity and a weak azimuthal dependence, which is consistent with the earlier theoretical predictions. As the input rf power increases, the azimuthal dependence of the global plasma parameters vanishes. The obtained results demonstrate that by introducing the internal oscillating rf currents one can noticeably improve the uniformity of the electromagnetic field topography, the rf power deposition, and the plasma density in the reactor.