933 results for Computer Generated Proofs


Relevance: 20.00%

Publisher:

Abstract:

We propose a keyless and lightweight message transformation scheme based on combinatorial design theory for the confidentiality of a message transmitted in multiple parts through a network with multiple independent paths, or for data stored in multiple parts by a set of independent storage services such as cloud providers. Our combinatorial scheme disperses a message into v output parts so that (k-1) or fewer parts do not reveal any information about any message part, and the message can only be recovered by the party who possesses all v output parts. The combinatorial scheme generates an XOR transformation structure to disperse the message into v output parts; inversion is done by applying the same XOR transformation structure to the output parts. The structure is generated using generalized quadrangles from design theory, which represent symmetric point and line incidence relations in a projective plane. We randomize our solution by adding a random salt value and dispersing it together with the message. We show that a passive adversary with the capability of accessing (k-1) communication links or storage services has no advantage, so that the scheme is indistinguishable under adaptive chosen ciphertext attack (IND-CCA2).
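
As an illustration of the all-parts-needed XOR idea only (a plain (v, v) XOR split with a prepended random salt, not the authors' generalized-quadrangle construction; all names are illustrative), a minimal sketch:

```python
import os

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def disperse(message: bytes, v: int) -> list[bytes]:
    """Split a salted message into v parts; any v-1 parts reveal nothing."""
    salted = os.urandom(16) + message                 # random salt, then payload
    parts = [os.urandom(len(salted)) for _ in range(v - 1)]
    last = salted
    for p in parts:                                   # last = salted XOR p1 XOR ... XOR p_{v-1}
        last = _xor(last, p)
    return parts + [last]

def recover(parts: list[bytes]) -> bytes:
    """XOR all v parts together and strip the 16-byte salt."""
    acc = bytes(len(parts[0]))
    for p in parts:
        acc = _xor(acc, p)
    return acc[16:]

shares = disperse(b"attack at dawn", 5)
assert recover(shares) == b"attack at dawn"
```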

Relevance: 20.00%

Publisher:

Abstract:

This paper presents a statistical aircraft trajectory clustering approach aimed at discriminating between typical manned and expected unmanned traffic patterns. First, a resampled version of each trajectory is modelled using a mixture of Von Mises distributions (circular statistics). Second, the remodelled trajectories are globally aligned using tools from bioinformatics. Third, the alignment scores are used to cluster the trajectories using an iterative k-medoids approach and an appropriate distance function. The approach is then evaluated using synthetically generated unmanned aircraft flights combined with real air traffic position reports taken over a sector of Northern Queensland, Australia. Results suggest that the technique is useful in distinguishing between expected unmanned and manned aircraft traffic behaviour, as well as identifying some common conventional air traffic patterns.
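
A minimal sketch of the clustering step alone (iterative k-medoids over a precomputed pairwise distance matrix; the von Mises remodelling and the bioinformatics-style alignment that would produce the matrix `D` are assumed to have been done already):

```python
import numpy as np

def k_medoids(D: np.ndarray, k: int, iters: int = 100, seed: int = 0) -> np.ndarray:
    """Iterative k-medoids on a symmetric n x n distance matrix D."""
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)      # assign each trajectory to nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if members.size:
                # new medoid: the member minimising total distance to its cluster
                new_medoids[c] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return np.argmin(D[:, medoids], axis=1)
```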

Relevance: 20.00%

Publisher:

Abstract:

This paper investigates the use of Genetic Programming (GP) to create an approximate model for the non-linear relationship between flexural stiffness, length, mass per unit length and rotation speed associated with rotating beams and their natural frequencies. GP, a relatively new form of artificial intelligence, is derived from the Darwinian concepts of evolution and genetics, and it creates computer programs to solve problems by manipulating their tree structures. GP predicts the size and structural complexity of the empirical model by minimizing the mean square error at the specified points of the input-output dataset. This dataset is generated using a finite element model. The validity of the GP-generated model is tested by comparing the natural frequencies at training and at additional input data points. It is found that by using a non-dimensional stiffness, it is possible to get a simple and accurate function approximation for the natural frequency. This function approximation model is then used to study the relationships between natural frequency and various influencing parameters for uniform and tapered beams. The relations obtained with the GP model agree well with FEM results and can be used for preliminary design and structural optimization studies.
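
A minimal sketch of symbolic regression in the same spirit, assuming the third-party gplearn package and a purely hypothetical stand-in for the FEM-generated dataset (columns of X and the target y are placeholders, not the paper's data or settings):

```python
import numpy as np
from gplearn.genetic import SymbolicRegressor   # assumed third-party dependency

# Hypothetical FEM-generated training data: e.g. non-dimensional stiffness and
# rotation speed as inputs, first natural frequency as the target.
rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = 1.0 + 0.5 * X[:, 0] + 0.1 * X[:, 1] ** 2     # placeholder for FEM output

gp = SymbolicRegressor(population_size=1000, generations=20,
                       function_set=('add', 'sub', 'mul', 'div'),
                       metric='mse', parsimony_coefficient=0.001,
                       random_state=0)
gp.fit(X, y)                 # evolves expression trees minimising mean square error
print(gp._program)           # the evolved closed-form approximation
```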

Relevance: 20.00%

Publisher:

Abstract:

We propose a novel, language-neutral approach for searching online handwritten text using the Fréchet distance. Online handwritten data, which is available as a time series (x, y, t), is treated as representing a parameterized curve in two dimensions, and the problem of searching online handwritten text is posed as a problem of matching two curves in a two-dimensional Euclidean space. The Fréchet distance is a natural measure for matching curves. The main contribution of this paper is the formulation of a variant of the Fréchet distance that can be used for retrieving words even when only a prefix of the word is given as the query. Extensive experiments on the UNIPEN dataset, consisting of over 16,000 words written by 7 users, show that our method outperforms the state-of-the-art DTW method. Experiments were also conducted on a multilingual dataset, generated on a PDA, with encouraging results. Our approach can be used to implement useful, exciting features like auto-completion of handwriting in PDAs.
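
A minimal sketch of the standard discrete Fréchet distance between two sampled curves (the paper's prefix-matching variant and any preprocessing are not reproduced here):

```python
import math
from functools import lru_cache

def discrete_frechet(P, Q):
    """Discrete Fréchet distance between point sequences P and Q (Eiter-Mannila recurrence)."""
    def d(i, j):
        return math.dist(P[i], Q[j])

    @lru_cache(maxsize=None)
    def c(i, j):
        if i == 0 and j == 0:
            return d(0, 0)
        if i == 0:
            return max(c(0, j - 1), d(0, j))
        if j == 0:
            return max(c(i - 1, 0), d(i, 0))
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d(i, j))

    # Recursion depth is fine for short, resampled strokes.
    return c(len(P) - 1, len(Q) - 1)

# e.g. two handwriting strokes sampled as (x, y) points
print(discrete_frechet([(0, 0), (1, 1), (2, 1)], [(0, 0), (1, 2), (2, 1)]))
```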

Relevance: 20.00%

Publisher:

Abstract:

In this paper we consider the problems of computing a minimum co-cycle basis and a minimum weakly fundamental co-cycle basis of a directed graph G. A co-cycle in G corresponds to a vertex partition (S, V ∖ S), and a {−1, 0, 1} edge incidence vector is associated with each co-cycle. The vector space over ℚ generated by these vectors is the co-cycle space of G; alternately, the co-cycle space is the orthogonal complement of the cycle space of G. The minimum co-cycle basis problem asks for a set of co-cycles that span the co-cycle space of G and whose sum of weights is minimum. Weakly fundamental co-cycle bases are a special class of co-cycle bases; they form a natural superclass of strictly fundamental co-cycle bases, and it is known that computing a minimum-weight strictly fundamental co-cycle basis is NP-hard. We show that the co-cycle basis corresponding to the cuts of a Gomory-Hu tree of the underlying undirected graph of G is a minimum co-cycle basis of G, and that it is also weakly fundamental.
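
A minimal sketch of the construction the result points to, assuming networkx and an undirected weighted graph whose 'weight' attribute is used as the flow capacity (each Gomory-Hu tree edge induces one cut (S, V ∖ S), i.e. one co-cycle of the basis):

```python
import networkx as nx

def gomory_hu_cocycle_basis(G: nx.Graph):
    """Return the n-1 vertex sets S whose cuts (S, V \\ S) form the co-cycle basis."""
    T = nx.gomory_hu_tree(G, capacity="weight")
    basis = []
    for u, v in list(T.edges()):
        T.remove_edge(u, v)                      # removing a tree edge splits T in two
        S = nx.node_connected_component(T, u)    # one side of the induced cut
        basis.append(frozenset(S))
        T.add_edge(u, v)
    return basis

G = nx.Graph()
G.add_weighted_edges_from([(1, 2, 3), (2, 3, 1), (3, 1, 2), (3, 4, 4)])
for S in gomory_hu_cocycle_basis(G):
    print(sorted(S))
```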

Relevance: 20.00%

Publisher:

Abstract:

We consider the problem of computing an approximate minimum cycle basis of an undirected edge-weighted graph G with m edges and n vertices; the extension to directed graphs is also discussed. In this problem, a {0,1} incidence vector is associated with each cycle and the vector space over F_2 generated by these vectors is the cycle space of G. A set of cycles is called a cycle basis of G if it forms a basis for its cycle space. A cycle basis where the sum of the weights of the cycles is minimum is called a minimum cycle basis of G. Cycle bases of low weight are useful in a number of contexts, e.g. the analysis of electrical networks, structural engineering, chemistry, and surface reconstruction. We present two new algorithms to compute an approximate minimum cycle basis. For any integer k ≥ 1, we give (2k − 1)-approximation algorithms with expected running time O(kmn^{1+2/k} + mn^{(1+1/k)(ω−1)}) and deterministic running time O(n^{3+2/k}), respectively. Here ω is the best exponent of matrix multiplication; it is presently known that ω < 2.376. Both algorithms are o(m^ω) for dense graphs. This is the first time that any algorithm which computes sparse cycle bases with a guarantee drops below the Θ(m^ω) bound. We also present a 2-approximation algorithm with O(m^ω √(n log n)) expected running time, a linear-time 2-approximation algorithm for planar graphs and an O(n^3)-time 2.42-approximation algorithm for the complete Euclidean graph in the plane.
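
For context, a minimal sketch of an exact (non-approximate) minimum cycle basis, assuming a networkx version that provides minimum_cycle_basis; the approximation algorithms summarised above are not reproduced:

```python
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    (0, 1, 1), (1, 2, 1), (2, 0, 5),
    (2, 3, 1), (3, 4, 1), (4, 2, 2),
])

# Exact minimum-weight cycle basis; each basis cycle is returned as a list of nodes.
for cycle in nx.minimum_cycle_basis(G, weight="weight"):
    print(cycle)
```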

Relevance: 20.00%

Publisher:

Abstract:

Purpose: In the oncology population, where malnutrition prevalence is high, more descriptive screening tools can provide further information to assist triaging and capture acute change. The Patient-Generated Subjective Global Assessment Short Form (PG-SGA SF) is a component of a nutritional assessment tool which could be used for descriptive nutrition screening. The purpose of this study was to conduct a secondary analysis of nutrition screening and assessment data to identify the most relevant information contributing to the PG-SGA SF for identifying malnutrition risk with high sensitivity and specificity.
Methods: This was an observational, cross-sectional study of 300 consecutive adult patients receiving ambulatory anti-cancer treatment at an Australian tertiary hospital. Anthropometric and patient descriptive data were collected. The scored PG-SGA generated a score for nutritional risk (PG-SGA SF) and a global rating for nutrition status. Receiver operating characteristic (ROC) curves were generated to determine optimal cut-off scores for combinations of the PG-SGA SF boxes with the greatest sensitivity and specificity for predicting malnutrition according to the scored PG-SGA global rating.
Results: The additive scores of boxes 1–3 had the highest sensitivity (90.2%) while maintaining satisfactory specificity (67.5%) and demonstrating high diagnostic value (AUC = 0.85, 95% CI = 0.81–0.89). The inclusion of box 4 (PG-SGA SF) did not add further value as a screening tool (AUC = 0.85, 95% CI = 0.80–0.89; sensitivity 80.4%; specificity 72.3%).
Conclusions: The validity of the PG-SGA SF in chemotherapy outpatients was confirmed. The present study, however, demonstrated that the functional capacity question (box 4) does not improve the overall discriminatory value of the PG-SGA SF.
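
A minimal sketch of the ROC-based cut-off analysis, assuming scikit-learn and hypothetical arrays (`score` standing in for the summed boxes 1-3, `malnourished` for the binary global rating; the generated numbers are placeholders, not study data):

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Hypothetical data: additive score of PG-SGA SF boxes 1-3 and a binary
# malnutrition label derived from the full scored PG-SGA global rating.
rng = np.random.default_rng(0)
malnourished = rng.integers(0, 2, size=300)
score = malnourished * 3 + rng.poisson(2, size=300)

fpr, tpr, thresholds = roc_curve(malnourished, score)
auc = roc_auc_score(malnourished, score)
j = np.argmax(tpr - fpr)                 # Youden's J picks the optimal cut-off
print(f"AUC = {auc:.2f}")
print(f"cut-off >= {thresholds[j]:g}: "
      f"sensitivity {tpr[j]:.1%}, specificity {1 - fpr[j]:.1%}")
```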

Relevance: 20.00%

Publisher:

Abstract:

The current approach for protecting the receiving water environment from urban stormwater pollution is the adoption of structural measures commonly referred to as Water Sensitive Urban Design (WSUD). The treatment efficiency of WSUD measures closely depends on the design of the specific treatment units. As stormwater quality is influenced by rainfall characteristics, the selection of appropriate rainfall events for treatment design is essential to ensure the effectiveness of WSUD systems. Based on extensive field investigations in four urban residential catchments on the Gold Coast, Australia, and computer modelling, this paper details a technically robust approach for the selection of rainfall events for stormwater treatment design using a three-component model. The modelling results confirmed that high intensity-short duration events produce 58.0% of the TS load while generating only 29.1% of the total runoff volume. Additionally, rainfall events smaller than the 6-month average recurrence interval (ARI) generate a greater cumulative runoff volume (68.4% of the total annual runoff volume) and TS load (68.6% of the TS load exported) than rainfall events larger than the 6-month ARI. The results suggest that for the study catchments, stormwater treatment design could be based on rainfall with a mean average intensity of 31 mm/h and a mean duration of 0.4 h. These outcomes also confirmed that selecting smaller ARI rainfall events with high intensity and short duration as the threshold for treatment system design is the most feasible approach, since these events cumulatively generate a major portion of the annual pollutant load compared to the other types of events, despite producing a relatively smaller runoff volume. This implies that designs based on small and more frequent rainfall events, rather than larger rainfall events, would be appropriate in the context of treatment performance efficiency, cost-effectiveness and possible savings in the land area needed.
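
A minimal sketch of the kind of event-class bookkeeping reported above, assuming pandas and a hypothetical per-event table (column names, thresholds and values are illustrative, not the study's data):

```python
import pandas as pd

# Hypothetical per-event records for one catchment.
events = pd.DataFrame({
    "intensity_mm_h": [45.0, 8.0, 30.0, 5.0, 60.0],
    "duration_h":     [0.4,  6.0, 0.5,  9.0, 0.3],
    "runoff_m3":      [120,  400, 90,   500, 150],
    "ts_kg":          [35,   20,  28,   18,  40],
})

# Flag high intensity-short duration events with illustrative thresholds.
high_short = (events["intensity_mm_h"] > 25) & (events["duration_h"] < 1)
frac_ts = events.loc[high_short, "ts_kg"].sum() / events["ts_kg"].sum()
frac_runoff = events.loc[high_short, "runoff_m3"].sum() / events["runoff_m3"].sum()
print(f"high intensity-short duration events: "
      f"{frac_ts:.1%} of TS load from {frac_runoff:.1%} of runoff volume")
```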

Relevance: 20.00%

Publisher:

Abstract:

In quantum theory, symmetry has to be defined necessarily in terms of the family of unit rays, the state space. The theorem of Wigner asserts that a symmetry so defined at the level of rays can always be lifted into a linear unitary or an antilinear antiunitary operator acting on the underlying Hilbert space. We present two proofs of this theorem which are both elementary and economical. Central to our proofs is the recognition that a given Wigner symmetry can, by post-multiplication by a unitary symmetry, be taken into either the identity or complex conjugation. Our analysis often focuses on the behaviour of certain two-dimensional subspaces of the Hilbert space under the action of a given Wigner symmetry, but the relevance of this behaviour to the larger picture of the whole Hilbert space is made transparent at every stage.
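
For reference, the statement being proved can be written as follows (a standard textbook formulation, not quoted from the paper):

```latex
% Wigner's theorem: a ray map T preserving transition probabilities,
\[
  \left|\langle T\psi,\,T\phi\rangle\right|^{2}
  \;=\;
  \left|\langle \psi,\,\phi\rangle\right|^{2}
  \qquad \text{for all unit vectors } \psi,\phi \in \mathcal{H},
\]
% is implemented by an operator U on the Hilbert space H that is either unitary,
\[
  \langle U\psi,\,U\phi\rangle = \langle \psi,\,\phi\rangle ,
\]
% or antiunitary,
\[
  \langle U\psi,\,U\phi\rangle = \overline{\langle \psi,\,\phi\rangle},
  \qquad
  U(\alpha\psi+\beta\phi) = \bar{\alpha}\,U\psi + \bar{\beta}\,U\phi .
\]
```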

Relevance: 20.00%

Publisher:

Abstract:

Exploring emotions is a defining feature of psychotherapy. This study explores how therapists explore emotions when they cannot see or hear their clients. In analysing 1,279 sessions of online text-based Cognitive Behavioural Therapy (CBT) we focused on therapists’ commiserations (e.g., “I’m sorry to hear that”) and their affective inferences (e.g., “that sounds very scary for you”). Both practices routinely prefaced moves to pursue a range of therapeutic activities, many of which did not prioritise sustained focus on the emotion that had just been oriented to. By separating message composition from message transmission, the modality used for these therapy sessions enabled therapists to combine orientations to emotion with attempts to shift the focus of discussion. Our analysis finds that although physically co-present and computer-mediated psychotherapy share a common focus on emotional experience, the modality used for therapy can be relevant in the design and use of these orientations. Data are in British English.

Relevance: 20.00%

Publisher:

Abstract:

Fusion power is an appealing source of clean and abundant energy. The radiation resistance of reactor materials is one of the greatest obstacles on the path towards commercial fusion power. These materials are subject to a harsh radiation environment, and cannot fail mechanically or contaminate the fusion plasma. Moreover, for a power plant to be economically viable, the reactor materials must withstand long operation times with little maintenance. The fusion reactor materials will contain hydrogen and helium, due to deposition from the plasma and to nuclear reactions caused by energetic neutron irradiation. The first wall and divertor materials, carbon and tungsten in existing and planned test reactors, will be subject to intense bombardment by low-energy deuterium and helium, which erodes and modifies the surface. All reactor materials, including the structural steel, will suffer irradiation by high-energy neutrons, causing displacement cascade damage.

Molecular dynamics simulation is a valuable tool for studying irradiation phenomena, such as surface bombardment and the onset of primary damage due to displacement cascades. The governing mechanisms are on the atomic level, and hence not easily studied experimentally. In order to model materials, interatomic potentials are needed to describe the interaction between the atoms. In this thesis, new interatomic potentials were developed for the tungsten-carbon-hydrogen system and for iron-helium and chromium-helium. Thus, the study of previously inaccessible systems was made possible, in particular the effect of H and He on radiation damage. The potentials were based on experimental and ab initio data from the literature, as well as density-functional theory calculations performed in this work.

As a model for ferritic steel, iron-chromium with 10% Cr was studied. The difference between Fe and FeCr was shown to be negligible for threshold displacement energies. The properties of small He and He-vacancy clusters in Fe and FeCr were also investigated. The clusters were found to be more mobile and to dissociate more rapidly than previously assumed, and the effect of Cr was small. The primary damage formed by displacement cascades was found to be heavily influenced by the presence of He, both in FeCr and in W.

Many important issues with fusion reactor materials remain poorly understood and will require a huge effort by the international community. The development of potential models for new materials and the simulations performed in this thesis reveal many interesting features, but also serve as a platform for further studies.
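
A minimal sketch of the role an interatomic potential plays in such simulations, using a generic Lennard-Jones pair potential purely as a stand-in (the W-C-H, Fe-He and Cr-He potentials developed in the thesis are many-body models and are not reproduced here):

```python
import numpy as np

def lj_energy_forces(positions, epsilon=1.0, sigma=1.0):
    """Total energy and per-atom forces for a Lennard-Jones pair potential."""
    n = len(positions)
    energy = 0.0
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            rij = positions[i] - positions[j]
            r = np.linalg.norm(rij)
            sr6 = (sigma / r) ** 6
            energy += 4 * epsilon * (sr6 ** 2 - sr6)
            # F_i = -dV/dr * r_hat = (24*eps/r^2) * (2*(sigma/r)^12 - (sigma/r)^6) * rij
            f = 24 * epsilon * (2 * sr6 ** 2 - sr6) / r ** 2 * rij
            forces[i] += f
            forces[j] -= f
    return energy, forces

pos = np.array([[0.0, 0.0, 0.0], [1.12, 0.0, 0.0], [0.0, 1.12, 0.0]])
E, F = lj_energy_forces(pos)
print(E, F[0])
```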

Relevance: 20.00%

Publisher:

Abstract:

This paper proposes the use of empirical modeling techniques for building microarchitecture-sensitive models for compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines and radial basis function networks. We use the generated models to a) predict program performance at arbitrary compiler/microarchitecture configurations, b) quantify the significance of complex interactions between optimizations and the microarchitecture, and c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average prediction error) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
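
A minimal sketch of the empirical-modelling idea, assuming scikit-learn and a hypothetical design matrix of flag/heuristic/microarchitecture settings with measured runtimes (kernel ridge regression with an RBF kernel stands in for the paper's radial basis function networks; all data here are placeholders):

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split

# Hypothetical design matrix: each row encodes one compiler/microarchitecture
# configuration (flag settings, heuristic values, cache size, issue width, ...);
# y is the measured or simulated runtime for that configuration.
rng = np.random.default_rng(0)
X = rng.random((120, 6))
y = 1.0 + X @ rng.random(6) + 0.3 * np.sin(4 * X[:, 0])   # placeholder response

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=2.0).fit(X_tr, y_tr)

pred = model.predict(X_te)
print("mean relative error: %.1f%%" % (100 * np.mean(np.abs(pred - y_te) / y_te)))

# Model-based search: predict over a candidate grid and keep the best setting.
candidates = rng.random((5000, 6))
best = candidates[np.argmin(model.predict(candidates))]
```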

Relevance: 20.00%

Publisher:

Abstract:

This paper describes a concept for a collision avoidance system for ships, which is based on model predictive control. A finite set of alternative control behaviors are generated by varying two parameters: offsets to the guidance course angle commanded to the autopilot and changes to the propulsion command ranging from nominal speed to full reverse. Using simulated predictions of the trajectories of the obstacles and ship, compliance with the Convention on the International Regulations for Preventing Collisions at Sea and collision hazards associated with each of the alternative control behaviors are evaluated on a finite prediction horizon, and the optimal control behavior is selected. Robustness to sensing error, predicted obstacle behavior, and environmental conditions can be ensured by evaluating multiple scenarios for each control behavior. The method is conceptually and computationally simple and yet quite versatile as it can account for the dynamics of the ship, the dynamics of the steering and propulsion system, forces due to wind and ocean current, and any number of obstacles. Simulations show that the method is effective and can manage complex scenarios with multiple dynamic obstacles and uncertainty associated with sensors and predictions.
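
A minimal sketch of the behaviour-selection loop only, using a trivial straight-line ship model and a hypothetical cost that penalises proximity to predicted obstacle positions (the COLREGS-compliance terms, ship dynamics and scenario robustness checks described above are omitted; all constants are illustrative):

```python
import itertools
import numpy as np

DT, HORIZON = 10.0, 30                          # step (s) and number of prediction steps
COURSE_OFFSETS = np.radians([-90, -60, -30, -15, 0, 15, 30, 60, 90])
SPEED_FACTORS = [1.0, 0.5, 0.0, -0.5]           # nominal speed ... full reverse

def predict_ship(x, y, course, speed, d_course, f_speed):
    """Straight-line prediction of the ship under one candidate control behaviour."""
    c, v = course + d_course, speed * f_speed
    t = DT * np.arange(1, HORIZON + 1)
    return np.stack([x + v * t * np.cos(c), y + v * t * np.sin(c)], axis=1)

def cost(traj, obstacles, d_course, f_speed):
    """Collision hazard plus manoeuvre effort for one behaviour (illustrative weights)."""
    d_min = min(np.min(np.linalg.norm(traj - ob, axis=1)) for ob in obstacles)
    hazard = 1e4 if d_min < 200.0 else 1e3 / d_min
    return hazard + 0.5 * abs(d_course) + 1.0 * (1.0 - f_speed)

# Obstacle tracks predicted over the same horizon (here: constant velocity).
t = DT * np.arange(1, HORIZON + 1)
obstacles = [np.stack([800 - 4.0 * t, 0.0 * t], axis=1)]

behaviours = list(itertools.product(COURSE_OFFSETS, SPEED_FACTORS))
best = min(behaviours,
           key=lambda b: cost(predict_ship(0, 0, 0.0, 6.0, *b), obstacles, *b))
print("selected course offset %.0f deg, speed factor %.1f"
      % (np.degrees(best[0]), best[1]))
```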

Relevance: 20.00%

Publisher:

Abstract:

A hybrid computer for structure factor calculations in X-ray crystallography is described. The computer can calculate three-dimensional structure factors of up to 24 atoms in a single run and can generate the scatter functions of well over 100 atoms using the Vand et al. or the Forsyth and Wells approximations. The computer is essentially a digital computer with analog function generators, thus combining to advantage the economical data storage of digital systems and the simple computing circuitry of analog systems. The digital part serially selects the data, computes and feeds the arguments into specially developed high-precision digital-analog function generators; the outputs of these, being d.c. voltages, are further processed by analog circuits, and finally the sequential adder, which employs a novel digital voltmeter circuit, converts them back into digital form and accumulates them in a dekatron counter which displays the final result. The computer is also capable of carrying out 1-, 2-, or 3-dimensional Fourier summation, although in this case the lack of sufficient storage space for the large number of coefficients involved is a serious limitation at present.
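
For comparison, the same three-dimensional structure-factor sum is a few lines on a modern digital computer. This is the generic textbook form F(hkl) = Σ_j f_j exp(2πi(hx_j + ky_j + lz_j)), not a model of the hybrid machine's circuitry, and the scattering factors below are placeholders:

```python
import numpy as np

def structure_factor(hkl, frac_coords, scatter):
    """F(hkl) = sum_j f_j * exp(2*pi*i*(h*x_j + k*y_j + l*z_j)).

    frac_coords: (N, 3) fractional atomic coordinates
    scatter:     (N,) scattering factors f_j already evaluated for this reflection
    """
    phase = 2j * np.pi * (np.asarray(frac_coords) @ np.asarray(hkl))
    return np.sum(np.asarray(scatter) * np.exp(phase))

# Two-atom illustration (scattering factors are placeholders).
coords = [(0.0, 0.0, 0.0), (0.5, 0.5, 0.5)]
print(structure_factor((1, 1, 1), coords, scatter=[6.0, 8.0]))
```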