823 results for Business -- Data processing -- Management


Relevance: 100.00%

Abstract:

Reliability of carrier-phase ambiguity resolution (AR) in an integer least-squares (ILS) problem depends on the ambiguity success rate (ASR), which in practice can be well approximated by the success probability of the integer bootstrapping solution. With the current GPS constellation, a sufficiently high ASR for the geometry-based model is achievable only a certain percentage of the time, so high AR reliability cannot be assured by a single constellation. With a dual-constellation system (DCS), for example GPS and Beidou, which provides more satellites in view, users can expect significant performance benefits such as improved AR reliability and high-precision positioning solutions. Simply using all satellites in view for AR and positioning is a straightforward solution, but it does not necessarily lead to the hoped-for reliability. This paper presents an alternative approach that selects a subset of the visible satellites to achieve higher reliability of the AR solutions in a multi-GNSS environment, instead of using all of them. Traditionally, satellite selection algorithms are mostly based on the position dilution of precision (PDOP) in order to meet accuracy requirements. In this contribution, reliability criteria are introduced for GNSS satellite selection, and a novel satellite selection algorithm for reliable ambiguity resolution (SARA) is developed. The SARA algorithm allows receivers to select a subset of satellites that achieves a high ASR, such as above 0.99. Numerical results from simulated dual-constellation cases show that with the SARA procedure, the percentage of ASR values exceeding 0.99 and the percentage of ratio-test values passing the threshold of 3 are both higher than when all satellites in view are used directly; in the dual-constellation case, the percentages of ASRs (>0.99) and ratio-test values (>3) reach 98.0% and 98.5% respectively, compared to 18.1% and 25.0% without satellite selection. It is also worth noting that SARA is simple to implement and computationally cheap, so it can be applied in most real-time data processing applications.
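
As a rough illustration of the selection idea, the sketch below scores candidate subsets with the standard integer bootstrapping success rate, P_IB = prod_i (2*Phi(1/(2*sigma_i)) - 1), and greedily drops satellites until a target ASR is reached. It is not the paper's SARA algorithm: it assumes one ambiguity per visible satellite, and where a real implementation would recompute the float solution and its covariance for each candidate subset, here the corresponding rows and columns of Q are simply dropped. All function names and the target value are illustrative.

```python
import numpy as np
from scipy.stats import norm

def bootstrap_asr(Q):
    # Integer bootstrapping success rate for ambiguity covariance Q:
    # P_IB = prod_i (2*Phi(1/(2*sigma_i)) - 1), with sigma_i the
    # conditional standard deviations taken from the LDL^T factors
    # (via Cholesky: D_ii = L_ii^2). Ambiguity ordering/decorrelation,
    # which affects P_IB, is ignored in this sketch.
    s = np.abs(np.diag(np.linalg.cholesky(Q)))
    return float(np.prod(2.0 * norm.cdf(1.0 / (2.0 * s)) - 1.0))

def sara_like_subset(Q, sat_ids, target=0.99, min_sats=5):
    # Greedy sketch: while the ASR is below target, drop the satellite
    # whose removal raises the ASR the most.
    keep = list(range(len(sat_ids)))
    best = bootstrap_asr(Q)
    while best < target and len(keep) > min_sats:
        trials = [(bootstrap_asr(Q[np.ix_(t, t)]), k)
                  for k in keep for t in [[i for i in keep if i != k]]]
        cand, drop = max(trials)
        if cand <= best:
            break  # no single removal helps any more
        keep.remove(drop)
        best = cand
    return [sat_ids[i] for i in keep], best

# Toy usage with a random positive-definite ambiguity covariance:
rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))
Q = A @ A.T + 8 * np.eye(8)
print(sara_like_subset(Q, list(range(8))))
```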

Relevance: 100.00%

Abstract:

Bitcoin is a distributed digital currency that has attracted a substantial number of users. We perform an in-depth investigation to understand what made Bitcoin so successful, while decades of research on cryptographic e-cash have not led to a large-scale deployment. We also ask how Bitcoin could become a good candidate for a long-lived stable currency. In doing so, we identify several issues in and attacks on Bitcoin, and propose suitable techniques to address them.

Relevance: 100.00%

Abstract:

We present a technique for delegating a short lattice basis that has the advantage of keeping the lattice dimension unchanged upon delegation. Building on this result, we construct two new hierarchical identity-based encryption (HIBE) schemes, with and without random oracles. The resulting systems are very different from earlier lattice-based HIBEs and in some cases result in shorter ciphertexts and private keys. We prove security from classic lattice hardness assumptions.

Relevance: 100.00%

Abstract:

This paper presents ongoing work toward constructing an efficient completely non-malleable public-key encryption scheme based on lattices in the standard (common reference string) model. An encryption scheme is completely non-malleable if attackers have only negligible advantage even when they are allowed to transform the public key under which the related message is encrypted. Ventre and Visconti proposed two inefficient constructions of completely non-malleable schemes, one in the common reference string model using non-interactive zero-knowledge proofs, and another using interactive encryption schemes. Recently, two efficient public-key encryption schemes have been proposed, both based on pairing-based identity-based encryption.

Relevance: 100.00%

Abstract:

A BPMN model is well-structured if splits and joins are always paired into single-entry, single-exit (SESE) blocks. Well-structuredness is often a desirable property, as it promotes readability and makes models easier to analyze. However, many process models found in practice are not well-structured, and it is not always feasible, or even desirable, to restrict process modelers to producing only well-structured models. Moreover, not all processes can be captured as well-structured process models. An alternative to forcing modelers to produce well-structured models is to automatically transform unstructured models into well-structured ones when needed and possible. This talk reviews existing results on the automatic transformation of unstructured process models into structured ones.
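
For intuition: roughly speaking, an acyclic workflow graph is well-structured in the SESE sense exactly when it reduces to a single source-to-sink edge under series and parallel reduction (i.e., when it is two-terminal series-parallel). The toy check below implements just those two rules; it ignores cycles and gateway semantics, and all names are illustrative rather than taken from the talk.

```python
from collections import Counter

def is_series_parallel(edges, source, sink):
    # Reduce with two rules until fixpoint:
    #  parallel: collapse duplicate (u, v) edges into one
    #  series:   splice out any node with in-degree 1 and out-degree 1
    edges = list(edges)
    changed = True
    while changed:
        changed = False
        for e, n in Counter(edges).items():
            if n > 1:
                edges = [x for x in edges if x != e] + [e]
                changed = True
        ins = Counter(v for _, v in edges)
        outs = Counter(u for u, _ in edges)
        for node in set(ins) | set(outs):
            if node not in (source, sink) and ins[node] == 1 and outs[node] == 1:
                (u,) = [a for a, b in edges if b == node]
                (w,) = [b for a, b in edges if a == node]
                edges = [e for e in edges if node not in e] + [(u, w)]
                changed = True
                break
    return edges == [(source, sink)]

# Parallel branch start -> {a, b} -> end: fully reducible, structured.
print(is_series_parallel([("s", "a"), ("s", "b"), ("a", "e"), ("b", "e")], "s", "e"))  # True
# Diamond with a chord a -> b: no SESE pairing exists, unstructured.
print(is_series_parallel([("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")], "s", "t"))  # False
```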

Relevance: 100.00%

Abstract:

This research has established a new privacy framework, privacy model, and privacy architecture to create more transparent privacy for social networking users. The architecture is designed on three levels, Business, Data, and Technology, based on The Open Group Architecture Framework (TOGAF®). The framework and architecture provide a novel platform for investigating privacy in social networks (SNs). This approach mitigates many current SN privacy issues and leads to a more controlled form of privacy assessment. Ultimately, more privacy will encourage more connections between people across SN services.

Relevance: 100.00%

Abstract:

Purpose: This study evaluated the predictive validity of three previously published ActiGraph energy expenditure (EE) prediction equations developed for children and adolescents. Methods: A total of 45 healthy children and adolescents (mean age: 13.7 ± 2.6 yr) completed four 5-min activity trials (normal walking, brisk walking, easy running, and fast running) in an indoor exercise facility. During each trial, participants wore an ActiGraph accelerometer on the right hip. EE was monitored breath by breath using the Cosmed K4b(2) portable indirect calorimetry system. Differences and associations between measured and predicted EE were assessed using dependent t-tests and Pearson correlations, respectively. Classification accuracy was assessed using percent agreement, sensitivity, specificity, and area under the receiver operating characteristic (ROC) curve. Results: None of the equations accurately predicted mean EE across all four activity trials. Each equation, however, accurately predicted mean EE in at least one trial: the Puyau equation during slow walking, the Trost equation during slow running, and the Freedson equation during fast running. None of the three equations accurately predicted EE during brisk walking. The equations exhibited fair to excellent classification accuracy with respect to activity intensity, with the Trost equation exhibiting the highest classification accuracy and the Puyau equation the lowest. Conclusions: These data suggest that the three accelerometer prediction equations do not accurately predict EE on a minute-by-minute basis in children and adolescents during overground walking and running. The equations may be useful, however, for estimating participation in moderate and vigorous activity.
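
The validity analysis is straightforward to reproduce in outline. The sketch below applies the same three ingredients on synthetic stand-in data (the study's measurements are not reproduced here): a paired t-test for systematic bias, a Pearson correlation for association, and ROC AUC for intensity-classification accuracy. The sample values and the 6 kcal/min "vigorous" cutoff are purely illustrative assumptions.

```python
import numpy as np
from scipy import stats
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical per-participant EE for one trial, in kcal/min:
measured = rng.normal(5.0, 1.0, 45)               # indirect calorimetry
predicted = measured + rng.normal(0.3, 0.8, 45)   # biased, noisy equation output

# Paired (dependent) t-test: does the equation systematically over/under-predict?
t, p = stats.ttest_rel(predicted, measured)
# Pearson correlation: strength of the linear association.
r, _ = stats.pearsonr(predicted, measured)
print(f"paired t = {t:.2f} (p = {p:.3f}), Pearson r = {r:.2f}")

# Intensity classification: can the prediction separate "vigorous" minutes?
vigorous = (measured > 6.0).astype(int)           # illustrative cutoff
if 0 < vigorous.sum() < len(vigorous):            # ROC needs both classes
    print(f"ROC AUC: {roc_auc_score(vigorous, predicted):.2f}")
```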

Relevance: 100.00%

Abstract:

The development of global navigation satellite systems (GNSS) nowadays provides solutions to many applied problems with ever higher quality and accuracy. Research carried out by the Bavarian Academy of Sciences and Humanities in Munich (BAW) in the field of airborne gravimetry is based on sophisticated processing of data from high-frequency GNSS receivers for kinematic aircraft positioning. The applied algorithms for inertial acceleration determination rely on the high sampling rate (50 Hz) and on the reduction of factors such as ionospheric scintillation and multipath/antenna near-field effects at the aircraft. The quality of the GNSS-derived kinematic heights is also studied by intercomparison with lift height variations collected by a precise high-sampling-rate vertical scale [1]. This work aims at more accurate determination of mini-aircraft altitude by means of high-frequency GNSS receivers, in particular by taking their dynamic behaviour into account.
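
As a generic illustration of the kind of processing involved (not BAW's actual algorithm), the sketch below differentiates a 50 Hz kinematic height series twice with a Savitzky-Golay filter, which smooths high-frequency noise such as multipath jitter while yielding the vertical kinematic acceleration. The synthetic height signal and the filter settings are assumptions.

```python
import numpy as np
from scipy.signal import savgol_filter

FS = 50.0          # GNSS sampling rate in Hz (from the abstract)
dt = 1.0 / FS

# Synthetic 50 Hz kinematic height series h(t) in metres, standing in
# for real GNSS output: a 0.2 Hz vertical oscillation about 1000 m.
t = np.arange(0.0, 60.0, dt)
h = 1000.0 + 2.0 * np.sin(2 * np.pi * 0.2 * t)

# Second derivative of height ~ vertical kinematic acceleration; the
# Savitzky-Golay differentiator smooths and differentiates in one step.
a_vert = savgol_filter(h, window_length=101, polyorder=3, deriv=2, delta=dt)
print(a_vert.max())  # ~ 2 * (2*pi*0.2)**2 m/s^2 for this signal
```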

Relevance: 100.00%

Abstract:

There is a wide variety of drivers for business process modelling initiatives, ranging from business evolution and process optimisation through compliance checking and process certification to process enactment. That, in turn, results in models that differ in content because they serve different purposes. In particular, processes are modelled at different abstraction levels and from different perspectives. Vertical alignment of process models aims at handling these deviations. While the advantages of such an alignment for inter-model analysis and change propagation are beyond question, a number of challenges still have to be addressed. In this paper, we discuss three main challenges for vertical alignment in detail. Against this background, the potential application of techniques from the field of process integration is critically assessed. Based thereon, we identify specific research questions that guide the design of a framework for model alignment.

Relevance: 100.00%

Abstract:

A rapidly changing business environment and legacy IT problems have led many organisations to implement standard package solutions. This 'common systems' approach establishes a common IT and business process infrastructure within organisations, and its increasing dominance raises several important strategic issues: to what extent do common systems impose common business processes and management systems on competing firms, and what is the source of competitive advantage if the majority of firms employ almost identical information systems and business processes? A theoretical framework based on research into legacy systems and earlier IT strategy literature is used to analyse three case studies in the manufacturing, chemical and IT industries. It is shown that the organisations treat common systems as the core of their ability to manage business transactions. To achieve competitive advantage, they are clothing these common systems with information systems designed to capture information about competitors, customers and suppliers, and to provide a basis for sharing knowledge within the organisation and ultimately with economic partners. The relevance of these approaches to other organisations and industries is analysed, and an attempt is made to outline the strategic options open to firms beyond the implementation of common business systems.

Relevance: 100.00%

Abstract:

The ambiguity acceptance test is an important quality control procedure in high-precision GNSS data processing. Although ambiguity acceptance test methods have been extensively investigated, how to determine their thresholds is still not well understood. Currently, the threshold is determined either empirically or with the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis, while the FF-approach is theoretically rigorous but computationally demanding. The key to the threshold determination problem is therefore how to determine the threshold efficiently in a well-founded way. In this study, a new threshold determination method, named the threshold function method, is proposed to reduce the complexity of the FF-approach. The threshold function method simplifies the FF-approach by a modelling procedure and an approximation procedure. The modelling procedure uses a rational function model to describe the relationship between the FF-difference test threshold and the integer least-squares (ILS) success rate. The approximation procedure replaces the ILS success rate with the easy-to-calculate integer bootstrapping (IB) success rate. The corresponding modelling and approximation errors are analysed with simulated data to avoid nuisance biases and the impact of an unrealistic stochastic model. The results indicate that the proposed method greatly simplifies the FF-approach without introducing significant modelling error, making fixed failure rate threshold determination feasible for real-time applications.
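
A minimal sketch of the two-step idea follows: fit an inexpensive function from success rate to threshold offline, then evaluate it online using the cheap IB success rate instead of running the full FF search. The rational-function form, the calibration pairs, and the fitted coefficients below are hypothetical placeholders, not the paper's actual model.

```python
import numpy as np
from scipy.optimize import curve_fit

def rational(p, a0, a1, b1):
    # Illustrative rational-function model t(P_s) mapping a success
    # rate to a difference-test threshold (not the paper's exact form).
    return (a0 + a1 * p) / (1.0 + b1 * p)

# Hypothetical calibration pairs (success rate, FF-threshold) that would
# come from offline Monte Carlo runs at a fixed failure rate.
p_s = np.array([0.90, 0.95, 0.99, 0.995, 0.999])
thr = np.array([4.5, 3.2, 1.8, 1.4, 1.05])
coef, _ = curve_fit(rational, p_s, thr, p0=[1.0, 1.0, 0.1], maxfev=10000)

# Online use: compute the IB success rate for the current model (see the
# bootstrapping formula earlier in this listing), then evaluate the fit.
print(f"threshold at P_IB = 0.992: {rational(0.992, *coef):.2f}")
```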

Relevance: 100.00%

Abstract:

We present a systematic, practical approach to developing risk prediction systems, suitable for use with large databases of medical information. An important part of this approach is a novel feature selection algorithm which uses the area under the receiver operating characteristic (ROC) curve to measure the expected discriminative power of different sets of predictor variables. We describe this algorithm and use it to select variables to predict risk of a specific adverse pregnancy outcome: failure to progress in labour. Neural network, logistic regression and hierarchical Bayesian risk prediction models are constructed, all of which achieve close to the limit of performance attainable on this prediction task. We show that better prediction performance requires more discriminative clinical information rather than improved modelling techniques. It is also shown that better diagnostic criteria in clinical records would greatly assist the development of systems to predict risk in pregnancy.
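
To make the AUC-driven selection idea concrete, the sketch below performs greedy forward selection, at each step adding the variable that most improves cross-validated ROC AUC of a logistic regression. It runs on a synthetic stand-in dataset; the paper's exact algorithm, data, and models are not reproduced, and all names and parameters here are illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a clinical database; the study predicted
# failure to progress in labour from clinical records.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

def greedy_auc_selection(X, y, max_features=5):
    # Forward selection scored by cross-validated ROC AUC: keep adding
    # the single variable that raises the AUC most, stop when none helps.
    selected, remaining, best_auc = [], list(range(X.shape[1])), 0.5
    while remaining and len(selected) < max_features:
        scores = [(cross_val_score(LogisticRegression(max_iter=1000),
                                   X[:, selected + [j]], y, cv=5,
                                   scoring="roc_auc").mean(), j)
                  for j in remaining]
        auc, j = max(scores)
        if auc <= best_auc:
            break
        selected.append(j)
        remaining.remove(j)
        best_auc = auc
    return selected, best_auc

print(greedy_auc_selection(X, y))
```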