441 results for Common structures


Relevance: 20.00%

Abstract:

The material presented in this thesis may be viewed as comprising two key parts: the first concerns batch cryptography itself, whilst the second deals with how this form of cryptography may be applied to security-related applications such as electronic cash in order to improve the efficiency of their protocols. The objective of batch cryptography is to devise more efficient primitive cryptographic protocols. In general, these primitives exploit some property, such as homomorphism, to perform a computationally expensive operation on a collective input set; the idea is to amortise an expensive operation, such as modular exponentiation, over the whole input. Most research in this field has concentrated on batch verification of digital signatures. It is shown that several new attacks may be launched against these published schemes, exposing a number of weaknesses. Another common use of batch cryptography is the simultaneous generation of digital signatures. There is significantly less previous work in this area, and the existing schemes have only limited use in practical applications. Several new batch signature schemes are introduced that improve upon the existing techniques, and some practical uses are illustrated. Electronic cash is a technology that demands complex protocols in order to furnish several security properties, typically including anonymity, traceability of a double spender, and off-line payment. Presently, the most efficient schemes rely on coin divisibility: one large financial amount is withdrawn and then progressively spent with one or more merchants. Several new cash schemes are introduced here that use batch cryptography to improve the withdrawal, payment, and deposit of electronic coins. The devised schemes employ both the batch signature and batch verification techniques introduced, demonstrating improved performance over the contemporary divisibility-based schemes. The solutions also provide an alternative paradigm for the construction of electronic cash systems. Whilst electronic cash is used as the vehicle for demonstrating the relevance of batch cryptography to security-related applications, the applicability of the techniques introduced extends well beyond it.
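
As an aside on the amortisation idea described above, the following Python sketch shows the basic homomorphic trick behind batch verification, using textbook RSA purely for illustration: because (s1·s2·…)^e = s1^e·s2^e·… (mod N), a single modular exponentiation can screen a whole batch of signatures. The key size, the raw (unhashed, unpadded) messages and the plain product check are simplifying assumptions; randomised variants (e.g. the small-exponents test) exist precisely because an unrandomised product check can be fooled by errors that cancel.

```python
# Minimal sketch: batch verification of textbook RSA signatures via the
# multiplicative (homomorphic) property.  Illustrative only -- not the
# schemes of the thesis, and far too small a key for real use.

import random

# Toy RSA key (assumed parameters).
p, q = 1000003, 1000033
N, e = p * q, 65537
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)              # private exponent (Python 3.8+)

def sign(m: int) -> int:
    # Textbook RSA signature s = m^d mod N (no hashing or padding here).
    return pow(m, d, N)

def verify_one(m: int, s: int) -> bool:
    return pow(s, e, N) == m % N

def verify_batch(pairs) -> bool:
    # Homomorphic batch check: (s1*s2*...*sk)^e == m1*m2*...*mk (mod N).
    # One modular exponentiation is amortised over the whole input set.
    S = M = 1
    for m, s in pairs:
        S = (S * s) % N
        M = (M * m) % N
    return pow(S, e, N) == M

messages = [random.randrange(2, N) for _ in range(100)]
batch = [(m, sign(m)) for m in messages]

assert all(verify_one(m, s) for m, s in batch)   # 100 exponentiations
assert verify_batch(batch)                       # 1 exponentiation for the batch
```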

Relevance: 20.00%

Abstract:

Localisation of an AUV is challenging, and a range of inspection applications require relatively accurate positioning information with respect to submerged structures. We have developed a vision-based localisation method that uses a 3D model of the structure to be inspected. The system comprises a monocular vision system, a spotlight and a low-cost IMU. Previous methods that attempt to solve the problem in a similar way try to factor out the effects of lighting. Effects such as shading on curved surfaces or specular reflections are heavily dependent on the light direction and are difficult to deal with using existing techniques. The novelty of our method is that we explicitly model the light source. Results are shown for an implementation on a small AUV in clear water at night.
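
To make the "explicitly model the light source" idea concrete, here is a small hypothetical Python sketch (not the paper's actual formulation): predicted brightness of points on the known 3D model is rendered with a Lambertian term plus a spotlight beam that moves with the vehicle, and a candidate pose is scored by its photometric error against the observed intensities. The shading model, parameter names and toy geometry are all assumptions made for illustration.

```python
import numpy as np

def predicted_intensity(points, normals, light_pos, light_dir,
                        albedo=0.8, beam_exponent=8.0):
    """Render predicted brightness of model points for one candidate pose.
    Camera and spotlight are assumed rigidly mounted, so both move with the pose."""
    to_light = light_pos - points                      # point -> light vectors
    dist = np.linalg.norm(to_light, axis=1)
    l = to_light / dist[:, None]                       # unit light directions
    lambert = np.clip(np.sum(normals * l, axis=1), 0.0, None)        # n . l
    beam = np.clip(np.sum(-l * light_dir, axis=1), 0.0, None) ** beam_exponent
    return albedo * lambert * beam / dist ** 2         # inverse-square falloff

def pose_error(observed, points, normals, pose, light_dir):
    """Photometric error of one candidate pose; a localiser would keep the
    pose (e.g. a particle) minimising this, fused with the IMU elsewhere."""
    predicted = predicted_intensity(points, normals, pose, light_dir)
    return float(np.mean((observed - predicted) ** 2))

# Toy example: a flat panel of the structure viewed from two candidate poses.
points = np.array([[x, y, 0.0] for x in range(5) for y in range(5)], float)
normals = np.tile([0.0, 0.0, 1.0], (len(points), 1))
light_dir = np.array([0.0, 0.0, -1.0])                 # beam pointing down
true_pose = np.array([2.0, 2.0, 3.0])
observed = predicted_intensity(points, normals, true_pose, light_dir)

for pose in (true_pose, true_pose + np.array([1.0, 0.0, 0.5])):
    print(pose, pose_error(observed, points, normals, pose, light_dir))
```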

Relevance: 20.00%

Abstract:

Most statistical methods use hypothesis testing. Analysis of variance, regression, discrete choice models, contingency tables, and other analysis methods commonly used in transportation research share hypothesis testing as the means of making inferences about the population of interest. Despite the fact that hypothesis testing has been a cornerstone of empirical research for many years, various aspects of hypothesis tests are commonly misapplied, misinterpreted, and ignored, by novices and expert researchers alike. At first glance, hypothesis testing appears straightforward: develop the null and alternative hypotheses, compute the test statistic, compare it to a standard distribution, estimate the probability of rejecting the null hypothesis, and then make claims about the importance of the finding. This, however, is an oversimplification of the process. Hypothesis testing as applied in empirical research is examined here; the reader is assumed to have a basic knowledge of the role of hypothesis testing in various statistical methods. Through the use of an example, the mechanics of hypothesis testing are first reviewed. Then, five precautions surrounding the use and interpretation of hypothesis tests are developed; examples of each are provided to demonstrate how errors are made, and solutions are identified so that similar errors can be avoided. Remedies are provided for common errors, and conclusions are drawn on how to use the results of this paper to improve the conduct of empirical research in transportation.
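
A compact worked example of those mechanics, using made-up speed data and SciPy; the numbers, scenario and significance level are illustrative assumptions, not taken from the paper.

```python
# Two-sample t-test of whether mean spot speeds differ between two road
# sections (hypothetical data).

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
speeds_a = rng.normal(62.0, 5.0, size=40)   # hypothetical observations, km/h
speeds_b = rng.normal(65.0, 5.0, size=40)

# 1. Hypotheses: H0: mu_a == mu_b   vs   H1: mu_a != mu_b (two-sided).
# 2. Test statistic: Welch's t, compared against a t reference distribution.
t_stat, p_value = stats.ttest_ind(speeds_a, speeds_b, equal_var=False)

# 3. Decision at a pre-chosen significance level (alpha = 0.05 here).
alpha = 0.05
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < alpha:
    print("Reject H0: the data are unlikely under equal means.")
else:
    print("Fail to reject H0: no evidence of a difference at this alpha.")

# Two of the common pitfalls: p is NOT the probability that H0 is true, and
# "statistically significant" is not "practically important" -- report the
# effect size (here, the mean difference) alongside the test result.
print("mean difference:", speeds_b.mean() - speeds_a.mean())
```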

Relevance: 20.00%

Abstract:

Digital collections are growing exponentially in size as the information age takes a firm grip on all aspects of society. As a result Information Retrieval (IR) has become an increasingly important area of research. It promises to provide new and more effective ways for users to find information relevant to their search intentions. Document clustering is one of the many tools in the IR toolbox and is far from being perfected. It groups documents that share common features. This grouping allows a user to quickly identify relevant information. If these groups are misleading then valuable information can accidentally be ignored. Therefore, the study and analysis of the quality of document clustering is important. With more and more digital information available, the performance of these algorithms is also of interest. An algorithm with a time complexity of O(n^2) can quickly become impractical when clustering a corpus containing millions of documents. Therefore, the investigation of algorithms and data structures to perform clustering in an efficient manner is vital to its success as an IR tool. Document classification is another tool frequently used in the IR field. It predicts categories of new documents based on an existing database of (document, category) pairs. Support Vector Machines (SVM) have been found to be effective when classifying text documents. As the algorithms for classification are both efficient and of high quality, the largest gains can be made from improvements to representation. Document representations are vital for both clustering and classification. Representations exploit the content and structure of documents. Dimensionality reduction can improve the effectiveness of existing representations in terms of quality and run-time performance. Research into these areas is another way to improve the efficiency and quality of clustering and classification results. Evaluating document clustering is a difficult task. Intrinsic measures of quality such as distortion only indicate how well an algorithm minimised a similarity function in a particular vector space. Intrinsic comparisons are inherently limited by the given representation and are not comparable between different representations. Extrinsic measures of quality compare a clustering solution to a “ground truth” solution. This allows comparison between different approaches. As the “ground truth” is created by humans it can suffer from the fact that not every human interprets a topic in the same manner. Whether a document belongs to a particular topic or not can be subjective.
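
As a concrete illustration of the pipeline discussed above (a document representation, clustering with an extrinsic evaluation against a ground truth, and SVM classification), here is a minimal scikit-learn sketch; the tiny corpus, the choice of k-means and all parameters are assumptions made purely for the example.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics import adjusted_rand_score
from sklearn.svm import LinearSVC

docs = [
    "stock markets fell on renewed inflation fears",
    "the central bank raised interest rates again",
    "the striker scored twice in the cup final",
    "the home side won a tense league match",
]
truth = [0, 0, 1, 1]                 # human-assigned "ground truth" topics

# Representation: sparse TF-IDF vectors.  Dimensionality reduction (e.g.
# truncated SVD) could be applied here to trade quality against run time.
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(docs)

# Clustering, then an extrinsic quality measure against the ground truth.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("adjusted Rand index:", adjusted_rand_score(truth, clusters))

# Classification: an SVM trained on (document, category) pairs predicts the
# category of an unseen document.
classifier = LinearSVC().fit(X, truth)
print("predicted category:",
      classifier.predict(vectorizer.transform(["interest rates rose today"])))
```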

Relevance: 20.00%

Abstract:

The acoustic emission (AE) technique is a popular tool for structural health monitoring of civil, mechanical and aerospace structures. It is a non-destructive method based on the rapid release of energy, in the form of stress waves, caused by crack initiation or growth within a material. Recording these waves by means of sensors and subsequently analysing the recorded signals conveys information about the nature of the source. The ability to locate the source of the stress waves is an important advantage of the AE technique; but as AE waves travel in various modes and may undergo mode conversions, an understanding of the modes (‘modal analysis’) is often necessary in order to determine the source location accurately. This paper presents results of experiments aimed at finding the locations of artificial AE sources on a thin plate and identifying the wave modes in the recorded signal waveforms. Different source-location techniques will be investigated and the importance of wave mode identification will be explored.
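
One common source-location approach, sketched below for illustration (not necessarily one of the techniques investigated in the paper), uses differences in arrival time at several sensors together with a single assumed propagation velocity; the geometry, velocity and noise level are made-up values. Because different wave modes travel at different velocities, picking arrivals from the wrong mode would bias exactly this kind of calculation, which is one reason mode identification matters.

```python
# Toy arrival-time-difference source location on a plate.

import numpy as np

sensors = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5], [0.0, 0.5]])  # m
velocity = 5300.0                        # assumed single wave speed, m/s
true_source = np.array([0.12, 0.34])

# Simulated arrival times at each sensor, with a little timing noise.
rng = np.random.default_rng(0)
arrivals = np.linalg.norm(sensors - true_source, axis=1) / velocity
arrivals += rng.normal(0.0, 2e-7, size=len(sensors))

def residuals(xy):
    """Mismatch between measured and predicted arrival-time differences,
    all taken relative to the first sensor."""
    pred = np.linalg.norm(sensors - xy, axis=1) / velocity
    return (arrivals - arrivals[0]) - (pred - pred[0])

# Coarse grid search over the plate; a real implementation would refine the
# estimate with nonlinear least squares (e.g. scipy.optimize.least_squares).
grid = np.linspace(0.0, 0.5, 101)
best = min(((x, y) for x in grid for y in grid),
           key=lambda xy: np.sum(residuals(np.array(xy)) ** 2))
print("estimated source:", best, "true source:", true_source)
```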

Relevance: 20.00%

Abstract:

Background: Impairments in upper-body function (UBF) are common following breast cancer. However, the relationship between arm morbidity and quality of life (QoL) remains unclear. This investigation uses longitudinal data to describe UBF in a population-based sample of women with breast cancer and examines its relationship with QoL.
Methods: Australian women (n = 287) with unilateral breast cancer were assessed at three-monthly intervals, from six to 18 months post-surgery (PS). Strength, endurance and flexibility were used to assess objective UBF, while the Disability of the Arm, Shoulder and Hand questionnaire and the Functional Assessment of Cancer Therapy-Breast questionnaire were used to assess self-reported UBF and QoL, respectively.
Results: Although mean UBF improved over time, up to 41% of women showed declines in UBF between six and 18 months PS. Older age, lower socioeconomic position, treatment on the dominant side, mastectomy, more extensive lymph node removal and having lymphoedema each increased the odds of a decline in UBF by at least twofold (p < 0.05). Lower baseline perceived UBF and declines in perceived UBF between six and 18 months PS were each associated with poorer QoL at 18 months PS (p < 0.05).
Conclusions: Significant upper-body morbidity is experienced by many women following breast cancer treatment, persists in the longer term, and adversely influences the QoL of breast cancer survivors.

Relevance: 20.00%

Abstract:

This paper investigates the Cooroy Mill community precinct (Sunshine Coast, Queensland) as a case study, seeking to understand the way local dynamics interplay and work with community strengths to build a governance model of best fit. As we move into an age of ubiquitous computing and creative economies, the definition of public place and its governance take on new dimensions, which, while often drawing on models of the past, will need to acknowledge and adapt to the direction of the future. This paper considers a newly developed community precinct that has been built on three key principles: to foster creative expression with new media, to establish a knowledge economy in a regional area, and to subscribe to principles of community engagement. The study involved qualitative interviews with key stakeholders and a review of common practice models of governance along a spectrum from community control to state control. The paper concludes with a call for governance structures that are locally situated and tailored, inclusive, engaging, dynamic and flexible in order to build community capacity, encourage creativity, and build knowledge economies within emerging digital media cityscapes.

Relevance: 20.00%

Abstract:

There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions that come with each, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states: perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate that process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how the “excess” zeros frequently observed in crash data arise. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed, and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
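
The simulation argument can be illustrated in a few lines of Python: counts generated purely from independent Bernoulli (Poisson) trials, with no "perfectly safe" state at all, still show many more zeros than a single Poisson fit predicts once exposure is low and risk varies between sites. The number of sites, exposure and heterogeneity distribution below are illustrative assumptions, not the paper's experiment.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sites = 5000
exposure = 5000                  # vehicle passages observed per site

# Unequal, site-specific crash probability per passage (unobserved
# heterogeneity across sites), drawn from a skewed distribution.
p = rng.gamma(shape=0.15, scale=6.7e-4, size=n_sites)

# Each site's count is a sum of Bernoulli trials with unequal probabilities.
counts = rng.binomial(exposure, p)

observed_zeros = np.mean(counts == 0)
poisson_zeros = np.exp(-counts.mean())   # P(0) under one Poisson(mean) fit
print(f"observed share of zero-crash sites : {observed_zeros:.3f}")
print(f"expected under a single Poisson fit: {poisson_zeros:.3f}")

# The gap ("excess" zeros) shrinks when heterogeneity is modelled or when the
# time/space scale is chosen so that exposure is adequate -- no dual-state
# (zero-inflated) process is needed to produce it.
```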

Relevance: 20.00%

Abstract:

Rapid prototyping (RP) is a common name for several techniques that read in data from computer-aided design (CAD) drawings and automatically manufacture three-dimensional objects layer by layer according to the virtual design. The use of RP in tissue engineering enables the production of three-dimensional scaffolds with complex geometries and very fine structures. Adding micro- and nanometer-scale details to the scaffolds improves the mechanical properties of the scaffold and ensures better cell adhesion to the scaffold surface. Thus, tissue engineering constructs can be customized according to the data acquired from medical scans to match each patient’s individual needs. In addition, RP enables control of the scaffold porosity, making it possible to fabricate applications with the desired structural integrity. Unfortunately, every RP process has its own unique disadvantages when building tissue engineering scaffolds. Hence, future research should be focused on the development of RP machines designed specifically for the fabrication of tissue engineering scaffolds, although RP methods can already serve as a link between tissue and engineering.
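
As a small illustration of the porosity control mentioned above, the sketch below estimates the porosity of a simple 0/90 lay-down scaffold from the strut diameter and strut spacing specified in the build file, using the common approximation porosity ≈ 1 − πd/(4s) (circular struts, layer height equal to the strut diameter, crossover overlap ignored); the numbers are assumptions chosen for the example.

```python
import math

def laydown_porosity(strut_diameter_um: float, strut_spacing_um: float) -> float:
    """Approximate porosity of a 0/90 lay-down scaffold: 1 - (pi*d)/(4*s)."""
    return 1.0 - math.pi * strut_diameter_um / (4.0 * strut_spacing_um)

# Sweeping the centre-to-centre strut spacing in the virtual design changes
# the porosity the machine will build, without touching the outer geometry.
for spacing in (400, 600, 800, 1000):            # micrometres
    print(f"d = 300 um, s = {spacing} um -> porosity ~ "
          f"{laydown_porosity(300, spacing):.2f}")
```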