335 results for Scintillator applications
Abstract:
In this chapter we continue the exposition of crypto topics that was begun in the previous chapter. This chapter covers secret sharing, threshold cryptography, signature schemes, and finally quantum key distribution and quantum cryptography. As in the previous chapter, we have focused only on the essentials of each topic. We have selected in the bibliography a list of representative items, which can be consulted for further details. First we give a synopsis of the topics that are discussed in this chapter. Secret sharing is concerned with the problem of how to distribute a secret among a group of participating individuals, or entities, so that only predesignated collections of individuals are able to recreate the secret by collectively combining the parts of the secret that were allocated to them. There are numerous practical applications of secret-sharing schemes. One example occurs in banking: the combination to a vault may be distributed in such a way that only specified collections of employees can open the vault by pooling their portions of the combination. In this way the authority to initiate an action, e.g., the opening of a bank vault, is divided for the purposes of providing security and for added functionality, such as auditing, if required. Threshold cryptography is a relatively recent area of cryptography. It deals with situations where the authority to initiate or perform cryptographic operations is distributed among a group of individuals. Many of the standard operations of single-user cryptography have counterparts in threshold cryptography. Signature schemes deal with the problem of generating and verifying (electronic) signatures for documents. A subclass of signature schemes is concerned with the shared generation and shared verification of signatures, where a collaborating group of individuals is required to perform these actions. A new paradigm of security has recently been introduced into cryptography with the emergence of the ideas of quantum key distribution and quantum cryptography. While classical cryptography employs various mathematical techniques to prevent eavesdroppers from learning the contents of encrypted messages, in quantum cryptography the information is protected by the laws of physics.
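The chapter surveys secret sharing without fixing a construction here; as a concrete illustration, the sketch below implements Shamir's (t, n) threshold scheme over a prime field, in which any t of the n shares reconstruct the secret by Lagrange interpolation at zero, while fewer than t reveal nothing. The field size and demo values are illustrative choices, not taken from the text.

```python
import random

PRIME = 2**127 - 1  # demo field size (illustrative choice); the secret must be < PRIME

def make_shares(secret, t, n, prime=PRIME):
    """Split `secret` into n shares so that any t of them reconstruct it."""
    # Random polynomial of degree t-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(prime) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, k, prime) for k, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def recover_secret(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 recovers the polynomial's constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * -xj % prime
                den = den * (xi - xj) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

shares = make_shares(secret=123456789, t=3, n=5)
print(recover_secret(shares[:3]))  # any 3 of the 5 shares recover 123456789
```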
Abstract:
This special issue of Networking Science focuses on the Next Generation Network (NGN), which enables the deployment of access-independent services over converged fixed and mobile networks. NGN is a packet-based network that uses the Internet Protocol (IP) to transport the various types of traffic (voice, video, data and signalling). NGN facilitates easy adoption of distributed computing applications by providing high-speed connectivity in a converged networked environment. It also makes end-user devices and applications highly intelligent and efficient by empowering them with programmability and remote configuration options. However, there are a number of important challenges in provisioning next generation network technologies in a converged communication environment. Preliminary challenges include those relating to QoS, switching and routing, management and control, and security, which must be addressed urgently. The consideration of architectural issues in the design and provision of secure services for NGN deserves special attention and hence is the main theme of this special issue.
Abstract:
Autonomous navigation and picture compilation tasks require robust feature descriptions or models. Given the non-Gaussian nature of sensor observations, it will be shown that Gaussian mixture models provide a general probabilistic representation allowing analytical solutions to the update and prediction operations in the general Bayesian filtering problem. Each operation in the Bayesian filter for Gaussian mixture models multiplicatively increases the number of parameters in the representation, leading to the need for a re-parameterisation step. A computationally efficient re-parameterisation step will be demonstrated, resulting in a compact and accurate estimate of the true distribution.
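The abstract does not spell out the re-parameterisation step itself; a common building block for such mixture reduction is the moment-matching merge of two Gaussian components, sketched below in one dimension with a simple closest-means pairing rule (real schemes typically use more refined similarity measures).

```python
import numpy as np

def merge_components(w1, m1, v1, w2, m2, v2):
    """Moment-matching merge of two 1-D Gaussian components (weight, mean, var):
    the result preserves the pair's total weight, mean, and variance."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    # Variance = weighted within-component variance + between-means spread.
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def reduce_mixture(components, max_components):
    """Greedily merge the pair with the closest means until the mixture is compact."""
    comps = list(components)  # each entry: (weight, mean, variance)
    while len(comps) > max_components:
        comps.sort(key=lambda c: c[1])  # order components by mean
        i = int(np.argmin(np.diff([c[1] for c in comps])))  # closest neighbours
        comps[i:i + 2] = [merge_components(*comps[i], *comps[i + 1])]
    return comps

# Three components reduced to two: the two near-identical modes get merged.
print(reduce_mixture([(0.3, 0.0, 1.0), (0.3, 0.2, 1.0), (0.4, 5.0, 2.0)], 2))
```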
Abstract:
Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as its implementation have to be consistent. Another example is the analysis of the degree to which a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behaviour equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. However, these notions require exponential computation time and yield only a Boolean result. In many cases, a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems, provided unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
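As a rough illustration of the relations such a profile records, the sketch below classifies activity pairs as strict order, reverse order, exclusiveness, or interleaving. Note the simplification: the article computes profiles structurally from the process model, whereas this demo derives them from example traces.

```python
from itertools import combinations

def behavioural_profile(traces):
    """Classify each activity pair as strict order (->), reverse order (<-),
    exclusive (+), or interleaving (||), from a set of example traces.
    (The article derives these relations structurally from the model; causal
    profiles additionally track co-occurrence, omitted here for brevity.)"""
    activities = sorted({a for t in traces for a in t})
    follows = {(a, b) for t in traces
               for i, a in enumerate(t) for b in t[i + 1:]}
    profile = {}
    for a, b in combinations(activities, 2):
        ab, ba = (a, b) in follows, (b, a) in follows
        if ab and ba:
            profile[a, b] = '||'   # observed in both orders: interleaving
        elif ab:
            profile[a, b] = '->'   # strict order
        elif ba:
            profile[a, b] = '<-'   # reverse strict order
        else:
            profile[a, b] = '+'    # never ordered together: exclusiveness
    return profile

print(behavioural_profile([('a', 'b', 'c'), ('a', 'c')]))
```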
Abstract:
Fungi are eukaryotic organisms and are generally considered less adaptable to extreme environments than bacteria. While there are no thermophilic microfungi in a strict sense, some fungi have adapted to life in the cold. Cold-active microfungi have been isolated from the Antarctic and their enzyme activities explored with a view to finding new candidates for industrial use. On another front, environmental pollution by petroleum products in the Antarctic has led to a search for, and the subsequent discovery of, fungal isolates capable of degrading hydrocarbons. The work has paved the way to developing a bioremedial approach to containing this type of contamination in cold climates. Here we discuss our efforts to map the capability of Antarctic microfungi to degrade oil, and we also introduce a novel cold-active fungal lipase enzyme.
Abstract:
Background Kiwifruit (Actinidia spp.) are a relatively new but economically important crop grown in many different parts of the world. Commercial success is driven by the development of new cultivars with novel consumer traits including flavor, appearance, healthful components and convenience. To increase our understanding of the genetic diversity and gene-based control of these key traits in Actinidia, we have produced a collection of 132,577 expressed sequence tags (ESTs). Results The ESTs were derived mainly from four Actinidia species (A. chinensis, A. deliciosa, A. arguta and A. eriantha) and fell into 41,858 non-redundant clusters (18,070 tentative consensus sequences and 23,788 EST singletons). Analysis of flavor- and fragrance-related gene families (acyltransferases and carboxylesterases) and pathways (terpenoid biosynthesis) is presented in comparison with a chemical analysis of the compounds present in Actinidia, including esters, acids, alcohols and terpenes. ESTs are identified for most genes in the color pathways controlling chlorophyll degradation and carotenoid biosynthesis. In the health area, data are presented on the ESTs involved in ascorbic acid and quinic acid biosynthesis, showing not only that genes for many of the steps in these pathways are represented in the database, but also that genes encoding some critical steps are absent. In the convenience area, genes related to different stages of fruit softening are identified. Conclusion This large EST resource will allow researchers to undertake the tremendous challenge of understanding the molecular basis of genetic diversity in the Actinidia genus, as well as providing an EST resource for comparative fruit genomics. The various bioinformatics analyses we have undertaken demonstrate the extent of EST coverage for genes encoding different biochemical pathways in Actinidia.
Abstract:
This thesis developed new search engine models that elicit the meaning behind the words found in documents and queries, rather than simply matching keywords. These new models were applied to searching medical records: an area where search is particularly challenging yet can have significant benefits to our society.
Abstract:
Introduction Since 1992 several articles have been published on research into plastic scintillators for use in radiotherapy. Plastic scintillators are said to be tissue equivalent, temperature independent and dose rate independent [1]. Although their properties were found to be promising for measurements in megavoltage X-ray beams, there were some technical difficulties with regard to their commercialisation. Standard Imaging has produced the first commercial system, which is now available for use in a clinical setting. The Exradin W1 scintillator device uses a dual-fibre system in which one fibre is connected to the plastic scintillator and the other fibre measures only Cerenkov radiation [2]. This paper presents results obtained during commissioning of this dosimeter system. Methods All tests were performed on a Novalis Tx linear accelerator equipped with a 6 MV SRS photon beam and conventional 6 and 18 MV X-ray beams. The following measurements were performed in a Virtual Water phantom at the depth of dose maximum. Linearity: the dose delivered was varied between 0.2 and 3.0 Gy for the same field conditions. Dose rate dependence: the repetition rate of the linac was varied between 100 and 1,000 MU/min, with a nominal dose of 1.0 Gy delivered at each rate. Reproducibility: a total of five irradiations for the same setup. Results The W1 detector gave a highly linear relationship between dose and the number of monitor units delivered for a 10 × 10 cm² field size at an SSD of 100 cm. The linearity was within 1 % at the high-dose end and about 2 % at the very low-dose end. For the dose rate dependence, the dose measured as a function of the repetition rate (100–1,000 MU/min) gave a maximum deviation of 0.9 %. The reproducibility was found to be better than 0.5 %. Discussion and conclusions The results for this new clinically available dosimetry system look promising. However, further investigation is needed to produce a full characterisation prior to use in megavoltage X-ray beams.
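As an illustration of the linearity test described, the sketch below fits a least-squares line to reading-versus-MU data and reports each point's percent deviation from the fit; the numbers are invented stand-ins for demonstration, not the paper's measurements.

```python
import numpy as np

# Hypothetical commissioning data (invented for illustration, not from the paper):
# monitor units delivered and the corresponding W1 detector readings.
mu = np.array([20, 50, 100, 200, 300])                    # MU delivered
reading = np.array([0.204, 0.501, 1.000, 1.998, 3.003])   # detector signal (a.u.)

# Least-squares line through the data; deviation from it quantifies linearity.
slope, intercept = np.polyfit(mu, reading, 1)
fit = slope * mu + intercept
deviation_pct = 100 * (reading - fit) / fit
print(np.round(deviation_pct, 2))  # per-point deviation from perfect linearity
```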
Abstract:
Smartphone technology provides free or inexpensive access to mental health and wellbeing resources, and the use of mobile applications for these purposes has increased significantly in recent years. Yet there is currently no app quality assessment alternative to the popular 'star' ratings, which are often unreliable. This presentation describes the development of the Mobile Application Rating Scale (MARS), a new measure for classifying and rating the quality of mobile applications. A review of existing literature on app and web quality identified 25 published papers, conference proceedings, and online resources (published since 1999), which yielded 372 explicit quality criteria. Qualitative analysis identified five broad categories of app quality rating criteria: engagement, functionality, aesthetics, information quality, and overall satisfaction, which were refined into the 23-item MARS. Independent ratings of 50 randomly selected mental health and wellbeing mobile apps indicated that the MARS had excellent internal consistency (α = 0.92) and inter-rater reliability (ICC = 0.85). The MARS provides practitioners and researchers with an easy-to-use, simple, objective and reliable tool for assessing mobile app quality. It also provides mHealth professionals with a checklist for the design and development of high-quality apps.
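For reference, the internal-consistency statistic reported here (Cronbach's α) is computed from an observations-by-items score matrix as sketched below; the demo matrix is random stand-in data, not actual MARS ratings.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (observations x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
demo = rng.integers(1, 6, size=(50, 23))  # random stand-in: 50 apps x 23 items
print(round(cronbach_alpha(demo), 2))     # random data gives alpha near 0,
                                          # unlike the 0.92 reported for MARS
```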
Abstract:
This thesis investigates the fusion of 3D visual information with 2D image cues to provide 3D semantic maps of the large-scale environments that a robot traverses, for robotic applications. A major theme of this thesis is to exploit the availability of 3D information acquired from robot sensors to improve upon 2D object classification alone. The proposed methods have been evaluated on several indoor and outdoor datasets collected from mobile robotic platforms, including a quadcopter and a ground vehicle, covering several kilometres of urban roads.
Abstract:
Purpose The purpose of this study was to evaluate the validity of the CSA activity monitor as a measure of children's physical activity, using energy expenditure (EE) as a criterion measure. Methods Thirty subjects aged 10 to 14 performed three 5-min treadmill bouts at 3, 4, and 6 mph, respectively. While on the treadmill, subjects wore CSA (WAM 7164) activity monitors on the right and left hips. V̇O₂ was monitored continuously by an automated system. EE was determined by multiplying the average V̇O₂ by the caloric equivalent of the mean respiratory exchange ratio. Results Repeated-measures ANOVA indicated that both CSA monitors were sensitive to changes in treadmill speed. Mean activity counts from the two CSA units were not significantly different, and the intraclass reliability coefficient for the two CSA units across all speeds was 0.87. Activity counts from both CSA units were strongly correlated with EE (r = 0.86 and 0.87, P < 0.001). An EE prediction equation was developed from 20 randomly selected subjects and cross-validated on the remaining 10. The equation predicted mean EE to within 0.01 kcal·min⁻¹. The correlation between actual and predicted values was 0.93 (P < 0.01) and the SEE was 0.93 kcal·min⁻¹. Conclusion These data indicate that the CSA monitor is a valid and reliable tool for quantifying treadmill walking and running in children.
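The published prediction equation itself is not reproduced in the abstract; the sketch below mimics the general procedure described, fitting a linear EE-on-counts regression on 20 subjects and cross-validating on the remaining 10, with synthetic values standing in for the real measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the study's measurements (not real data):
counts = rng.uniform(500, 8000, size=30)                   # CSA counts/min
ee = 1.0 + 0.0006 * counts + rng.normal(0, 0.5, size=30)   # EE, kcal/min

# Develop the equation on 20 subjects, cross-validate on the remaining 10.
train, test = np.arange(20), np.arange(20, 30)
slope, intercept = np.polyfit(counts[train], ee[train], 1)

pred = intercept + slope * counts[test]
r = np.corrcoef(ee[test], pred)[0, 1]          # actual vs. predicted correlation
see = np.sqrt(np.mean((ee[test] - pred) ** 2)) # prediction error, kcal/min
print(f"r = {r:.2f}, SEE = {see:.2f} kcal/min")
```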
Abstract:
The continuum model is a key paradigm for describing electromechanical transients in power systems. Over the past two decades, much research has been done on applying the continuum model to analyse electromechanical waves in power systems. In this work, the uniform and non-uniform continuum models are first briefly described, with some explanations borrowing concepts and tools from other fields. The existing approaches to investigating the resulting wave equations are then summarized. Finally, an application built on the wave-equation view, the zero-reflection controller, is presented.
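The abstract does not write out the wave equations; a schematic form of the uniform continuum (distributed swing) equation often used in this literature is shown below, with the notation assumed for illustration rather than taken from the paper.

```latex
% Schematic uniform continuum swing equation (notation assumed, not from the paper):
% \delta(x,t) rotor angle, m inertia density, d damping, b line susceptance,
% p(x,t) distributed power injection.
\[
  m \, \frac{\partial^2 \delta}{\partial t^2}
  + d \, \frac{\partial \delta}{\partial t}
  = b \, \nabla^2 \delta + p(x,t),
  \qquad
  v = \sqrt{b/m} \quad \text{(electromechanical wave speed)}
\]
```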
Abstract:
This project is a breakthrough in developing new scientific approaches for the design, development and evaluation of inter-vehicle communications, networking and positioning systems as part of Cooperative Intelligent Transportation Systems, ensuring safety on both road and rail networks. The research focused on the elicitation, specification, analysis and validation of requirements for vehicle-to-vehicle communications, networking and positioning, supported by the research platform developed for this study. A number of mathematical models for communications, networking and positioning were developed, from which simulations and field experiments were conducted to evaluate the overall performance of the platform. The outcomes of this research contribute significantly to improving the performance of the communications and positioning components of Cooperative Intelligent Transportation Systems.
Abstract:
This thesis is a study of new design methods that allow evolutionary algorithms to be utilised more effectively in aerospace optimisation applications where computational demands are high and the space available for the computing platform may be restricted. It examines the applicability of special hardware platforms known as field-programmable gate arrays and shows that, with the right implementation methods, they can offer significant benefits. This research is a step towards efficient, highly automated aircraft systems that meet the compact physical constraints of aerospace platforms while providing effective speedups over traditional methods.