547 results for least common subgraph algorithm


Relevance: 20.00%

Publisher:

Abstract:

Stream ciphers are encryption algorithms used for ensuring the privacy of digital telecommunications. They have been widely used for encrypting military communications, satellite communications and pay TV, and for voice encryption on both fixed-line and wireless networks. The current multi-year European project eSTREAM, which aims to select stream ciphers suitable for widespread adoption, reflects the importance of this area of research. Stream ciphers consist of a keystream generator and an output function. Keystream generators produce a sequence that appears to be random, which is combined with the plaintext message using the output function. Most commonly, the output function is binary addition modulo two. Cryptanalysis of these ciphers focuses largely on analysis of the keystream generators and of relationships between the generator and the keystream it produces. Linear feedback shift registers (LFSRs) are widely used components in building keystream generators, as the sequences they produce are well understood. Many types of attack have been proposed for breaking various LFSR-based stream ciphers. A recent attack type is known as an algebraic attack. Algebraic attacks transform the problem of recovering the key into the problem of solving a multivariate system of equations whose solution eventually recovers the internal state bits or the key bits. This type of attack has been shown to be effective on a number of regularly clocked LFSR-based stream ciphers.

In this thesis, algebraic attacks are extended to a number of well-known stream ciphers in which at least one LFSR in the system is irregularly clocked. Applying algebraic attacks to such ciphers has previously been discussed in the open literature only for LILI-128. In this thesis, algebraic attacks are first applied to keystream generators using stop-and-go clocking. Four ciphers belonging to this group are investigated: the Beth-Piper stop-and-go generator, the alternating step generator, the Gollmann cascade generator and the eSTREAM candidate Pomaranch cipher. It is shown that algebraic attacks are very effective on the first three of these ciphers. Although no effective algebraic attack was found for Pomaranch, the algebraic analysis led to some interesting findings, including weaknesses that may be exploited in future attacks. Algebraic attacks are then applied to keystream generators using (p, q) clocking. Two well-known examples of such ciphers, the step1/step2 generator and the self-decimated generator, are investigated. Algebraic attacks are shown to be very powerful in recovering the internal state of these generators.

A more complex clocking mechanism than either stop-and-go or (p, q) clocking is mutual clock control, in which the LFSRs control the clocking of each other. Four well-known stream ciphers belonging to this group are investigated with respect to algebraic attacks: the bilateral stop-and-go generator, the A5/1 stream cipher, the Alpha 1 stream cipher, and the more recent eSTREAM proposal, the MICKEY family of stream ciphers. Some theoretical results regarding the complexity of algebraic attacks on these ciphers are presented. The algebraic analysis showed that, in general, it is hard to generate the system of equations required for an algebraic attack on these ciphers. As the algebraic attack could not be applied directly to these ciphers, a different approach was used, namely guessing some bits of the internal state in order to reduce the degree of the equations. Finally, an algebraic attack on Alpha 1 that requires only 128 bits of keystream to recover the 128 internal state bits is presented.

An essential process associated with stream cipher proposals is key initialization. Many recently proposed stream ciphers use an algorithm to initialize the large internal state with a smaller key and possibly publicly known initialization vectors. The effect of key initialization on the performance of algebraic attacks is also investigated in this thesis; the relationship between the two has not previously been investigated in the open literature. The investigation is conducted on Trivium and Grain-128, two eSTREAM ciphers. It is shown that the key initialization process has an effect on the success of algebraic attacks, unlike other conventional attacks. In particular, the key initialization process allows an attacker first to generate a small number of equations of low degree and then to perform an algebraic attack using multiple keystreams. The effect of the number of iterations performed during key initialization is investigated, and it is shown that both the number of iterations and the maximum number of initialization vectors to be used with one key should be carefully chosen. Some experimental results on Trivium and Grain-128 are then presented.

Finally, the security with respect to algebraic attacks of the well-known LILI family of stream ciphers, including the unbroken LILI-II, is investigated. These are irregularly clock-controlled nonlinear filtered generators. While the structure is defined for the LILI family, a particular parameter choice defines a specific instance; two well-known instances are LILI-128 and LILI-II. The security of these and other instances is investigated to identify which instances are vulnerable to algebraic attacks. The feasibility of recovering the key bits using algebraic attacks is then investigated for both LILI-128 and LILI-II. Algebraic attacks that recover the internal state with less effort than exhaustive key search are possible for LILI-128 but not for LILI-II. Given the internal state at some point in time, the feasibility of recovering the key bits is also investigated, showing that the parameters used in the key initialization process, if poorly chosen, can lead to key recovery using algebraic attacks.
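
To make the equation-solving idea concrete, the sketch below builds the multivariate system for the simplest possible case: a single, regularly clocked LFSR with a linear output, where the system is linear over GF(2) and Gaussian elimination recovers the initial state from a short window of observed keystream. The register length, tap positions and observation window are illustrative only; they are not parameters of any cipher named above, and real algebraic attacks must additionally cope with nonlinear filters and irregular clocking, which raise the degree of the equations.

```python
# Minimal sketch of the equation-building step behind algebraic attacks on
# LFSR-based keystream generators.  All parameters are illustrative.
import random

random.seed(2024)

N = 16                       # register length (illustrative)
TAPS = [0, 2, 3, 5]          # feedback taps; tap 0 keeps the update invertible (illustrative)
SKIP = 100                   # the attacker only sees keystream bits 100..115


def keystream(initial_state, skip, nbits):
    """Clock the LFSR and return nbits of output after skipping `skip` bits."""
    state = list(initial_state)
    out = []
    for t in range(skip + nbits):
        bit = state[0]                         # output the leading cell
        fb = 0
        for tap in TAPS:
            fb ^= state[tap]
        state = state[1:] + [fb]               # shift left, append feedback bit
        if t >= skip:
            out.append(bit)
    return out


def keystream_equations(skip, nbits):
    """Track every cell as a GF(2) coefficient vector over the unknown initial
    bits s_0..s_{N-1}; each observed output bit then yields one linear equation."""
    cells = [[int(i == j) for j in range(N)] for i in range(N)]
    rows = []
    for t in range(skip + nbits):
        row = list(cells[0])
        fb = [0] * N
        for tap in TAPS:
            fb = [a ^ b for a, b in zip(fb, cells[tap])]
        cells = cells[1:] + [fb]
        if t >= skip:
            rows.append(row)
    return rows


def solve_gf2(rows, rhs):
    """Gaussian elimination over GF(2) for a square, full-rank system."""
    aug = [r[:] + [b] for r, b in zip(rows, rhs)]
    for col in range(N):
        piv = next(i for i in range(col, N) if aug[i][col])
        aug[col], aug[piv] = aug[piv], aug[col]
        for i in range(N):
            if i != col and aug[i][col]:
                aug[i] = [a ^ b for a, b in zip(aug[i], aug[col])]
    return [aug[i][-1] for i in range(N)]


secret = [random.randint(0, 1) for _ in range(N)]
observed = keystream(secret, SKIP, N)          # what the attacker gets to see
A = keystream_equations(SKIP, N)               # built offline, independent of the key
recovered = solve_gf2(A, observed)
print("recovered initial state correctly:", recovered == secret)
```

The point of the symbolic step is that the coefficient matrix can be built offline, before any keystream is observed; the observed bits only supply the right-hand side of the system.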

Relevance: 20.00%

Publisher:

Abstract:

Acute lower respiratory tract infections (ALRTIs) are a common cause of morbidity and mortality among children under 5 years of age worldwide, with pneumonia as the most severe manifestation. Although the incidence of severe disease varies both between individuals and between countries, there is still no clear understanding of what causes this variation. Studies of community-acquired pneumonia (CAP) have traditionally not focused on viral causes of disease due to a paucity of diagnostic tools. However, with the emergence of molecular techniques, it is now known that viruses outnumber bacteria as the etiological agents of childhood CAP, especially in children under 2 years of age. The main objective of this study was to investigate viruses contributing to disease severity in cases of childhood ALRTI, using a two-year cohort study following 2014 infants and children enrolled in Bandung, Indonesia. A total of 352 nasopharyngeal washes collected from 256 paediatric ALRTI patients were used for analysis. A subset of samples was screened using a novel microarray pathogen detection method that identified respiratory syncytial virus (RSV), human metapneumovirus (hMPV) and human rhinovirus (HRV) in the samples. Real-time RT-PCR was used both to confirm and to quantify the viruses found in the nasopharyngeal samples. Viral copy numbers were determined and normalised to the number of human cells collected, using 18S rRNA. Molecular epidemiology was performed for RSV A and hMPV using sequences of the glycoprotein gene and the nucleoprotein gene, respectively, to determine the genotypes circulating in this Indonesian paediatric cohort. This study found that HRV (119/352; 33.8%) was the most common virus detected as the cause of respiratory tract infections in this cohort, followed by RSV A (73/352; 20.7%), hMPV (30/352; 8.5%) and RSV B (12/352; 3.4%). Co-infections with more than one virus were detected in 31 episodes (episodes were defined as infections occurring more than two weeks apart), accounting for 8.8% of the 352 samples tested or 15.4% of the 201 episodes with at least one virus detected. The RSV A genotypes circulating in this population were predominantly GA2, GA5 and GA7, while the hMPV genotypes circulating were mainly A2a (27/30; 90.0%), B2 (2/30; 6.7%) and A1 (1/30; 3.3%). This study found no evidence of disease severity being associated with a specific virus or viral strain, or with viral load. However, it did find a significant association between co-infection with RSV A and HRV and severe disease (P = 0.006), suggesting that this may be a novel cause of severe disease.
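
As one concrete (and purely illustrative) reading of the normalisation step, the snippet below converts a raw qPCR viral copy number into copies per 1,000 human cells using the 18S rRNA signal as a proxy for cell count. The constant for 18S copies per cell and the sample values are hypothetical placeholders; the study's exact normalisation procedure is not given in the abstract.

```python
# Sketch of normalising qPCR-derived viral copy numbers to human cell numbers.
# COPIES_18S_PER_CELL and the example inputs are hypothetical, not study values.

COPIES_18S_PER_CELL = 100.0   # assumed 18S rRNA gene copies per human cell (illustrative)

def viral_load_per_1000_cells(viral_copies: float, rrna_18s_copies: float) -> float:
    """Normalise a raw viral copy number to copies per 1,000 human cells."""
    cells = rrna_18s_copies / COPIES_18S_PER_CELL
    return 1000.0 * viral_copies / cells

# Hypothetical sample: 2.4e5 RSV A copies and 7.5e6 18S copies in one wash.
print(viral_load_per_1000_cells(2.4e5, 7.5e6))
```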

Relevance: 20.00%

Publisher:

Abstract:

When the supply voltages are balanced and sinusoidal, load compensation can give source currents with both unity power factor (UPF) and perfect harmonic cancellation (PHC). Under distorted supply voltages, however, achieving both UPF and PHC currents is not possible: the two goals contradict each other, and a compromise must be found between these two important compensation objectives. This paper presents an optimal control algorithm for load compensation under unbalanced and distorted supply voltages. In this algorithm, the source currents are compensated for reactive and imbalance components, with harmonic distortion kept within set limits. By satisfying the harmonic distortion limits and the power balance, the algorithm yields the source currents that provide the maximum achievable power factor. Detailed simulation results using MATLAB are presented to demonstrate the performance of the proposed optimal control algorithm.
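
The trade-off described above can be illustrated numerically: with a distorted supply, a purely sinusoidal source current (the PHC goal) cannot reach unity power factor, while a UPF current must mirror the voltage distortion and so violates harmonic limits. The sketch below uses a made-up voltage with a single 5th harmonic and a hypothetical 5% current-THD limit, not the paper's test system or its optimisation method, to show how the maximum achievable power factor within the limit is found.

```python
# Illustrative power-factor calculation under a distorted supply voltage.
# Voltage content, load power and the THD limit are invented numbers.
import numpy as np

V1, V5 = 230.0, 18.4          # RMS fundamental and 5th-harmonic voltage (about 8% THD)
P_LOAD = 10_000.0             # active power to be delivered, W (hypothetical)
THD_LIMIT = 0.05              # allowed current THD (e.g. a 5% limit)

def power_factor(i5_over_i1):
    """PF when the source current carries a 5th harmonic, in phase with the
    voltage harmonic, at the given ratio to the fundamental current."""
    i1 = P_LOAD / (V1 + V5 * i5_over_i1)       # scale I1 so active power equals P_LOAD
    i5 = i5_over_i1 * i1
    p = V1 * i1 + V5 * i5                      # only like harmonics carry active power
    return p / (np.hypot(V1, V5) * np.hypot(i1, i5))

print("PHC current (no harmonics): PF =", round(power_factor(0.0), 4))
print("UPF current (mirrors voltage): PF =", round(power_factor(V5 / V1), 4))

# Best PF achievable while respecting the THD limit: scan the allowed ratios.
ratios = np.linspace(0.0, THD_LIMIT, 201)
best = max(ratios, key=power_factor)
print("within THD limit: harmonic ratio =", round(float(best), 3),
      "PF =", round(float(power_factor(best)), 4))
```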

Relevance: 20.00%

Publisher:

Abstract:

An assessment of the potential of Family Day Care as a nutrition promotion setting in South Australia (Original Research). Daniels, Lynne A.; Franco, Bunny; McWhinnie, Julie-Anne. Nutrition & Dietetics: The Journal of the Dietitians Association of Australia, March 2003.

Abstract. Objective: To assess the potential role of Family Day Care in nutrition promotion for preschool children. Design and setting: A questionnaire examining nutrition-related issues and practices was mailed to care providers registered in the southern region of Adelaide, South Australia. Care providers also supplied a descriptive, qualitative recall of the food provided by parents or themselves to each child less than five years of age in their care on the day closest to completion of the questionnaire. Subjects: 255 care providers. The response rate was 63% and covered 643 preschool children, a mean of 4.6 (SD 2.8) children per carer. Results: There was clear agreement that nutrition promotion was a relevant issue for Family Day Care providers. Nutrition and food hygiene knowledge was good, but only 54% of respondents felt confident to address food quality issues with parents. Sixty-five percent of respondents reported non-neutral approaches to food refusal and dawdling (reward, punishment, cajoling) that overrode the child's control of the amount eaten. The food recalls indicated that most children (>75%) were offered fruit at least once. Depending on the hours in care (0 to 4, 5 to 8, more than 8 hours), 20%, 32% and 55% of children, respectively, were offered milk, and 65%, 82% and 87%, respectively, were offered high-fat and high-sugar foods. Conclusions: Questionnaire responses suggest that many care providers are committed to and proactive in a range of nutrition promotion activities. There is scope for strengthening skills in the management of common problems, such as food refusal and dawdling, consistent with the current evidence for approaches to early feeding management that promote the development of healthy food preferences and eating patterns. Legitimising and empowering care providers in their nutrition promotion role requires clear policies, guidelines, adequate pre- and in-service training, suitable parent materials, and monitoring.

Relevance: 20.00%

Publisher:

Abstract:

The task addressed in this thesis is the automatic alignment of an ensemble of misaligned images in an unsupervised manner. This capability is especially useful in computer vision applications where annotations of the shape of an object of interest, present in a collection of images, are required. Performing this task manually is a slow, tedious, expensive and error-prone process which hinders the progress of research laboratories and businesses. Most recently, the unsupervised removal of geometric variation present in a collection of images has been referred to as congealing, based on the seminal work of Learned-Miller [21]. The only assumptions made in congealing are that the parametric nature of the misalignment is known a priori (e.g. translation, similarity, affine, etc.) and that the object of interest is guaranteed to be present in each image. The capability to congeal an ensemble of misaligned images stemming from the same object class has numerous applications in object recognition, detection and tracking. This thesis concerns itself with the construction of a congealing algorithm, titled least-squares congealing, which is inspired by the well-known image-to-image alignment algorithm developed by Lucas and Kanade [24]. The algorithm is shown to have superior performance characteristics when compared to previously established methods: canonical congealing by Learned-Miller [21] and stochastic congealing by Zöllei [39].
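
A minimal numerical sketch of the least-squares congealing objective is given below, reduced to one-dimensional signals and a single translation parameter per signal so that the Lucas-Kanade-style Gauss-Newton update is easy to follow. The signals, parameter ranges and iteration count are invented for illustration; the thesis itself operates on images with richer warp models.

```python
# Toy least-squares congealing: align each signal to the leave-one-out mean
# of the others by least squares over a translation parameter.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-4.0, 4.0, 200)

def observe(shift):
    """A smooth bump seen at an unknown translation, with a little noise."""
    return np.exp(-((x - shift) ** 2) / 2.0) + 0.01 * rng.standard_normal(x.size)

true_shifts = rng.uniform(-1.0, 1.0, size=8)
ensemble = [observe(s) for s in true_shifts]

t = np.zeros(len(ensemble))                  # current translation estimate per signal
for _ in range(30):                          # congealing sweeps
    warped = [np.interp(x + t[i], x, ensemble[i]) for i in range(len(ensemble))]
    for i in range(len(ensemble)):
        target = np.mean([w for j, w in enumerate(warped) if j != i], axis=0)
        r = warped[i] - target               # residual to the leave-one-out mean
        J = np.gradient(warped[i], x)        # d(warped)/d(translation)
        t[i] -= (J @ r) / (J @ J + 1e-9)     # scalar Gauss-Newton update
        warped[i] = np.interp(x + t[i], x, ensemble[i])

# Absolute alignment is unobservable, so compare shifts relative to their mean.
est = t - t.mean()
ref = true_shifts - true_shifts.mean()
print("worst residual misalignment:", float(np.round(np.abs(est - ref).max(), 3)))
```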

Relevance: 20.00%

Publisher:

Abstract:

One major gap in transportation system safety management is the lack of an ability to assess the safety ramifications of design changes, both for new road projects and for modifications to existing roads. To fill this need, FHWA and its many partners are developing a safety forecasting tool, the Interactive Highway Safety Design Model (IHSDM). The tool will be used by roadway design engineers, safety analysts, and planners throughout the United States. As such, the statistical models embedded in IHSDM will need to forecast safety impacts under a wide range of roadway configurations and environmental conditions for a wide range of driver populations, and will need to capture elements of driving risk across states. One of the IHSDM algorithms developed by FHWA and its contractors forecasts accidents on rural road segments and at rural intersections. The methodological approach is to use predictive models for specific base conditions, with traffic volume information as the sole explanatory variable for crashes, and then to apply regional or state calibration factors and accident modification factors (AMFs) to estimate the impact on accidents of geometric characteristics that differ from the base model conditions. In the majority of past approaches, AMFs are derived from parameter estimates associated with the explanatory variables. A recent study for FHWA used a multistate database to examine in detail the use of the algorithm with the base model-AMF approach, and explored alternative base model forms as well as full models that include non-traffic-related variables and other approaches to estimating AMFs. That research effort is reported here. The results support the IHSDM methodology.
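
The prediction chain described above (a base model driven by traffic volume only, scaled by a calibration factor and by AMFs for features that differ from base conditions) can be written schematically as follows. Every functional form and number in this sketch is an illustrative placeholder, not a coefficient from IHSDM or the FHWA study.

```python
# Schematic base-model x calibration x AMF crash prediction.  Illustrative only.
import math

def predicted_crashes_per_year(aadt, segment_length_mi, calibration, amfs):
    """Expected crash frequency for a rural road segment (placeholder model)."""
    base = math.exp(-9.0) * aadt ** 0.9 * segment_length_mi   # base model: AADT and length only
    for amf in amfs:                                          # e.g. lane width, shoulder, curvature
        base *= amf
    return calibration * base

# Hypothetical segment: 4,000 veh/day, 2.5 mi, state calibration 1.1,
# narrow lanes (AMF 1.15) and paved shoulders (AMF 0.95).
print(round(predicted_crashes_per_year(4000, 2.5, 1.1, [1.15, 0.95]), 2))
```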

Relevance: 20.00%

Publisher:

Abstract:

Most statistical methods use hypothesis testing. Analysis of variance, regression, discrete choice models, contingency tables, and other analysis methods commonly used in transportation research share hypothesis testing as the means of making inferences about the population of interest. Despite the fact that hypothesis testing has been a cornerstone of empirical research for many years, various aspects of hypothesis tests are commonly applied incorrectly, misinterpreted, or ignored, by novices and expert researchers alike. At first glance, hypothesis testing appears straightforward: develop the null and alternative hypotheses, compute the test statistic, compare it to a standard distribution, estimate the probability of rejecting the null hypothesis, and then make claims about the importance of the finding. This, however, is an oversimplification of the process of hypothesis testing. Hypothesis testing as applied in empirical research is examined here. The reader is assumed to have a basic knowledge of the role of hypothesis testing in various statistical methods. Through the use of an example, the mechanics of hypothesis testing are first reviewed. Then, five precautions surrounding the use and interpretation of hypothesis tests are developed; examples of each are provided to demonstrate how errors are made, and solutions are identified so that similar errors can be avoided. Remedies are provided for common errors, and conclusions are drawn on how to use the results of this paper to improve the conduct of empirical research in transportation.
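
A compact worked example of the mechanics being reviewed, using simulated travel-time data and a two-sample t-test, is shown below. The data and the scenario are invented; the point is the sequence of steps and two of the cautions that go with it.

```python
# Worked hypothesis-testing example on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
plan_a = rng.normal(loc=31.0, scale=4.0, size=40)   # travel times (s) under plan A
plan_b = rng.normal(loc=29.5, scale=4.0, size=40)   # travel times (s) under plan B

# H0: the mean travel times are equal.  H1: they differ (two-sided test).
t_stat, p_value = stats.ttest_ind(plan_a, plan_b, equal_var=False)
alpha = 0.05
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
print("reject H0" if p_value < alpha else "fail to reject H0")

# Two common precautions: a p-value is not the probability that H0 is true,
# and "fail to reject" is not evidence that the means are equal; report the
# effect size and its precision alongside the test.
diff = plan_a.mean() - plan_b.mean()
se = np.sqrt(plan_a.var(ddof=1) / plan_a.size + plan_b.var(ddof=1) / plan_b.size)
print(f"estimated difference: {diff:.2f} s (SE {se:.2f} s)")
```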

Relevance: 20.00%

Publisher:

Abstract:

Digital collections are growing exponentially in size as the information age takes a firm grip on all aspects of society. As a result, Information Retrieval (IR) has become an increasingly important area of research. It promises to provide new and more effective ways for users to find information relevant to their search intentions. Document clustering is one of the many tools in the IR toolbox and is far from being perfected. It groups documents that share common features. This grouping allows a user to quickly identify relevant information. If these groups are misleading then valuable information can accidentally be ignored. Therefore, the study and analysis of the quality of document clustering is important. With more and more digital information available, the performance of these algorithms is also of interest. An algorithm with a time complexity of O(n²) can quickly become impractical when clustering a corpus containing millions of documents. Therefore, the investigation of algorithms and data structures to perform clustering in an efficient manner is vital to its success as an IR tool. Document classification is another tool frequently used in the IR field. It predicts categories of new documents based on an existing database of (document, category) pairs. Support Vector Machines (SVM) have been found to be effective when classifying text documents. As the algorithms for classification are both efficient and of high quality, the largest gains can be made from improvements to representation. Document representations are vital for both clustering and classification. Representations exploit the content and structure of documents. Dimensionality reduction can improve the effectiveness of existing representations in terms of quality and run-time performance. Research into these areas is another way to improve the efficiency and quality of clustering and classification results. Evaluating document clustering is a difficult task. Intrinsic measures of quality such as distortion only indicate how well an algorithm minimised a similarity function in a particular vector space. Intrinsic comparisons are inherently limited by the given representation and are not comparable between different representations. Extrinsic measures of quality compare a clustering solution to a “ground truth” solution. This allows comparison between different approaches. As the “ground truth” is created by humans it can suffer from the fact that not every human interprets a topic in the same manner. Whether a document belongs to a particular topic or not can be subjective.
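
The pieces discussed above (a document representation, clustering evaluated both intrinsically and extrinsically, dimensionality reduction, and SVM classification) can be strung together in a few lines of scikit-learn. The toy corpus and its labels are invented and stand in for a "ground truth"; nothing here reproduces the thesis's experiments.

```python
# TF-IDF representation, k-means clustering with intrinsic/extrinsic evaluation,
# LSA dimensionality reduction, and an SVM classifier on a toy corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score
from sklearn.svm import LinearSVC

docs = [
    "the goalkeeper saved a penalty in the final",
    "the striker scored twice before half time",
    "the midfield pressed high and won the match",
    "interest rates rose and the shares fell sharply",
    "the central bank warned about inflation and growth",
    "quarterly profits beat forecasts and the stock rallied",
]
labels = [0, 0, 0, 1, 1, 1]          # stand-in "ground truth": sport vs finance

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)

# Optional dimensionality reduction (LSA) before clustering.
X_red = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_red)
print("intrinsic (distortion):", round(float(km.inertia_), 3))
print("extrinsic (adjusted Rand):", adjusted_rand_score(labels, km.labels_))

# The same representation feeds a linear SVM classifier.
clf = LinearSVC().fit(X, labels)
print("prediction:", clf.predict(tfidf.transform(["the striker scored a late penalty"])))
```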

Relevance: 20.00%

Publisher:

Abstract:

Objective: Flooding is the most common natural disaster in Australia and causes more loss of life than any other disaster. This article describes the incidence and causes of deaths directly associated with floods in contemporary Australia.

Methods: The present study compiled a database of flood fatalities in Australia for the period 1997–2008 inclusive. The data were derived from newspapers and historical accounts, as well as government and scientific reports. The assembled data include the date and location of fatalities, the age and gender of victims, and the circumstances of each death.

Results: At least 73 persons died as a direct result of floods in Australia in the period 1997–2008. The largest numbers of fatalities occurred in New South Wales and Queensland. Most fatalities occurred during February and among men (71.2%). People between the ages of 10 and 29 and those over 70 years are overrepresented among those drowned. There is no evident decline in the number of deaths over time. Of the fatalities, 48.5% were related to motor vehicle use and 26.5% occurred as a result of inappropriate or high-risk behaviour during floods.

Conclusion: In modern developed countries with adequate emergency response systems and extensive resources, deaths that occur in floods are almost all eminently preventable. Over 90% of the deaths were caused by attempts to ford flooded waterways or by inappropriate situational conduct. Knowledge of the leading causes of flood fatalities should inform public awareness programmes and police enforcement activities aimed at public safety.

Relevance: 20.00%

Publisher:

Abstract:

This paper describes the optimization of conductor size and of voltage regulator location and magnitude for long rural distribution lines. The optimization minimizes the lifetime cost of the lines, including capital costs and losses, while observing voltage drop and operational constraints, using a Genetic Algorithm (GA). The GA optimization is applied to a real Single Wire Earth Return (SWER) network in regional Queensland, and results are presented.
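
A toy version of this kind of search is sketched below: a genetic algorithm picks a conductor size for each section of a radial feeder and a location for one voltage regulator, minimising a lifetime cost (capital plus losses) with a penalty for violating a voltage-drop limit. The feeder data, conductor catalogue, cost figures and GA settings are all invented; they are not the SWER network or the formulation used in the paper.

```python
# Toy GA for conductor sizing and regulator placement on a radial feeder.
# All data and GA settings are invented for illustration.
import random

random.seed(1)

SECTIONS = 10                 # feeder sections, 5 km each (hypothetical)
KM = 5.0
LOAD_A = 4.0                  # current drawn at the end of each section, amps
V_SOURCE, V_MIN = 19100.0, 18000.0
BOOST = 700.0                 # fixed regulator boost, volts
LOSS_COST = 8.0               # lifetime cost of losses, $ per watt (hypothetical)
REG_COST = 40_000.0

# (name, resistance ohm/km, capital $/km) -- illustrative catalogue
CONDUCTORS = [("small", 2.0, 3000.0), ("medium", 1.1, 4500.0), ("large", 0.55, 7000.0)]


def evaluate(genome):
    """Lifetime cost of a design; infeasible designs attract a large penalty."""
    sizes, reg_at = genome[:SECTIONS], genome[SECTIONS]
    cost = REG_COST if reg_at < SECTIONS else 0.0
    volts, penalty = V_SOURCE, 0.0
    for i in range(SECTIONS):
        _, r_per_km, cap_per_km = CONDUCTORS[sizes[i]]
        amps = LOAD_A * (SECTIONS - i)            # current carried by section i
        r = r_per_km * KM
        volts -= amps * r
        if reg_at == i:
            volts += BOOST
        cost += cap_per_km * KM + LOSS_COST * amps * amps * r
        penalty += max(0.0, V_MIN - volts)        # voltage-drop violation at this node
    return cost + 1000.0 * penalty


def random_genome():
    return [random.randrange(len(CONDUCTORS)) for _ in range(SECTIONS)] \
        + [random.randrange(SECTIONS + 1)]        # SECTIONS means "no regulator"


def mutate(g):
    g = g[:]
    i = random.randrange(len(g))
    g[i] = random.randrange(SECTIONS + 1) if i == SECTIONS else random.randrange(len(CONDUCTORS))
    return g


def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]


pop = [random_genome() for _ in range(60)]
for _ in range(150):
    pop.sort(key=evaluate)
    parents = pop[:20]                            # simple truncation selection
    pop = parents + [mutate(crossover(*random.sample(parents, 2))) for _ in range(40)]

best = min(pop, key=evaluate)
print("sizes:", [CONDUCTORS[i][0] for i in best[:SECTIONS]])
print("regulator after section:", best[SECTIONS], "(", SECTIONS, "= none )")
print("lifetime cost: $", round(evaluate(best)))
```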

Relevance: 20.00%

Publisher:

Abstract:

A statistical modeling method to accurately determine combustion chamber resonance is proposed and demonstrated. The method uses Markov chain Monte Carlo (MCMC), via the Metropolis-Hastings (MH) algorithm, to yield a probability density function for the combustion chamber resonant frequency and to find the best estimate of that frequency along with its uncertainty. The accurate determination of combustion chamber resonance is then used to investigate various engine phenomena, with appropriate uncertainty, for a range of engine cycles. It is shown that, when operating on various ethanol/diesel fuel combinations, a 20% ethanol substitution yields the least inter-cycle variability in combustion chamber resonance.
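
The estimation idea can be illustrated with a toy signal: draw posterior samples of a resonant frequency from a noisy, decaying oscillation using a random-walk Metropolis-Hastings sampler, and report the posterior mean and spread as the estimate and its uncertainty. The synthetic signal, the flat prior and the proposal width below are illustrative choices, not the paper's in-cylinder pressure model.

```python
# Random-walk Metropolis-Hastings over a single resonant-frequency parameter.
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "resonance" record: a decaying sinusoid in Gaussian noise.
FS = 20_000.0                          # sample rate, Hz
t = np.arange(0, 0.02, 1.0 / FS)       # 20 ms window
TRUE_F, DECAY, AMP, SIGMA = 5500.0, 150.0, 1.0, 0.3
y = AMP * np.exp(-DECAY * t) * np.sin(2 * np.pi * TRUE_F * t) + SIGMA * rng.standard_normal(t.size)


def log_posterior(f):
    """Gaussian log-likelihood plus a flat prior on a plausible frequency band."""
    if not (3000.0 < f < 9000.0):
        return -np.inf
    model = AMP * np.exp(-DECAY * t) * np.sin(2 * np.pi * f * t)
    return -0.5 * np.sum((y - model) ** 2) / SIGMA ** 2


# Start the chain from a coarse spectral estimate, then random-walk MH.
freqs = np.fft.rfftfreq(t.size, 1.0 / FS)
f = float(freqs[np.argmax(np.abs(np.fft.rfft(y)))])
log_p = log_posterior(f)
samples = []
for _ in range(20_000):
    prop = f + 5.0 * rng.standard_normal()         # proposal std of 5 Hz
    log_p_prop = log_posterior(prop)
    if np.log(rng.random()) < log_p_prop - log_p:  # MH accept/reject step
        f, log_p = prop, log_p_prop
    samples.append(f)

post = np.array(samples[5000:])                    # discard burn-in
print(f"resonant frequency: {post.mean():.1f} Hz +/- {post.std():.1f} Hz (true {TRUE_F} Hz)")
```

The posterior spread printed at the end plays the role of the "uncertainty" the abstract refers to: it follows directly from the sampled density rather than from a separate error analysis.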

Relevance: 20.00%

Publisher:

Abstract:

This investigation describes the prevalence of upper-body symptoms in a population-based sample of women with breast cancer (BC) and examines their relationships with upper-body function (UBF) and lymphoedema, two clinically important sequelae. Australian women (n=287) with unilateral BC were assessed at three-monthly intervals from six to 18 months post-surgery (PS). Participants reported the presence and intensity of upper-body symptoms on the treated side. Objective and self-reported UBF and lymphoedema (bioimpedance spectroscopy) were also assessed. Approximately 50% of women reported at least one moderate-to-extreme symptom at 6 and at 18 months PS. There was a significant relationship between symptoms and function (p<0.01), whereby perceived and objective function declined as the number of symptoms present increased. Those with lymphoedema were more likely to report multiple symptoms, and the presence of symptoms at baseline increased the risk of lymphoedema (ORs>1.3, p=0.02), although the presence of symptoms explained only 5.5% of the variation in the odds of lymphoedema. Upper-body symptoms are common and persistent following breast cancer and are associated with clinical ramifications, including reduced UBF and increased risk of developing lymphoedema. However, using the presence of symptoms as a diagnostic indicator of lymphoedema is limited.

Relevance: 20.00%

Publisher:

Abstract:

Railway service is now a major means of transportation in many countries around the world. With increasing populations and expanding commercial and industrial activity, high-quality railway service is highly desirable. Train service usually varies with population activity throughout the day, and train coordination and service regulation are expected to meet daily passenger demand. Dwell-time control at stations and a fixed coasting point in each inter-station run are the current practices for regulating train service in most metro railway systems. However, flexible and efficient train control and operation are not always possible with these practices. When running time is not critical, particularly at off-peak hours, coast control is an economical way to balance run time and energy consumption, minimizing the energy used in train operation while making acceptable compromises on the train schedule. The capability to identify the starting point for coasting according to the current traffic conditions provides the necessary flexibility for train operation. This paper presents an application of genetic algorithms (GA) to search for the appropriate coasting point(s) and investigates possible improvements in the fitness of genes. Single and multiple coasting point control with a simple GA are developed to obtain solutions, and the corresponding train movements are examined. Further, a hierarchical genetic algorithm (HGA) is introduced to identify the number of coasting points required according to the traffic conditions, and the Minimum-Allele-Reserve-Keeper (MARK) is adopted as a genetic operator to achieve fitter solutions.
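
A stripped-down version of the single-coasting-point search is sketched below: a simple GA chooses where a point-mass "train" should shut off traction so that traction energy is minimised while a scheduled run time is still met. The train model, all numerical parameters and the GA settings are invented for illustration; the paper's multi-point control, HGA and MARK operator are not reproduced here.

```python
# Toy GA for choosing a single coasting point on one inter-station run.
# Train model and all parameters are invented for illustration.
import random

random.seed(3)

DIST = 2000.0        # inter-station distance, m
V_MAX = 22.0         # cruise speed, m/s
ACC = 0.8            # traction acceleration, m/s^2
BRAKE = 0.9          # service braking, m/s^2
RESIST = 0.05        # deceleration while coasting, m/s^2
MASS = 2.0e5         # train mass, kg
SCHEDULE = 120.0     # allowed run time, s
DT = 0.1


def simulate(coast_at):
    """Run one trip; return (traction energy in kWh, run time in s)."""
    x, v, t, energy = 0.0, 0.0, 0.0, 0.0
    while x < DIST:
        if DIST - x <= v * v / (2.0 * BRAKE):
            a = -BRAKE                                   # brake into the station
        elif x < coast_at and v < V_MAX:
            a = ACC                                      # power on
            energy += MASS * ACC * v * DT                # traction work, joules
        elif x < coast_at:
            a = 0.0                                      # hold cruise speed
            energy += MASS * RESIST * v * DT             # work against resistance
        else:
            a = -RESIST                                  # coasting, traction off
        v = max(0.0, v + a * DT)
        if v == 0.0 and t > 0.0:
            break                                        # train has stopped
        x += v * DT
        t += DT
    if DIST - x > 5.0:                                   # stalled well short of the platform
        return energy / 3.6e6, float("inf")
    return energy / 3.6e6, t


def fitness(coast_at):
    energy, time = simulate(coast_at)
    return energy + 10.0 * max(0.0, time - SCHEDULE)     # heavy penalty for late arrival


pop = [random.uniform(200.0, DIST) for _ in range(30)]
for _ in range(40):
    pop.sort(key=fitness)
    parents = pop[:10]
    children = []
    for _ in range(20):
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b) + random.gauss(0.0, 50.0)  # blend crossover plus mutation
        children.append(min(max(child, 100.0), DIST))
    pop = parents + children

best = min(pop, key=fitness)
energy, time = simulate(best)
print(f"coast from {best:.0f} m: {energy:.1f} kWh, run time {time:.0f} s")
```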

Relevance: 20.00%

Publisher:

Abstract:

In general, simple and traditional methods are applied to resolve traffic conflicts at railway junctions; they are, however, either inefficient or computationally demanding. A simple genetic algorithm is presented that enables a search for a near-optimal resolution while meeting constraints on generation evolution and minimising the search time.