821 results for fuzzy vault, multiple biometrics, biometric cryptosystem, biometrics and cryptography


Relevance: 100.00%

Abstract:

Preface The 9th Australasian Conference on Information Security and Privacy (ACISP 2004) was held in Sydney, 13–15 July, 2004. The conference was sponsored by the Centre for Advanced Computing – Algorithms and Cryptography (ACAC), Information and Networked Security Systems Research (INSS), Macquarie University and the Australian Computer Society. The aims of the conference are to bring together researchers and practitioners working in areas of information security and privacy from universities, industry and government sectors. The conference program covered a range of aspects including cryptography, cryptanalysis, systems and network security. The program committee accepted 41 papers from 195 submissions. The reviewing process took six weeks and each paper was carefully evaluated by at least three members of the program committee. We appreciate the hard work of the members of the program committee and external referees who gave many hours of their valuable time. Of the accepted papers, there were nine from Korea, six from Australia, five each from Japan and the USA, three each from China and Singapore, two each from Canada and Switzerland, and one each from Belgium, France, Germany, Taiwan, The Netherlands and the UK. All the authors, whether or not their papers were accepted, made valued contributions to the conference. In addition to the contributed papers, Dr Arjen Lenstra gave an invited talk, entitled Likely and Unlikely Progress in Factoring. This year the program committee introduced the Best Student Paper Award. The winner of the prize for the Best Student Paper was Yan-Cheng Chang from Harvard University for his paper Single Database Private Information Retrieval with Logarithmic Communication. We would like to thank all the people involved in organizing this conference. In particular we would like to thank members of the organizing committee for their time and efforts, Andrina Brennan, Vijayakrishnan Pasupathinathan, Hartono Kurnio, Cecily Lenton, and members from ACAC and INSS.

Relevance: 100.00%

Abstract:

Imaging genetics aims to discover how variants in the human genome influence brain measures derived from images. Genome-wide association scans (GWAS) can screen the genome for common differences in our DNA that relate to brain measures. In small samples, GWAS has low power as individual gene effects are weak and one must also correct for multiple comparisons across the genome and the image. Here we extend recent work on genetic clustering of images, to analyze surface-based models of anatomy using GWAS. We performed spherical harmonic analysis of hippocampal surfaces, automatically extracted from brain MRI scans of 1254 subjects. We clustered hippocampal surface regions with common genetic influences by examining genetic correlations (r(g)) between the normalized deformation values at all pairs of surface points. Using genetic correlations to cluster surface measures, we were able to boost effect sizes for genetic associations, compared to clustering with traditional phenotypic correlations using Pearson's r.
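(Illustrative note: a minimal sketch of the clustering step, assuming a precomputed vertex-by-vertex correlation matrix; the names `corr` and `n_clusters`, and the choice of average-linkage hierarchical clustering, are assumptions rather than the authors' implementation.)

```python
# Illustrative sketch: cluster surface vertices from a precomputed
# vertex-by-vertex correlation matrix (genetic r_g or phenotypic Pearson r).
# Assumes `corr` is a symmetric (n_vertices x n_vertices) array in [-1, 1].
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def cluster_vertices(corr: np.ndarray, n_clusters: int = 10) -> np.ndarray:
    """Group vertices so that strongly correlated ones share a cluster label."""
    dist = 1.0 - corr                      # convert correlation to dissimilarity
    np.fill_diagonal(dist, 0.0)            # enforce zero self-distance
    dist = (dist + dist.T) / 2.0           # symmetrise against numerical noise
    condensed = squareform(dist, checks=False)
    Z = linkage(condensed, method="average")
    return fcluster(Z, t=n_clusters, criterion="maxclust")

# Toy usage with a random correlation-like matrix in place of real r_g estimates.
rng = np.random.default_rng(0)
corr = np.corrcoef(rng.normal(size=(50, 20)))
labels = cluster_vertices(corr, n_clusters=5)
```

The same routine can be run on a matrix of genetic correlations or of Pearson correlations, which is the comparison the study makes.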

Relevance: 100.00%

Abstract:

Fractional anisotropy (FA), a very widely used measure of fiber integrity based on diffusion tensor imaging (DTI), is a problematic concept as it is influenced by several quantities including the number of dominant fiber directions within each voxel, each fiber's anisotropy, and partial volume effects from neighboring gray matter. With high-angular resolution diffusion imaging (HARDI) and the tensor distribution function (TDF), one can reconstruct multiple underlying fibers per voxel and their individual anisotropy measures by representing the diffusion profile as a probabilistic mixture of tensors. We found that FA, when compared with TDF-derived anisotropy measures, correlates poorly with individual fiber anisotropy, and may sub-optimally detect disease processes that affect myelination. By contrast, mean diffusivity (MD) as defined in standard DTI appears to be more accurate. Overall, we argue that novel measures derived from the TDF approach may yield more sensitive and accurate information than DTI-derived measures.
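(For reference, FA and MD in standard DTI are simple functions of the diffusion tensor's eigenvalues; the snippet below implements the textbook formulas and is not code from the paper.)

```python
# Standard DTI scalar measures from the three diffusion-tensor eigenvalues.
import numpy as np

def fa_md(eigvals):
    """Return (fractional anisotropy, mean diffusivity) for eigenvalues (l1, l2, l3)."""
    lam = np.asarray(eigvals, dtype=float)
    md = lam.mean()                                   # mean diffusivity
    denom = np.sqrt((lam ** 2).sum())
    if denom == 0.0:
        return 0.0, 0.0
    fa = np.sqrt(1.5 * ((lam - md) ** 2).sum()) / denom
    return float(fa), float(md)

# Example: a strongly anisotropic "single-fiber" tensor vs. an isotropic one.
print(fa_md([1.7e-3, 0.3e-3, 0.3e-3]))   # high FA
print(fa_md([0.8e-3, 0.8e-3, 0.8e-3]))   # FA = 0
```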

Relevance: 100.00%

Abstract:

This constructivist grounded theory study investigated the nature of new mothers' information experience in social media. The findings provide a holistic view of the phenomenon, and the resultant substantive grounded theory describes new mothers' information experience in social media as a complex, multi-layered and highly contextualised phenomenon. It encapsulates multiple individual experiences of information and is broader and deeper than the individual experiences it comprises. The theory incorporates the characteristics, dimensions and categories of experience to provide a holistic view of new mothers' information experience in social media.

Relevance: 100.00%

Abstract:

We present a new, generic method/model for multi-objective design optimization of laminated composite components using a novel multi-objective optimization algorithm developed on the basis of the Quantum-behaved Particle Swarm Optimization (QPSO) paradigm. QPSO is a variant of the popular Particle Swarm Optimization (PSO) and has been developed and implemented successfully for the multi-objective design optimization of composites. The problem is formulated with the multiple objectives of minimizing the weight and the total cost of the composite component while achieving a specified strength. The primary optimization variables are the number of layers, their stacking sequence (the orientation of the layers) and the thickness of each layer. Classical lamination theory is used to determine the stresses in the component, and the design is evaluated against three failure criteria: the failure-mechanism-based criterion, the maximum-stress criterion and the Tsai-Wu criterion. The optimization method is validated for a number of different loading configurations: uniaxial, biaxial and bending loads. The design optimization has been carried out for both variable stacking sequences and fixed standard stacking schemes, and a comparative study of the different design configurations evolved is presented. The performance of QPSO is also compared with that of conventional PSO.
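(A minimal single-objective sketch of the standard QPSO position update that such a method builds on; the multi-objective weight/cost ranking, laminate encoding and failure-criteria evaluation are not reproduced, and the function and parameter names are illustrative.)

```python
# Minimal sketch of the quantum-behaved PSO (QPSO) position update.
# Single objective only; bounds, beta and `objective` are illustrative.
import numpy as np

def qpso_minimize(objective, dim, n_particles=30, iters=200,
                  lo=-5.0, hi=5.0, beta=0.75, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    pbest = x.copy()                                   # personal best positions
    pbest_f = np.array([objective(v) for v in pbest])
    gbest = pbest[pbest_f.argmin()].copy()             # global best position

    for _ in range(iters):
        mbest = pbest.mean(axis=0)                     # mean of personal bests
        phi = rng.uniform(size=(n_particles, dim))
        attractor = phi * pbest + (1.0 - phi) * gbest  # local attractor point
        u = rng.uniform(1e-12, 1.0, size=(n_particles, dim))
        sign = np.where(rng.uniform(size=(n_particles, dim)) < 0.5, -1.0, 1.0)
        # QPSO update: sample around the attractor, scaled by |mbest - x|.
        x = attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        x = np.clip(x, lo, hi)

        f = np.array([objective(v) for v in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy usage on a sphere function.
best_x, best_f = qpso_minimize(lambda v: float((v ** 2).sum()), dim=5)
```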

Relevance: 100.00%

Abstract:

Species distribution modelling (SDM) typically analyses species’ presence together with some form of absence information. Ideally absences comprise observations or are inferred from comprehensive sampling. When such information is not available, then pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, e.g. as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves analysis highly susceptible to ‘naughty-noughts’: absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as ‘excess zeros’, naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences. Here we consider an extension to excess zero models. The two-staged approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O’Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. Then SDMs can be fit separately within each envelope, and for this stage, we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006). We introduce a wider range of model performance measures to improve treatment of naughty noughts in SDM. We retain an overall measure of model performance, the area under the curve (AUC) of the Receiver-Operating Curve (ROC), but focus on its constituent measures of false negative rate (FNR) and false positive rate (FPR), and how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absence. A high FDR may be desirable since it could help target future search efforts, whereas zero or low FOR is desirable since it indicates none of the (often valuable) presences have been ignored in the SDM. For illustration, we chose Bradypus variegatus, a species that has previously been published as an exemplar species for MaxEnt, proposed by Phillips et al. (2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0), eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better. The FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence being in the same range (0.00-0.20) for both true absences and presences. 
In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR, yet visible in FOR, FNR and the comparison of the predicted probability-of-presence distributions for presences and absences.
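(A generic sketch, not the study's code, of how FPR, FNR, FOR and FDR are computed from observations and predicted probabilities at a chosen threshold; the synthetic data simply mimic the presence/absence imbalance discussed above.)

```python
# Threshold-based error rates used to compare SDM predictions with observations.
import numpy as np

def sdm_error_rates(y_true, p_pred, threshold=0.5):
    """Return FPR, FNR, FOR and FDR for presence/absence (1/0) observations."""
    y_true = np.asarray(y_true, dtype=int)
    y_hat = (np.asarray(p_pred) >= threshold).astype(int)
    tp = np.sum((y_hat == 1) & (y_true == 1))
    fp = np.sum((y_hat == 1) & (y_true == 0))
    fn = np.sum((y_hat == 0) & (y_true == 1))
    tn = np.sum((y_hat == 0) & (y_true == 0))
    fpr = fp / (fp + tn) if (fp + tn) else 0.0   # false positive rate
    fnr = fn / (fn + tp) if (fn + tp) else 0.0   # false negative rate
    for_ = fn / (fn + tn) if (fn + tn) else 0.0  # false omission rate
    fdr = fp / (fp + tp) if (fp + tp) else 0.0   # false discovery rate
    return fpr, fnr, for_, fdr

# Toy usage: imbalanced data with far more absences than presences.
rng = np.random.default_rng(1)
y = (rng.uniform(size=1000) < 0.05).astype(int)
p = np.clip(0.6 * y + rng.uniform(0, 0.4, size=1000), 0, 1)
print(sdm_error_rates(y, p, threshold=0.5))
```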

Relevance: 100.00%

Abstract:

Genetic models partitioning additive and non-additive genetic effects for populations tested in replicated multi-environment trials (METs) in a plant breeding program have recently been presented in the literature. For these data, the variance model involves the direct product of a large numerator relationship matrix A, and a complex structure for the genotype by environment interaction effects, generally of a factor analytic (FA) form. With MET data, we expect a high correlation in genotype rankings between environments, leading to non-positive definite covariance matrices. Estimation methods for reduced rank models have been derived for the FA formulation with independent genotypes, and we employ these estimation methods for the more complex case involving the numerator relationship matrix. We examine the performance of differing genetic models for MET data with an embedded pedigree structure, and consider the magnitude of the non-additive variance. The capacity of existing software packages to fit these complex models is largely due to the use of the sparse matrix methodology and the average information algorithm. Here, we present an extension to the standard formulation necessary for estimation with a factor analytic structure across multiple environments.
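(A minimal sketch of the factor analytic genetic variance structure described here, built with NumPy for toy dimensions; the ordering of the Kronecker product depends on how effects are stacked, and the REML/average-information estimation itself is not shown.)

```python
# Build the genetic (co)variance matrix Var(u) = Sigma_env ⊗ A, where
# Sigma_env = Lambda Lambda' + Psi is a rank-k factor analytic structure
# across environments and A is the numerator relationship matrix.
import numpy as np

n_env, n_geno, k = 4, 5, 1                    # environments, genotypes, FA rank
rng = np.random.default_rng(0)

Lambda = rng.normal(size=(n_env, k))          # environment loadings
Psi = np.diag(rng.uniform(0.1, 0.5, n_env))   # environment-specific variances
Sigma_env = Lambda @ Lambda.T + Psi           # FA(k) covariance across environments

A = np.eye(n_geno)                            # placeholder numerator relationship matrix
G = np.kron(Sigma_env, A)                     # (n_env*n_geno) x (n_env*n_geno)
print(G.shape)                                # (20, 20)
```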

Relevance: 100.00%

Abstract:

Higher education is a powerful tool for reducing social and economic disadvantage. But access to higher education can be difficult, particularly for Indigenous Australians who face multiple levels of social, economic and geographical isolation. While enabling programs can support Indigenous students to gain university entry, the experience at Central Queensland University (CQUniversity) suggests that their past success has been limited. In this paper, the authors describe the enabling program available to Indigenous students at CQUniversity. They suggest that the newly developed, flexible, online version of the program is helping to address geographical and social isolation and improve successful outcomes for Indigenous Australians.

Relevance: 100.00%

Abstract:

Effectiveness evaluation of aerospace fault-tolerant computing systems used in a phased-mission environment is rather tricky and difficult because of the interaction of its several degraded performance levels with the multiple objectives of the mission and the use environment. Part I uses an approach based on multiobjective phased-mission analysis to evaluate the effectiveness of a distributed avionics architecture used in a transport aircraft. Part II views the computing system as a multistate s-coherent structure. Lower bounds on the probabilities of accomplishing various levels of performance are evaluated.

Relevance: 100.00%

Abstract:

This article provides a review of techniques for the analysis of survival data arising from respiratory health studies. Popular techniques such as the Kaplan–Meier survival plot and the Cox proportional hazards model are presented and illustrated using data from a lung cancer study. Advanced issues are also discussed, including parametric proportional hazards models, accelerated failure time models, time-varying explanatory variables, simultaneous analysis of multiple types of outcome events and the restricted mean survival time, a novel measure of the effect of treatment.
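(A minimal illustration of the two core techniques reviewed, Kaplan–Meier estimation and Cox proportional hazards regression, using the Python lifelines package on synthetic data; the column names are assumptions, not the lung cancer study's variables.)

```python
# Kaplan–Meier estimate and Cox proportional hazards fit on synthetic data
# using the lifelines package (column names here are illustrative).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "time": rng.exponential(scale=12.0, size=n),      # follow-up time (months)
    "event": rng.integers(0, 2, size=n),              # 1 = death observed, 0 = censored
    "age": rng.normal(65, 10, size=n),
    "treatment": rng.integers(0, 2, size=n),
})

# Kaplan–Meier survival curve for the whole sample.
kmf = KaplanMeierFitter()
kmf.fit(durations=df["time"], event_observed=df["event"])
print(kmf.median_survival_time_)

# Cox proportional hazards model with two explanatory variables.
cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])
```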

Relevance: 100.00%

Abstract:

Historical sediment nutrient concentrations and heavy-metal distributions were studied in five embayments in the Gulf of Finland and an adjacent lake. The main objective of the study was to examine the response of these water bodies to temporal changes in human activities. Sediment cores were collected from the sites and dated using 210Pb and 137Cs. The cores were analyzed for total carbon (TC), total nitrogen (TN), total phosphorus (TP), organic phosphorus (OP), inorganic phosphorus (IP), biogenic silica (BSi), loss on ignition (LOI), grain size, Cu, Zn, Al, Fe, Mn, K, Ca, Mg and Na. Principal component analysis (PCA) was used to summarize the trends in the geochemical variables and to compare trends between the different sites. The links between the catchment land use and sediment geochemical data were studied using a multivariate technique of redundancy analysis (RDA). Human activities produce marked geochemical variations in coastal sediments. These variations and signals are often challenging to interpret due to various sedimentological and post-depositional factors affecting the sediment profiles. In general, the sites studied here show significant upcore increases in sedimentation rates, TP and TN concentrations. Also Cu, which is considered to be a good indicator of anthropogenic influence, showed clear increases from 1850 towards the top part of the cores. Based on the RDA-analysis, in the least disturbed embayments with high forest cover, the sediments are dominated by lithogenic indicators Fe, K, Al and Mg. In embayments close to urban settlement, the sediments have high Cu concentrations and a high sediment Fe/Mn ratio. This study suggests that sediment accumulation rates vary significantly from site to site and that the overall sedimentation can be linked to the geomorphology and basin bathymetry, which appear to be the major factors governing sedimentation rates; i.e. a high sediment accumulation rate is not characteristic either to urban or to rural sites. The geochemical trends are strongly site specific and depend on the local geochemical background, basin characteristics and anthropogenic metal and nutrient loading. Of the studied geochemical indicators, OP shows the least monotonic trends in all studied sites. When compared to other available data, OP seems to be the most reliable geochemical indicator describing the trophic development of the study sites, whereas Cu and Zn appear to be good indicators for anthropogenic influence. As sedimentation environments, estuarine and marine sites are more complex than lacustrine basins with multiple sources of sediment input and more energetic conditions in the former. The crucial differences between lacustrine and estuarine/coastal sedimentation environments are mostly related to Fe. P sedimentation is largely governed by Fe redox-reactions in estuarine environments. In freshwaters, presence of Fe is clearly linked to the sedimentation of other lithogenic metals, and therefore P sedimentation and preservation has a more direct linkage to organic matter sedimentation.
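(A bare-bones sketch of the PCA summarisation step using scikit-learn on standardised concentrations; the variable list and data are placeholders, and the RDA linking sediment chemistry to land use is not shown.)

```python
# Summarise multivariate geochemical profiles with PCA on standardised variables.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

variables = ["TC", "TN", "TP", "BSi", "Cu", "Zn", "Al", "Fe", "Mn", "K", "Ca", "Mg", "Na"]
rng = np.random.default_rng(0)
samples = pd.DataFrame(rng.lognormal(size=(60, len(variables))), columns=variables)

X = StandardScaler().fit_transform(samples)       # z-score each geochemical variable
pca = PCA(n_components=3)
scores = pca.fit_transform(X)                     # sample scores on PC1–PC3
loadings = pd.DataFrame(pca.components_.T, index=variables,
                        columns=["PC1", "PC2", "PC3"])
print(pca.explained_variance_ratio_)
print(loadings.round(2))
```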

Relevance: 100.00%

Abstract:

Retinol-binding protein and prealbumin were isolated from duck plasma by chromatography on DEAE-cellulose and DEAE-Sephadex A-50, gel filtration on Sephadex G-100 and preparative polyacrylamide gel electrophoresis. The molecular weights of the retinol-binding protein–prealbumin complex, prealbumin and retinol-binding protein were found to be 75,000, 55,000 and 20,000, respectively. On sodium dodecyl sulphate polyacrylamide gel electrophoresis, prealbumin dissociated into identical subunits exhibiting a molecular weight of 13,500. Retinol-binding protein exhibited microheterogeneity on electrophoresis, whereas prealbumin moved as a single band, unlike the multiple bands observed in chicken and rat. The ultraviolet and fluorescence spectra of the two proteins were similar to those isolated from other species. No carbohydrate moiety was detected in either retinol-binding protein or prealbumin. Duck retinol-binding protein and prealbumin showed cross-reactivity with their counterparts in chicken but differed immunologically from those of goat and man. Retinol-binding protein and prealbumin could be dissociated at low ionic strength, in 2 M urea, by CM-Sephadex chromatography or on preparative electrophoresis. Although the transport of retinol in duck plasma is mediated by carrier proteins as in other species, it is distinguished by the absence of microheterogeneity in prealbumin and of an apo-retinol-binding protein form that could be transported in the plasma.

Relevance: 100.00%

Abstract:

Coccidiosis is a costly enteric disease of chickens caused by protozoan parasites of the genus Eimeria. Disease diagnosis and management is complicated since there are multiple Eimeria species infecting chickens and mixed species infections are common. Current control measures are only partially effective and this, combined with concerns over vaccine efficacy and increasing drug resistance, demonstrates a need for improved coccidiosis diagnosis and control. Before improvements can be made, it is important to understand the species commonly infecting poultry flocks in both backyard and commercial enterprises. The aim of this project was to conduct a survey and assessment of poultry Eimeria across Australia using genetic markers, and create a collection of isolates for each Eimeria species. A total of 260 samples (faecal or caecal) was obtained, and survey results showed that Eimeria taxa were present in 98% of commercial and 81% of backyard flocks. The distribution of each Eimeria species was widespread across Australia, with representatives of all species being found in every state and territory, and the Eimeria species predominating in commercial flocks differed from those in backyard flocks. Three operational taxonomic units also occurred frequently in commercial flocks highlighting the need to understand the impact of these uncharacterised species on poultry production. As Eimeria infections were also frequent in backyard flocks, there is a potential for backyard flocks to act as reservoirs for disease, especially as the industry moves towards free range production systems. This Eimeria collection will be an important genetic resource which is the crucial first step in the development of more sophisticated diagnostic tools and the development of new live vaccines which ultimately will provide savings to the industry in terms of more efficient coccidiosis management.

Relevance: 100.00%

Abstract:

Background: The Mycobacterium leprae genome has less than 50% coding capacity and 1,133 pseudogenes. Preliminary evidence suggests that some pseudogenes are expressed. Therefore, defining the pseudogene transcriptional and translational potentials of this genome should increase our understanding of their impact on M. leprae physiology. Results: Gene expression analysis identified transcripts from 49% of all M. leprae genes, including 57% of all ORFs and 43% of all pseudogenes in the genome. Transcribed pseudogenes were randomly distributed throughout the chromosome. Factors resulting in pseudogene transcription included: 1) co-orientation of transcribed pseudogenes with transcribed ORFs within or exclusive of operon-like structures; 2) the paucity of intrinsic stem-loop transcriptional terminators between transcribed ORFs and downstream pseudogenes; and 3) predicted pseudogene promoters. Mechanisms for translational "silencing" of pseudogene transcripts included the lack of both translational start codons and strong Shine-Dalgarno (SD) sequences. Transcribed pseudogenes also contained multiple in-frame stop codons and high Ka/Ks ratios, compared to those of homologs in M. tuberculosis and ORFs in M. leprae. A pseudogene transcript containing an active promoter, a strong SD site and a start codon, but also two in-frame stop codons, yielded a protein product when expressed in E. coli. Conclusion: Approximately half of the M. leprae transcriptome consists of inactive gene products consuming energy and resources without potential benefit to M. leprae. Presently it is unclear what additional detrimental effect(s) this large number of inactive mRNAs has on the functional capability of this organism. Translation of these pseudogenes may play an important role in overall energy consumption and the resultant pathophysiological characteristics of M. leprae. However, this study also demonstrated that multiple translational "silencing" mechanisms are present, reducing the additional energy and resource expenditure required for protein production from the vast majority of these transcripts.
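(To make the translational-silencing signals concrete, a toy scan for in-frame stop codons and a crude Shine-Dalgarno-like motif upstream of a start codon; the motif, spacing window and example sequence are simplified assumptions, not the study's annotation pipeline.)

```python
# Toy scan of a transcript for two of the silencing signals discussed:
# in-frame stop codons downstream of a start codon, and a crude
# Shine-Dalgarno-like purine-rich motif upstream of it.
STOPS = {"TAA", "TAG", "TGA"}

def inframe_stops(seq: str, start: int) -> int:
    """Count stop codons in frame with an ATG at 0-based position `start`."""
    count = 0
    for i in range(start + 3, len(seq) - 2, 3):
        if seq[i:i + 3] in STOPS:
            count += 1
    return count

def has_sd_like_motif(seq: str, start: int, core: str = "AGGAGG",
                      window: tuple = (5, 15)) -> bool:
    """Look for an SD-like core 5-15 nt upstream of the start codon (crude heuristic)."""
    upstream = seq[max(0, start - window[1]):max(0, start - window[0])]
    return core in upstream

# Example transcript with an SD-like site, a start codon and in-frame stops.
transcript = "GC" + "AGGAGG" + "ACTAACC" + "ATG" + "GCTTAAGGCTGACCCTAA"
start = transcript.find("ATG")
print(inframe_stops(transcript, start), has_sd_like_motif(transcript, start))
```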

Relevance: 100.00%

Abstract:

Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital when evaluating the intensive care outcome. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to medical ICU. Two new biological markers were investigated in two separate patient populations: in 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality and its association with the degree of organ dysfunction and disease severity was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with outcome of intensive care. The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than from the non-ED, and the HRQoL in the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, the cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed a moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor for ICU mortality, but not for hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
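(A schematic of how a biomarker's discriminative power and adjusted association with mortality are commonly assessed, using ROC AUC and logistic regression on synthetic data; the variable names and data are illustrative and do not come from the dissertation.)

```python
# Schematic biomarker assessment: ROC AUC for discrimination and a
# logistic regression adjusting for disease severity. Synthetic data only;
# variable names (plasma_dna, severity, died) are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 300
severity = rng.normal(50, 15, size=n)                       # e.g. a severity score
plasma_dna = np.exp(rng.normal(0, 0.5, size=n)) * (1 + severity / 100)
logit = -4.5 + 0.05 * severity + 0.8 * np.log(plasma_dna)   # simulated mortality risk
died = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Discriminative power of the marker alone.
print("AUC:", round(roc_auc_score(died, plasma_dna), 3))

# Adjusted (multivariable) association: marker plus severity score.
X = sm.add_constant(pd.DataFrame({"log_plasma_dna": np.log(plasma_dna),
                                  "severity": severity}))
model = sm.Logit(died, X).fit(disp=0)
print(model.summary2().tables[1][["Coef.", "P>|z|"]])
```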