931 results for "Analyses de trajectoires non-paramétriques"
Abstract:
Internet services are an important part of daily activities for most of us. These services come with sophisticated authentication requirements which may not be handled well by average Internet users. The management of secure passwords, for example, creates extra overhead which is often neglected for usability reasons. Furthermore, password-based approaches apply only to the initial login and do not protect against unlocked-workstation attacks. In this paper, we provide a non-intrusive identity verification scheme based on behavioral biometrics, in which keystroke dynamics on free text are used continuously to verify the identity of a user in real time. We improve existing keystroke-dynamics-based verification schemes in four aspects. First, we improve scalability by using a constant number of users, instead of the whole user space, to verify the identity of the target user. Second, we provide an adaptive user model which enables our solution to take changes in user behavior into account in the verification decision. Third, we identify a new distance measure which enables us to verify the identity of a user with shorter text. Fourth, we decrease the number of false results. Our solution is evaluated on a data set collected from users while they interacted with their mailboxes during their daily activities.
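As a hedged illustration of the kind of distance-based check such a scheme performs, the sketch below compares digraph (key-pair) latencies from a short free-text sample against a stored user profile; the distance measure, threshold and function names are hypothetical and are not the paper's actual method.

```python
# Hypothetical sketch of free-text keystroke verification using a
# digraph-latency distance; illustrative only, not the paper's method.
from collections import defaultdict
from statistics import mean


def digraph_latencies(keystrokes):
    """Map each consecutive key pair to its observed inter-key latencies (ms).

    `keystrokes` is a list of (key, timestamp_ms) tuples in typing order.
    """
    latencies = defaultdict(list)
    for (k1, t1), (k2, t2) in zip(keystrokes, keystrokes[1:]):
        latencies[(k1, k2)].append(t2 - t1)
    return latencies


def profile_distance(profile, sample):
    """Mean relative deviation over the digraphs shared by profile and sample."""
    shared = set(profile) & set(sample)
    if not shared:
        return float("inf")  # nothing to compare against
    return mean(abs(mean(sample[d]) - mean(profile[d])) / mean(profile[d])
                for d in shared)


def verify(profile, sample, threshold=0.35):
    """Accept the claimed identity if the sample is close enough to the profile."""
    return profile_distance(profile, sample) <= threshold
```

An adaptive model in the spirit of the abstract could then fold accepted samples back into the stored profile, so that the reference latencies track gradual changes in typing behavior.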
Abstract:
Detailed mineralogical studies of the matrix and fracture-fill materials of a large number of samples from the Rustler Formation have been carried out using x-ray diffraction, high-resolution transmission electron microscopy, electron microprobe analysis, x-ray fluorescence, and atomic absorption spectrophotometry. These analyses indicate the presence of four clay minerals: interstratified chlorite/saponite, illite, chlorite, and serpentine. Corrensite (regularly stratified chlorite/saponite) is the dominant clay mineral in samples from the Culebra dolomite and two shale layers of the lower unnamed member of the Rustler Formation. Within other layers of the Rustler Formation, disordered mixed chlorite/saponite is usually the most abundant clay mineral. Studies of the morphology and composition of clay crystallites suggest that the corrensite was formed by the alteration of detrital dioctahedral smectite in magnesium-rich pore fluids during early diagenesis of the Rustler Formation. This study provides initial estimates of the abundance and nature of the clay minerals in the Culebra dolomite in the vicinity of the Waste Isolation Pilot Plant.
Abstract:
There are different ways to authenticate humans, which is an essential prerequisite for access control. The authentication process can be subdivided into three categories that rely on something someone i) knows (e.g. a password), and/or ii) has (e.g. a smart card), and/or iii) is (biometric features). Besides classical attacks on password solutions and the risk that identity-related objects can be stolen, traditional biometric solutions have their own disadvantages, such as the need for expensive devices and the risk of stolen bio-templates. Moreover, in existing approaches the authentication process is usually performed only once, at the initial login. Non-intrusive and continuous monitoring of user activities emerges as a promising way of hardening the authentication process, adding a further category, iii-2): how someone behaves. In recent years, various keystroke-dynamics-based approaches have been published that are able to authenticate humans based on their typing behavior. The majority focus on so-called static-text approaches, where users are requested to type a previously defined text. Relatively few techniques are based on free-text approaches, which allow transparent monitoring of user activities and provide continuous verification. Unfortunately, only a few solutions are deployable in application environments under realistic conditions; unsolved problems include scalability, high response times and high error rates. The aim of this work is the development of behavior-based verification solutions. Our main requirement is to deploy these solutions under realistic conditions within existing environments, in order to enable transparent, free-text-based continuous verification of active users with low error rates and low response times.
Abstract:
Historical information can be used, in addition to pedigree, traits and genotypes, to map quantitative trait loci (QTL) in general populations via maximum likelihood estimation of variance components. This analysis is known as linkage disequilibrium (LD) and linkage mapping, because it exploits both linkage in families and LD at the population level. The search for QTL in the wild population of Soay sheep on St. Kilda is a proof of principle. We analysed the data from a previous study and confirmed some of the QTL reported. The most striking result was the confirmation of a QTL affecting birth weight that had been reported using association tests but not when using linkage-based analyses. Copyright © Cambridge University Press 2010.
Abstract:
Motivation: Unravelling the genetic architecture of complex traits requires large amounts of data, sophisticated models and large computational resources. The lack of user-friendly software incorporating all these requirements is delaying progress in the analysis of complex traits. Methods: Linkage disequilibrium and linkage analysis (LDLA) is a high-resolution gene-mapping approach based on sophisticated mixed linear models and applicable to any population structure. LDLA can use population history information, in addition to pedigree and molecular markers, to decompose traits into genetic components. Analyses are distributed in parallel over a large public grid of computers in the UK. Results: We have demonstrated the performance of LDLA with analyses of simulated data. There are real gains in statistical power to detect quantitative trait loci when using historical information compared with traditional linkage analysis. Moreover, the use of a grid of computers significantly increases computational speed, allowing analyses that would have been prohibitive on a single computer. © The Author 2009. Published by Oxford University Press. All rights reserved.
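For readers unfamiliar with this class of model, a generic variance-component formulation of LDLA-style QTL mapping is sketched below; the notation is illustrative and not necessarily the exact parameterisation implemented in LDLA.

```latex
% Generic LDLA-style variance-component model (a sketch; the exact
% parameterisation used by LDLA may differ).
\[
  \mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}_q\mathbf{q}
             + \mathbf{Z}_u\mathbf{u} + \mathbf{e},
\qquad
  \operatorname{var}(\mathbf{q}) = \mathbf{G}_q\,\sigma^2_q,\quad
  \operatorname{var}(\mathbf{u}) = \mathbf{A}\,\sigma^2_u,\quad
  \operatorname{var}(\mathbf{e}) = \mathbf{I}\,\sigma^2_e,
\]
```

Here $\mathbf{G}_q$ is an identity-by-descent matrix at the putative QTL position, built from marker and population-history (LD) information, and $\mathbf{A}$ is the pedigree relationship matrix. The variance components are estimated by (restricted) maximum likelihood, and a likelihood-ratio test of $\sigma^2_q = 0$ at each tested position provides the mapping signal.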
Abstract:
Past work has clearly demonstrated that numerous commonly used metallic materials will support burning in oxygen, especially at higher pressures. An approach to rectify this significant safety problem has been successfully developed and implemented by applying the concept of Situational Non-Flammability. This approach essentially removes, or breaks, one leg of the conceptual fire triangle, a tool commonly used to define the three things required to support burning: a fuel, an ignition source and an oxidiser. Since an oxidiser is always present in an oxygen system, as are ignition sources, the concept of Situational Non-Flammability removes the fuel leg of the fire triangle by utilising only materials that will not burn at the maximum pressure at which, for example, the control valve is to be used. The utilisation of this approach has led to the development of a range of oxygen components that are practically unable to burn while in service at their design pressure, thus providing an unparalleled level of fire safety while not compromising the performance or endurance required in the function of these components. This paper describes the concept of Situational Non-Flammability, how it was used to theoretically evaluate designs of components for oxygen service, and the outcomes of the actual development, fabrication and, finally, utilisation of these components in real oxygen systems in a range of flow control devices.
Abstract:
An analytical method for the detection of carbonaceous gases by a non-dispersive infrared (NDIR) sensor has been developed. Calibration plots for six carbonaceous gases, CO2, CH4, CO, C2H2, C2H4 and C2H6, were obtained and the reproducibility was determined to verify the feasibility of this gas-monitoring method. The results show that the squared correlation coefficients for the six gas measurements are all greater than 0.999. The reproducibility is excellent, indicating that this analytical method is useful for determining the concentrations of carbonaceous gases.
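As a hedged sketch of the calibration step described above, the snippet below fits a straight line of sensor response against known concentrations and reports the squared correlation coefficient; the numbers and variable names are illustrative, not data from the paper.

```python
# Illustrative linear calibration for one NDIR gas channel; the standards
# and responses below are made up, not values from the paper.
import numpy as np

concentration = np.array([0.0, 100.0, 200.0, 400.0, 800.0])  # known standards (ppm)
response = np.array([0.02, 1.01, 2.03, 4.01, 7.98])          # sensor readings (a.u.)

slope, intercept = np.polyfit(concentration, response, 1)    # least-squares line
predicted = slope * concentration + intercept
ss_res = np.sum((response - predicted) ** 2)
ss_tot = np.sum((response - response.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot                            # squared correlation coefficient

print(f"response = {slope:.4f} * ppm + {intercept:.4f},  R^2 = {r_squared:.5f}")

# Recover an unknown sample's concentration from its measured response.
unknown_response = 3.05
print(f"estimated concentration: {(unknown_response - intercept) / slope:.1f} ppm")
```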
Abstract:
The decision in the New South Wales Supreme Court in Boyce v McIntyre [2008] NSWSC 1218 involved determination of a number of issues relating to an assessment of costs under the Legal Profession Act 2004 (NSW). The issue of broad significance was whether a non-associated third party payer must pay the fixed fee that was agreed between the law practice and the client. The court found that the client agreement did not form the basis of assessing costs for the non-associated third party payer.
Abstract:
In Deppro Pty Ltd v Hannah [2008] QSC 193 one of the matters considered by the court related to the requirement in r 243 of the Uniform Civil Procedure Rules 1999 (Qld) that a notice of non-party disclosure must “state the allegation in issue in the pleadings about which the document sought is directly relevant.” The approach adopted by the issuing party in this case, of asserting that the documents sought by a notice of non-party disclosure are relevant to allegations in numbered paragraphs of the pleadings and serving copies of the pleadings with the notice, is not uncommon in practice. This decision makes it clear that the practice is fraught with danger. In circumstances where it is not apparent that the non-party has been fully apprised of the relevant issues, the decision suggests that an applicant for non-party disclosure who has not complied with the requirements of r 243 might be required to issue a fresh, fully compliant notice and to suffer the associated costs consequences.
Abstract:
Trajectory basis Non-Rigid Structure From Motion (NRSFM) currently faces two problems: the limit of reconstructability and the need to tune the basis size for different sequences. This paper provides a novel theoretical bound on 3D reconstruction error, arguing that the existing definition of reconstructability is fundamentally flawed in that it fails to consider system condition. This insight motivates a novel strategy whereby the trajectory's response to a set of high-pass filters is minimised. The new approach eliminates the need to tune the basis size and is more efficient for long sequences. Additionally, the truncated DCT basis is shown to have a dual interpretation as a high-pass filter. The success of trajectory filter reconstruction is demonstrated quantitatively on synthetic projections of real motion capture sequences and qualitatively on real image sequences.
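To make the stated duality concrete, the short sketch below (an illustration only, not the authors' code) shows that the residual left after projecting a 1-D trajectory onto a truncated DCT basis is a high-pass-filtered version of that trajectory.

```python
# Illustration of the DCT/high-pass duality noted above: projecting a 1-D
# trajectory onto the first K DCT atoms keeps its low frequencies, so the
# residual is the response of an implied high-pass filter. Not the paper's code.
import numpy as np
from scipy.fft import dct, idct

F, K = 200, 10                          # frames, truncated basis size
t = np.arange(F)
trajectory = np.sin(2 * np.pi * t / 80) + 0.1 * np.sin(2 * np.pi * t / 5)

coeffs = dct(trajectory, norm="ortho")
coeffs[K:] = 0.0                        # keep only the K low-frequency atoms
low_pass = idct(coeffs, norm="ortho")   # projection onto the truncated DCT basis
high_pass = trajectory - low_pass       # residual = high-pass filtered trajectory

total = np.sum(trajectory ** 2)
print(f"energy captured by the truncated basis: {np.sum(low_pass**2) / total:.3f}")
print(f"energy in the high-frequency residual:  {np.sum(high_pass**2) / total:.3f}")
```

Minimising the trajectory's response to a set of such high-pass filters, rather than fixing the basis size per sequence, is the strategy the paper argues removes the need to tune that parameter.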
Abstract:
In this paper two studies are reported which compare (a) the perceptions of family functioning held by clinic and non-clinic adolescents, and (b) the perceptions of family functioning held by adolescents and their mothers in clinic and non-clinic families. In Study 1, matched groups of clinic and non-clinic adolescents were compared on their responses to a 30-item scale (ICPS) designed to measure three factors of family functioning: Intimacy (high vs. low), Parenting Style (democratic vs. controlled) and Conflict (high vs. low). Clinic and non-clinic adolescents were also compared on their responses to a multi-dimensional measure of adolescent self-concept. Although there was little difference between the two groups of adolescents in terms of their perceptions of family functioning, there were strong relationships between the self-concept variables and the family functioning variables. In Study 2, comparisons were made between the perceptions of family functioning held by mothers and adolescents in both clinic and non-clinic families. There were no differences between the two groups of adolescents in terms of their perceptions of family functioning, although there were clear differences between the two groups of mothers. In addition, clinic adolescents and their mothers did not differ in their perceptions of the family, whereas adolescents in the non-clinic group saw their families as significantly less intimate and more conflicted than did their mothers.
Abstract:
Laminar two-dimensional natural convection boundary-layer flow of non-Newtonian fluids along an isothermal horizontal circular cylinder has been studied using a modified power-law viscosity model. In this model there are no unrealistic limits of zero or infinite viscosity, so the boundary-layer equations can be solved numerically using a marching order implicit finite difference method with a double-sweep technique. Numerical results are presented for both shear-thinning and shear-thickening fluids: the fluid velocity and temperature distributions, and the shear stresses and rate of heat transfer expressed as the local skin friction and local Nusselt number, respectively.
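As a hedged sketch of what such a model looks like (a common form from the literature; the exact cut-off shear rates and constants are problem-specific and may differ from those used in the paper), the modified power law bounds the viscosity at both ends of the shear-rate range:

```latex
% A common form of the modified power-law viscosity (illustrative sketch;
% the paper's exact formulation may differ).
\[
  \mu(\dot{\gamma}) =
  \begin{cases}
    K\,\dot{\gamma}_1^{\,n-1}, & \dot{\gamma} \le \dot{\gamma}_1,\\[2pt]
    K\,\dot{\gamma}^{\,n-1},   & \dot{\gamma}_1 < \dot{\gamma} < \dot{\gamma}_2,\\[2pt]
    K\,\dot{\gamma}_2^{\,n-1}, & \dot{\gamma} \ge \dot{\gamma}_2,
  \end{cases}
\]
```

so that the viscosity remains bounded for both shear-thinning ($n<1$) and shear-thickening ($n>1$) fluids instead of tending to zero or infinity at the extremes of the shear rate $\dot{\gamma}$.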
Abstract:
Historically, a significant gap between male and female wages has existed in the Australian labour market. Indeed, this wage differential was institutionalised in the 1912 arbitration decision, which determined that the basic female wage would be set at between 54 and 66 per cent of the male wage. More recently, however, the 1969 and 1972 Equal Pay Cases determined that male/female wage relativities should be based upon the premise of equal pay for work of equal value. It is important to note that the mere observation that average wages differ between males and females is not, in itself, evidence of sex discrimination. Economists restrict the definition of wage discrimination to cases where two distinct groups receive different average remuneration for reasons unrelated to differences in productivity characteristics. This paper extends previous studies of wage discrimination in Australia (Chapman and Mulvey, 1986; Haig, 1982) by correcting the estimated male/female wage differential for the existence of non-random sampling. Previous Australian estimates of male/female human-capital-based wage specifications, together with estimates of the corresponding wage differential, all suffer from a failure to address this issue. If the sample of females observed to be working is not a random sample, then the estimates of the male/female wage differential will be both biased and inconsistent.
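For context, a standard way of implementing such a correction (sketched here in generic form; the paper's exact specification is not reproduced) is to add a Heckman-style selection term to the female wage equation before decomposing the differential:

```latex
% Generic two-step selection correction and wage-gap decomposition
% (a sketch of the standard approach; not the paper's exact specification).
\[
  \ln w_f = \mathbf{x}_f'\boldsymbol{\beta}_f
          + \theta\,\lambda(\mathbf{z}_f'\boldsymbol{\gamma}) + \varepsilon_f ,
\]
\[
  \overline{\ln w_m} - \overline{\ln w_f}
  = (\bar{\mathbf{x}}_m - \bar{\mathbf{x}}_f)'\hat{\boldsymbol{\beta}}_m
  + \bar{\mathbf{x}}_f'(\hat{\boldsymbol{\beta}}_m - \hat{\boldsymbol{\beta}}_f)
  - \hat{\theta}\,\bar{\lambda}_f ,
\]
```

where $\lambda(\cdot)$ is the inverse Mills ratio from a first-stage probit of female labour-force participation. The first term reflects differences in productivity characteristics, the second reflects differing returns to those characteristics (the conventional discrimination component), and the final term nets out the effect of non-random selection into employment.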
Abstract:
Biological systems exhibit a wide range of contextual effects, and this often makes it difficult to construct valid mathematical models of their behaviour. In particular, mathematical paradigms built upon the successes of Newtonian physics make assumptions about the nature of biological systems that are unlikely to hold true. After discussing two of the key assumptions underlying the Newtonian paradigm, we discuss two key aspects of the formalism that extended it, Quantum Theory (QT). We draw attention to the similarities between biological and quantum systems, motivating the development of a similar formalism that can be applied to the modelling of biological processes.
Abstract:
Long-term exposure to vehicle emissions has been associated with harmful health effects. Children are amongst the most susceptible groups, and schools represent an environment where they can experience significant exposure to vehicle emissions. However, there are limited studies on children's exposure to vehicle emissions in schools. The aim of this study was to quantify the concentration of organic aerosol, and in particular vehicle emissions, that children are exposed to during school hours. Therefore, an Aerodyne compact time-of-flight aerosol mass spectrometer (TOF-AMS) was deployed at five urban schools in Brisbane, Australia. The TOF-AMS enabled the chemical composition of the non-refractory PM1 (NR-PM1) to be analysed with high temporal resolution to assess the concentration of vehicle emissions and other organic aerosols during school hours. At each school the organic fraction comprised the majority of NR-PM1, with secondary organic aerosol as the main constituent. At two of the schools, a significant source of the organic aerosol (OA) was slightly aged vehicle emissions from nearby highways. More aged and oxidised OA was observed at the other three schools, which also recorded strong biomass burning influences. Primary emissions were found to dominate the OA at only one school, which had an O:C ratio of 0.17, due to fuel-powered gardening equipment used near the TOF-AMS. The diurnal cycle of OA concentration varied between schools and was found to be at a minimum during school hours. The major organic component that school children were exposed to during school hours was secondary OA. Peak exposure of school children to hydrocarbon-like OA (HOA) occurred during school drop-off and pick-up times. Unless a school is located near major roads, children in urban environments are exposed predominantly to regional secondary OA, as opposed to local emissions, during school hours.