922 results for Minimal Hausdorff Frames
Abstract:
In the modern connected world, pervasive computing has become a reality. Thanks to the ubiquity of mobile computing devices and emerging cloud-based services, users stay permanently connected to their data. This introduces a slew of new security challenges, including the problem of multi-device key management and single-sign-on architectures. One solution to this problem is the use of secure side channels for authentication, including the visual channel as a proof of vicinity. However, existing approaches often assume confidentiality of the visual channel, or provide only insufficient means of mitigating a man-in-the-middle attack. In this work, we introduce QR-Auth, a two-step, 2D-barcode-based authentication scheme for mobile devices which aims specifically at key management and key sharing across devices in a pervasive environment. It requires minimal user interaction and therefore provides better usability than most existing schemes, without compromising security. We show how our approach fits into existing authorization delegation and one-time-password generation schemes, and that it is resilient to man-in-the-middle attacks.
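The one-time-password schemes this abstract positions itself against are typically HOTP/TOTP-style constructions. As a general illustration (this is standard RFC 4226 HOTP, not the QR-Auth protocol itself), a counter-based one-time password can be computed as follows:

```python
# Generic HOTP (RFC 4226) one-time password generator -- an illustration of
# the OTP family mentioned in the abstract, not the QR-Auth scheme.
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-SHA1 over the big-endian counter, then dynamic truncation."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vectors for the ASCII key "12345678901234567890":
codes = [hotp(b"12345678901234567890", c) for c in range(2)]
# codes == ["755224", "287082"]
```

Because the counter is shared state, the same function serves both prover and verifier; a visual channel such as a 2D barcode can then transport the resulting code between devices.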
Abstract:
Purpose: To determine whether neuroretinal function differs in healthy persons with and without common risk gene variants for age-related macular degeneration (AMD) and no ophthalmoscopic signs of AMD, and to compare those findings with persons with manifest early AMD. Methods and Participants: Neuroretinal function was assessed with the multifocal electroretinogram (mfERG) (VERIS, Redwood City, CA) in 32 participants (22 healthy persons with no clinical signs of AMD and 10 early AMD patients). The 22 healthy participants with no AMD carried risk genotypes for CFH (rs380390) and/or ARMS2 (rs10490920). We used a slow-flash mfERG paradigm (3 inserted frames) and a 103-hexagon stimulus array. Recordings were made with DTL electrodes; fixation and eye movements were monitored online. Trough N1 to peak P1 (N1P1) response densities and P1 implicit times (IT) were analysed in 5 concentric rings. Results: N1P1 response densities (mean ± SD) for concentric rings 1-3 were on average significantly higher in at-risk genotypes (ring 1: 17.97 nV/deg2 ± 1.9, ring 2: 11.7 nV/deg2 ± 1.3, ring 3: 8.7 nV/deg2 ± 0.7) compared to those without risk (ring 1: 13.7 nV/deg2 ± 1.9, ring 2: 9.2 nV/deg2 ± 0.8, ring 3: 7.3 nV/deg2 ± 1.1) and compared to persons with early AMD (ring 1: 15.3 nV/deg2 ± 4.8, ring 2: 9.1 nV/deg2 ± 2.3, ring 3: 7.3 nV/deg2 ± 1.3) (p < 0.05). The group implicit times (P1-ITs) for ring 1 were on average delayed in the early AMD patients (36.4 ms ± 1.0) compared to healthy participants with (35.1 ms ± 1.1) or without risk genotypes (34.8 ms ± 1.3), although these differences were not significant. Conclusion: Neuroretinal function in persons with normal fundi can be differentiated into subgroups based on their genetics. Increased neuroretinal activity in persons who carry AMD risk genotypes may be due to genetically determined subclinical inflammatory and/or histological changes in the retina.
Assessment of neuroretinal function in healthy persons genetically susceptible to AMD may be a useful early biomarker before there is clinical manifestation of AMD.
Abstract:
High magnification and large depth of field, with a temporal resolution of less than 100 microseconds, are possible using the present invention. It combines a linear electron beam produced by a tungsten filament from an SX-40A Scanning Electron Microscope (SEM); a magnetic deflection coil with lower inductance, achieved by reducing the number of turns of the saddle-coil wires while increasing the diameter of the wires; a fast scintillator; a photomultiplier tube, photomultiplier tube base, and signal amplifiers; and a high-speed data acquisition system that allows a scan rate of 381 frames per second and a 256 × 128 pixel density in the SEM image at a data acquisition rate of 25 MHz. The data acquisition and scan position are fully coordinated: a digitizer and a digital waveform generator, which generates the sweep signals to the scan coils, run off the same clock to acquire the signal in real time.
Abstract:
Background: Recommendations for the introduction of solids and fluids to an infant’s diet have changed over the past decade. Since these changes, there has been minimal research to determine patterns in the introduction of foods and fluids to infants. Methods: This retrospective cohort study surveyed mothers who birthed in Queensland, Australia, from February 1 to May 31, 2010, around 4 months postpartum. Frequencies of foods and fluids given to infants at 4, 8, 13, and 17 weeks were described. Logistic regression determined associations between infant feeding practices, the introduction of other foods and fluids at 17 weeks, and sociodemographic characteristics. Results: The response rate was 35.8%. At 17 weeks, 68% of infants were breastfed and 33% were exclusively breastfed. Solids and water had been introduced to 8.6% and 35.0% of infants, respectively. The introduction of solids by 17 weeks was associated with younger maternal age and the infant being given water and infant formula at 4 weeks. The infant being given water at 17 weeks was associated with younger maternal age, the infant being given infant formula at 4 weeks, level of education, relative socioeconomic disadvantage, parity, and birth facility. Conclusion: Over the past decade, there has been a significant reduction in the proportion of infants in Australia who have been given solids by 17 weeks. Sociodemographic characteristics and formula feeding practices at 4 weeks were associated with the introduction of solids and water by 17 weeks. Further research should examine these barriers to improve compliance with current infant feeding recommendations.
Abstract:
Supervision in the creative arts is a topic of growing significance since the increase in creative practice PhDs across universities in Australasia. This presentation will provide context for existing discussions in creative practice and supervision. Creative practice – encompassing practice-based or practice-led research – now has a rich history of research surrounding it. Although it is a comparatively new area of knowledge, great advances have been made in terms of how practice can influence, generate, and become research. The practice of supervision is also a topic of interest, perhaps unsurprisingly given its necessity within the university environment. Scholars have written extensively about supervision practices and the importance of the supervisory role, in both academic and more informal forms. However, there is an obvious space in between: there is very little research on supervision practices within creative practice higher degrees, especially at PhD or doctoral level. Despite the existence of creative practice PhD programs, and thus the inherent need for successful supervisors, there remain minimal publications and limited resources available. Creative Intersections explores the existing publications and resources, and illustrates that a space for new published knowledge and tools exists.
Abstract:
This paper considers the problem of reconstructing the motion of a 3D articulated tree from 2D point correspondences subject to some temporal prior. Hitherto, smooth motion has been encouraged using a trajectory basis, yielding a hard combinatorial problem with time complexity growing exponentially in the number of frames. Branch and bound strategies have previously attempted to curb this complexity whilst maintaining global optimality. However, they provide no guarantee of being more efficient than exhaustive search. Inspired by recent work which reconstructs general trajectories using compact high-pass filters, we develop a dynamic programming approach which scales linearly in the number of frames, leveraging the intrinsically local nature of filter interactions. Extension to affine projection enables reconstruction without estimating cameras.
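The key property claimed above is that purely local temporal interactions admit a dynamic program that is linear in the number of frames. The following is a hedged sketch of that idea only: per-frame candidate costs with a simple pairwise smoothness penalty standing in for the paper's filter interaction; the candidate sets, costs and penalty are hypothetical, not the paper's actual formulation.

```python
# Sketch: DP over per-frame candidates with a local pairwise cost.
# Each frame consults only its predecessor, so total work is O(T * K^2),
# i.e. linear in the number of frames T for a fixed candidate count K.

def dp_min_cost(unary, smooth_weight=1.0):
    """unary[t][k]: data cost of candidate k at frame t.
    Pairwise cost: squared difference of candidate indices (a placeholder
    for a real temporal-smoothness term). Returns (best path, its cost)."""
    T, K = len(unary), len(unary[0])
    cost = list(unary[0])          # best cost of a path ending in each candidate
    back = []                      # backpointers for path recovery
    for t in range(1, T):
        prev, cur, bp = cost, [], []
        for k in range(K):
            best_j = min(range(K),
                         key=lambda j: prev[j] + smooth_weight * (k - j) ** 2)
            cur.append(prev[best_j] + smooth_weight * (k - best_j) ** 2
                       + unary[t][k])
            bp.append(best_j)
        cost, back = cur, back + [bp]
    # Backtrack from the cheapest final candidate.
    k = min(range(K), key=lambda k: cost[k])
    path = [k]
    for bp in reversed(back):
        k = bp[k]
        path.append(k)
    return list(reversed(path)), min(cost)
```

With `smooth_weight=0` the recursion degenerates to independent per-frame minimisation; increasing it trades data fidelity for temporal coherence, which is the role the trajectory prior plays in the reconstruction problem.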
Abstract:
This paper presents an adaptive metering algorithm for enhancing the electronic screening (e-screening) operation at truck weight stations. The algorithm uses a feedback control mechanism to regulate the number of trucks entering the weight station. Its basic operation allows more trucks to be inspected when the weight station is underutilized, by lowering the weight threshold. Conversely, the algorithm restricts the number of trucks to be inspected when the station is overutilized, to prevent queue spillover. The proposed control concept is demonstrated and evaluated in a simulation environment. The simulation results demonstrate the considerable benefits of the proposed algorithm in improving overweight enforcement with minimal negative impacts on non-overweight trucks. The test results also reveal that the effectiveness of the algorithm improves with higher truck participation rates in the e-screening program.
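The feedback mechanism described above can be sketched as a proportional controller on the sorting weight threshold. The gain, bounds, target utilization and units below are hypothetical placeholders for illustration, not the paper's calibrated values:

```python
# Sketch of threshold feedback: underutilization lowers the weight threshold
# (more trucks directed in for inspection); overutilization raises it
# (fewer trucks, avoiding queue spillover). All constants are assumptions.

def update_threshold(threshold, utilization, target=0.85, gain=500.0,
                     lo=20000.0, hi=36000.0):
    """One control step on the sorting weight threshold (kg).

    utilization: fraction of station inspection capacity currently in use.
    A positive error (above target) raises the threshold so fewer trucks
    are flagged for inspection; a negative error lowers it.
    """
    error = utilization - target
    new_threshold = threshold + gain * error
    return max(lo, min(hi, new_threshold))  # keep within legal/operational bounds
```

Run once per control interval, this drives utilization toward the target while the clamp prevents the threshold from drifting outside its operational range.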
Abstract:
Purpose: The measurement of broadband ultrasonic attenuation (BUA) in cancellous bone for the assessment of osteoporosis follows a parabolic-type dependence on bone volume fraction, with minimum values corresponding to both entirely bone and entirely marrow. Langton has recently proposed that the primary BUA mechanism may be significant phase interference due to variations in propagation transit time through the test sample, as detected over the phase-sensitive surface of the receiving ultrasound transducer. This fundamentally simple concept assumes that the propagation of ultrasound through a complex solid:liquid composite sample such as cancellous bone may be considered as an array of parallel 'sonic rays'. The transit time of each ray is defined by the proportion of bone and marrow propagated through, being a minimum (tmin) solely through bone and a maximum (tmax) solely through marrow. A Transit Time Spectrum (TTS), ranging from tmin to tmax, may be defined, describing the proportion of sonic rays having a particular transit time, effectively describing the lateral inhomogeneity of transit time over the surface of the receiving ultrasound transducer. Phase interference may result from the interaction of sonic rays of differing transit times. The aim of this study was to test the hypothesis that phase interference depends on the lateral inhomogeneity of transit time, by comparing experimental measurements and computer simulation predictions of ultrasound propagation through a range of relatively simple solid:liquid models exhibiting a range of lateral inhomogeneities. Methods: A range of test models was manufactured using acrylic and water as surrogates for bone and marrow respectively.
The models varied in thickness in one dimension normal to the direction of propagation, hence exhibiting a range of transit time lateral inhomogeneities, from minimal (a single transit time) to maximal (a wedge; ultimately the limiting case where each sonic ray has a unique transit time). For the experimental component of the study, two unfocused 1 MHz, ¾-inch diameter broadband transducers were used in transmission mode; ultrasound signals were recorded for each of the models. The computer simulation was performed in Matlab, where the transit time and relative amplitude of each sonic ray were calculated. The transit time for each sonic ray was defined as the sum of the transit times through its acrylic and water components. The relative amplitude accounted for the reception area of each sonic ray along with absorption in the acrylic. To replicate phase-sensitive detection, all sonic rays were summed and the output signal plotted in comparison with the experimentally derived output signal. Results: Qualitative and quantitative comparison of the experimental and computer simulation results shows an extremely high degree of agreement, 94.2% to 99.0%, between the two approaches, supporting the concept that the propagation of an ultrasound wave, for the models considered, may be approximated by a parallel sonic ray model in which the transit time of each ray is defined by the proportion of 'bone' and 'marrow'. Conclusions: This combined experimental and computer simulation study has demonstrated that lateral inhomogeneity of transit time creates significant potential for phase interference when a phase-sensitive receiving ultrasound transducer is used, as in most commercial ultrasound bone analysis devices.
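The parallel sonic-ray summation described above can be sketched directly: each ray's transit time is the sum of its acrylic and water path times, and the phase-sensitive receiver output is the amplitude-weighted sum of delayed copies of the excitation. The material sound speeds, fixed path length and tone-burst excitation below are nominal assumptions for illustration, not the study's Matlab implementation:

```python
# Sketch of the parallel 'sonic ray' model: per-ray transit time and a
# phase-sensitive (coherent) sum at the receiver. Constants are nominal.
import math

C_ACRYLIC = 2750.0   # assumed longitudinal sound speed in acrylic, m/s
C_WATER = 1480.0     # assumed sound speed in water, m/s
TOTAL_PATH = 0.02    # assumed fixed sample thickness, m (20 mm)

def transit_time(acrylic_thickness):
    """Transit time (s) of one ray: acrylic portion plus water remainder."""
    water = TOTAL_PATH - acrylic_thickness
    return acrylic_thickness / C_ACRYLIC + water / C_WATER

def received_signal(rays, t, f=1.0e6):
    """Phase-sensitive receiver output at time t for a 1 MHz tone.

    rays: list of (acrylic_thickness, relative_amplitude) pairs, one per ray.
    Rays of differing transit time arrive out of phase and can cancel --
    the phase-interference mechanism the study investigates.
    """
    out = 0.0
    for acrylic, amp in rays:
        delay = transit_time(acrylic)
        if t >= delay:   # ray has not yet arrived before its delay
            out += amp * math.sin(2 * math.pi * f * (t - delay))
    return out
```

A uniform plate gives a single transit time (no interference); a wedge gives a continuum of delays between tmin (all acrylic) and tmax (all water), so the coherent sum is attenuated relative to the single-ray case.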
Abstract:
An environmentally sustainable, and thus green, business process is one that delivers organizational value while exerting minimal impact on the natural environment. Recent works from the field of Information Systems (IS) have argued that information systems can contribute to the design and implementation of sustainable business processes. While prior research has investigated how information systems can be used to support sustainable business practices, there is still a void as to the actual changes that business processes have to undergo in order to become environmentally sustainable, and the specific role that information systems play in enabling this change. In this paper, we provide a conceptualization of environmentally sustainable business processes, and discuss the role of functional affordances of information systems in enabling both incremental and radical changes to make processes environmentally sustainable. Our conceptualization is based on (a) a fundamental definition of the concept of environmental sustainability, grounded in two basic components: the environmental source and sink functions of any project or activity, and (b) the concept of functional affordances, which describe the potential uses originating in the material properties of information systems in relation to their use context. To illustrate the application of our framework and provide a first evaluation, we analyse two examples from prior research where information systems affected the sustainability of business processes.
Abstract:
We support Shane and Venkataraman’s (2000) basic idea of an “entrepreneurship nexus”, where characteristics of the actor as well as those of the “opportunity” they work on influence action and outcomes in the creation of new economic activities. However, a review of the literature reveals that minimal progress has been made on the core issues pertaining to the nexus idea. We argue that this is rooted in fundamental and insurmountable problems with the “opportunity” construct itself, and demonstrate the state of confusion in the literature caused by inconsistent use of the construct within and across works and authors. Instead, we suggest the admittedly subjective notion of the New Venture as a more workable construct. We provide a comprehensive definition and explanation of this construct, and take steps towards improved conceptualization and operationalization of its subdimensions. With some further work on these conceptualizations and operationalizations, it will be possible to implement a comprehensive research program that can finally deliver on the promise outlined by Shane and Venkataraman (2000).
Abstract:
Aim To better understand the morphology of the feral horse foot, and the effects of different travel patterns and substrate environments on it, in order to better manage the feet of domestic horses. Methods The left forefeet of 20 adult feral horses from each of five geographically separated populations in Australia (n = 100) were investigated. Populations were selected on the basis of substrate hardness under foot and the amount of travel typical for the population. Feet were radiographed and photographed, and 40 morphometric measurements of each foot were obtained. Results Of the 40 parameters, 37 differed significantly (P < 0.05) among the populations, which suggested that substrate hardness and travel distance have an effect on foot morphology. Harder substrates and longer travel distances were associated with short hoof walls and minimal hoof wall flaring. Softer substrates and moderate travel distances were associated with long, flared walls, similar to the typical untrimmed feet of domestic horses. Conclusions The morphology of the feral horse foot appeared to be affected by the distance travelled and by the abrasive qualities and mechanical properties of the substrate under foot. There were marked differences in some conformation parameters between the feral horses in the current study and domestic horses in previous studies. Although the conformation of the feral horse foot may have some prescriptive value, concerns regarding abnormal foot anatomy warrant further investigation.
Abstract:
Teachers of construction economics and estimating have long recognised that there is more to construction pricing than the detailed calculation of costs (to the contractor). We always get to the point where we have to say "of course, experience or familiarity with the market is very important and this needs judgement, intuition, etc.". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm would be done, but if it is really important then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are cost modelling and cost planning - neither of which pays any significant attention to the market effect. There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known. This may be called the pragmatic school. The second school exists solely to criticise the first school. We will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose 1974 paper is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and has been since, that the uncertainty surrounding the contractors' costing and cost-estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner.
A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974, evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors to produce relatively very accurate estimates. Only the type and size of the building, it seemed, was really relevant in determining accuracy. More detailed information about the buildings' specification, and even sight of the drawings, did not significantly improve their accuracy. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect. The problem with this is that our experts do not seem able to verbalise their requirements in this respect - a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit. Hence the need for this study.
It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although only a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing. They were also very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held views very close to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject, and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal of it is anecdotal, we feel that our findings enable at least the basic nature of the subject to be understood, and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants, who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.
Abstract:
Biological validation of new radiotherapy modalities is essential to understand their therapeutic potential. Antiprotons have been proposed for cancer therapy due to enhanced dose deposition provided by antiproton-nucleon annihilation. We assessed cellular DNA damage and relative biological effectiveness (RBE) of a clinically relevant antiproton beam. Despite a modest LET (~19 keV/μm), antiproton spread out Bragg peak (SOBP) irradiation caused significant residual γ-H2AX foci compared to X-ray, proton and antiproton plateau irradiation. RBE of ~1.48 in the SOBP and ~1 in the plateau were measured and used for a qualitative effective dose curve comparison with proton and carbon-ions. Foci in the antiproton SOBP were larger and more structured compared to X-rays, protons and carbon-ions. This is likely due to overlapping particle tracks near the annihilation vertex, creating spatially correlated DNA lesions. No biological effects were observed at 28–42 mm away from the primary beam suggesting minimal risk from long-range secondary particles.
Abstract:
Physical and chemical properties of biofuel are influenced by structural features of its fatty acids, such as chain length, degree of unsaturation and branching of the chain. A simple and reliable calculation method to estimate fuel properties is therefore needed to avoid experimental testing, which is difficult, costly and time consuming; typically, in commercial biodiesel production, such testing is done for every batch of fuel produced. In this study, nine different algae species were selected that were likely to be suitable for subtropical climates. The fatty acid methyl esters (FAMEs) of all algae species were analysed, and fuel properties such as cetane number (CN), cold filter plugging point (CFPP), kinematic viscosity (KV), density and higher heating value (HHV) were determined. The relationship of each fatty acid to each fuel property was analysed using multivariate and multi-criteria decision-making (MCDM) software. These analyses showed that some fatty acids have major influences on the fuel properties whereas others have minimal influence. Based on the fuel properties and lipid content, a rank order was derived using PROMETHEE-GAIA, which helped to select the best algae species for biodiesel production in subtropical climates. Three species had fatty acid profiles that gave the best fuel properties, although only one of these (Nannochloropsis oculata) is considered the best choice because of its higher lipid content.
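One common form such calculation methods take is a composition-weighted average over the FAME profile. The sketch below illustrates that general idea only; the per-ester cetane numbers are hypothetical placeholders, and this is not necessarily the estimation method used in the study:

```python
# Sketch: estimating a fuel property as a mass-fraction weighted average
# over a FAME profile. Per-ester values below are illustrative only.

def weighted_property(profile, property_by_fame):
    """profile: {FAME name: mass fraction}; fractions should sum to ~1.
    Returns the fraction-weighted average of the per-ester property values."""
    total = sum(profile.values())   # normalise in case fractions don't sum to 1
    return sum(frac * property_by_fame[name]
               for name, frac in profile.items()) / total

CETANE = {           # hypothetical per-FAME cetane numbers for illustration
    "C16:0": 74.0,   # palmitic acid methyl ester (saturated, long chain)
    "C18:1": 59.0,   # oleic acid methyl ester (monounsaturated)
    "C18:3": 23.0,   # linolenic acid methyl ester (polyunsaturated)
}

# A hypothetical algal FAME profile: 30% C16:0, 50% C18:1, 20% C18:3.
profile = {"C16:0": 0.3, "C18:1": 0.5, "C18:3": 0.2}
cn = weighted_property(profile, CETANE)   # weighted-average cetane number
```

The same weighting pattern, with different per-ester coefficients, applies to properties like kinematic viscosity or higher heating value; the qualitative point matches the abstract: saturated, long-chain esters pull CN up, polyunsaturated ones pull it down.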
Abstract:
We investigated the effects of an Ironman triathlon race on markers of muscle damage, inflammation and heat shock protein 70 (HSP70). Nine well-trained male triathletes (mean ± SD age 34 ± 5 years; VO2peak 66.4 ml·kg⁻¹·min⁻¹) participated in the 2004 Western Australia Ironman triathlon race (3.8 km swim, 180 km cycle, 42.2 km run). We assessed jump height, muscle strength and soreness, and collected venous blood samples 2 days before the race, within 30 min of finishing, and 14-20 h after the race. Plasma samples were analysed for muscle proteins, acute phase proteins, cytokines, HSP70, and clinical biochemical variables related to dehydration, haemolysis, and liver and renal function. Muscular strength and jump height decreased significantly (P < 0.05) after the race, whereas muscle soreness and the plasma concentrations of muscle proteins increased. The cytokines interleukin (IL)-1 receptor antagonist, IL-6 and IL-10, and HSP70 increased markedly after the race, while IL-12p40 and granulocyte colony-stimulating factor (G-CSF) were also elevated. IL-4, IL-1beta and tumour necrosis factor-alpha did not change significantly, despite elevated C-reactive protein and serum amyloid protein A on the day after the race. Plasma creatinine, uric acid and total bilirubin concentrations and gamma-glutamyl transferase activity also changed after the race. In conclusion, despite evidence of muscle damage and an acute phase response after the race, the pro-inflammatory cytokine response was minimal and anti-inflammatory cytokines were induced. HSP70 is released into the circulation as a function of exercise duration.