934 results for Single-step
Abstract:
The actin microfilament plays a critical role in many cellular processes, including embryonic development, wound healing, immune response, and tissue development. It is commonly organized into networks whose mechanical properties change as their architecture is remodelled during cellular processes. This paper presents a new nonlinear continuum-mechanics model of single filamentous actin (F-actin) based on nanoscale molecular simulations. Using this continuum model of single F-actin, the mechanical properties of differently architected lamellipodia are studied. The results provide insight that can contribute to the understanding of cell-edge motion in living cells.
Abstract:
Driven by the rapid development of ubiquitous and pervasive computing, personalized services and applications are deployed to support our lives. Accordingly, the number of interfaces and devices (smartphones, tablet computers, etc.) provided to access and consume these services is growing continuously. To reduce the complexity of managing many accounts with different credentials, Single Sign-On (SSO) solutions have been introduced. However, a single password for many accounts represents a single point of failure. Furthermore, an SSO session, once initiated, poses a high risk when the workstation is left unlocked and unattended. In this paper, we present a concept for Persistent Single Sign-On (PSSO) in ubiquitous home environments that uses behavioural biometrics to check the identity of the user continuously in an unobtrusive manner.
Abstract:
In the modern connected world, pervasive computing has become a reality. Thanks to the ubiquity of mobile computing devices and emerging cloud-based services, users stay permanently connected to their data. This introduces a slew of new security challenges, including the problem of multi-device key management and single-sign-on architectures. One solution to this problem is the use of secure side channels for authentication, including the visual channel as a proof of vicinity. However, existing approaches often assume confidentiality of the visual channel, or provide only insufficient means of mitigating man-in-the-middle attacks. In this work, we introduce QR-Auth, a two-step, 2D-barcode-based authentication scheme for mobile devices aimed specifically at key management and key sharing across devices in a pervasive environment. It requires minimal user interaction and therefore provides better usability than most existing schemes, without compromising security. We show how our approach fits into existing authorization-delegation and one-time-password generation schemes, and that it is resilient to man-in-the-middle attacks.
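The one-time-password component such a scheme can delegate to is well standardized; below is a minimal Python sketch of TOTP (RFC 6238), with the shared secret standing in for whatever key material the barcode pairing step would establish (the secret and parameters here are illustrative, not part of QR-Auth itself):

```python
import hmac, hashlib, struct, time

def totp(secret: bytes, interval: int = 30, digits: int = 6) -> str:
    """Standard TOTP (RFC 6238): HMAC-SHA1 over the current time step."""
    counter = int(time.time()) // interval
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Illustrative only: a secret both devices would share after pairing
# (hypothetical placeholder, not QR-Auth's actual key format).
print(totp(b"shared-secret-from-pairing"))
```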
Abstract:
For the timber industry, the ability to simulate the drying of wood is invaluable for manufacturing high-quality wood products. Mathematically, however, modelling the drying of a wet porous material such as wood is a difficult task due to its heterogeneous and anisotropic nature, and the complex geometry of the underlying pore structure. The well-developed macroscopic modelling approach involves writing down classical conservation equations at a length scale where physical quantities (e.g., porosity) can be interpreted as averaged values over a small volume (typically containing hundreds or thousands of pores). This averaging procedure produces balance equations that resemble those of a continuum, with the exception that effective coefficients appear in their definitions. Exponential integrators are numerical schemes for initial value problems involving a system of ordinary differential equations. These methods differ from popular Newton-Krylov implicit methods (i.e., those based on the backward differentiation formulae (BDF)) in that they do not require the solution of a system of nonlinear equations at each time step; rather, they require the computation of matrix-vector products involving the exponential of the Jacobian matrix. Although originally appearing in the 1960s, exponential integrators have recently experienced a resurgence of interest due to a greater undertaking of research in Krylov subspace methods for matrix function approximation. One of the simplest examples of an exponential integrator is the exponential Euler method (EEM), which requires, at each time step, the approximation of φ(A)b, where φ(z) = (e^z − 1)/z, A ∈ R^(n×n) and b ∈ R^n. For drying in porous media, the most comprehensive macroscopic formulation is TransPore [Perre and Turner, Chem. Eng. J., 86: 117-131, 2002], which features three coupled, nonlinear partial differential equations. The focus of the first part of this thesis is the use of the exponential Euler method (EEM) for performing the time integration of the macroscopic set of equations featured in TransPore. In particular, a new variable-stepsize algorithm for EEM is presented within a Krylov subspace framework, which allows control of the error during the integration process. The performance of the new algorithm highlights the great potential of exponential integrators not only for drying applications but across all disciplines of transport phenomena. For example, when applied to well-known benchmark problems involving single-phase liquid flow in heterogeneous soils, the proposed algorithm requires half the number of function evaluations required by an equivalent (sophisticated) Newton-Krylov BDF implementation. Furthermore, for all drying configurations tested, the new algorithm always produces, in less computational time, a solution of higher accuracy than the existing backward Euler module featured in TransPore. Some new results relating to Krylov subspace approximation of φ(A)b are also developed in this thesis. Most notably, an alternative derivation of the approximation error estimate of Hochbruck, Lubich and Selhofer [SIAM J. Sci. Comput., 19(5): 1552-1574, 1998] is provided, which reveals why it performs well in the error control procedure. Two of the main drawbacks of the macroscopic approach outlined above are that the effective coefficients must be supplied to the model, and that it fails for some drying configurations in which typical dual-scale mechanisms occur.
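As a concrete illustration of the EEM building block described above, the following Python sketch computes φ(A)b via the dense augmented-matrix identity and uses it for a fixed-step exponential Euler integration; the Krylov subspace approximation and the variable-stepsize error control developed in the thesis are deliberately omitted, and the test problem is an assumption for demonstration only:

```python
import numpy as np
from scipy.linalg import expm

def phi_times_vector(A, b):
    """phi(A) b with phi(z) = (e^z - 1)/z, via the augmented-matrix identity
    expm([[A, b], [0, 0]]) = [[e^A, phi(A) b], [0, 1]]. A dense stand-in
    for the Krylov-subspace approximation used in the thesis."""
    n = A.shape[0]
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A
    M[:n, n] = b
    return expm(M)[:n, n]

def exponential_euler(f, jac, y0, t_end, h):
    """Fixed-step exponential Euler: y_{n+1} = y_n + h phi(h J(y_n)) f(y_n)."""
    y, t = np.asarray(y0, dtype=float), 0.0
    while t < t_end - 1e-12:
        y = y + h * phi_times_vector(h * jac(y), f(y))
        t += h
    return y

# Stiff linear test problem y' = A y (EEM is exact per step for linear f).
A = np.array([[-2.0, 1.0], [1.0, -2.0]])
y = exponential_euler(lambda y: A @ y, lambda y: A, [1.0, 0.0], 1.0, 0.01)
```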
In the second part of this thesis, a new dual-scale approach for simulating wood drying is proposed that couples the porous medium (macroscale) with the underlying pore structure (microscale). The proposed model is applied to the convective drying of softwood at low temperatures and is valid in the so-called hygroscopic range, where hygroscopically held liquid water is present in the solid phase and water exists only as vapour in the pores. Coupling between scales is achieved by imposing the macroscopic gradient on the microscopic field using suitably defined periodic boundary conditions, which allows the macroscopic flux to be defined as an average of the microscopic flux over the unit cell. This formulation provides a first step for moving from the macroscopic formulation featured in TransPore to a comprehensive dual-scale formulation capable of addressing any drying configuration. Simulation results reported for a sample of spruce highlight the potential and flexibility of the new dual-scale approach. In particular, for a given unit-cell configuration it is not necessary to supply the effective coefficients prior to each simulation.
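In symbols, the scale coupling described above can be written in the standard periodic-homogenisation form (the notation here is assumed, not taken from the thesis):

```latex
% Macroscopic flux as the unit-cell average of the microscopic flux:
\mathbf{q}_{\mathrm{macro}}(\mathbf{x})
  = \frac{1}{|\Omega|} \int_{\Omega} \mathbf{q}_{\mathrm{micro}}(\mathbf{x},\mathbf{y}) \, \mathrm{d}\mathbf{y},
\qquad
\psi_{\mathrm{micro}}(\mathbf{y})
  = \nabla \Psi_{\mathrm{macro}} \cdot \mathbf{y} + \tilde{\psi}(\mathbf{y}),
```

where Ω is the unit cell and the fluctuation field ψ̃ is Ω-periodic, so the imposed macroscopic gradient drives the unit-cell problem while the averaged microscopic flux closes the macroscopic balance.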
Abstract:
Bridges are currently rated individually for maintenance and repair action according to the structural conditions of their elements. Dealing with thousands of bridges, and with the many factors that cause deterioration, makes this rating process extremely complicated. The current simplified but practical methods are not accurate enough, while the sophisticated, more accurate methods are only used for a single or particular bridge type. It is therefore necessary to develop a practical and accurate rating system for a network of bridges. The first and most important step towards this aim is to classify bridges based on the differences in nature and the unique characteristics of the critical factors, and the relationships between them, across a network of bridges. Critical factors and vulnerable elements will be identified and placed in different categories. This classification method will be used to develop a new practical rating method for a network of railway bridges based on criticality and vulnerability analysis. This rating system will be more accurate and economical, as well as improving the safety and serviceability of railway bridges.
Abstract:
We demonstrate for the first time, by ab initio density functional calculations and molecular dynamics simulations, that C0.5(BN)0.5 armchair single-walled nanotubes (SWNTs) are gapless semiconductors and can form spontaneously via the hybrid connection of graphene/BN nanoribbons (GNRs/BNNRs) at room temperature. The direct synthesis of armchair C0.5(BN)0.5 via the hybrid connection of GNRs/BNNRs is predicted to be both thermodynamically and dynamically stable. Such novel armchair C0.5(BN)0.5 nanotubes possess enhanced conductance similar to that observed in GNRs. Additionally, zigzag C0.5(BN)0.5 SWNTs are narrow-band-gap semiconductors, which may have potential applications in light emission. In light of recent experimental progress and the enhanced degree of control in the synthesis of GNRs and BNNRs, our results highlight an interesting avenue for synthesizing a novel specific type of C0.5(BN)0.5 nanotube (gapless or narrow direct-gap semiconductor), with potentially important applications in BNC-based nanodevices.
Abstract:
The interaction of bare graphene nanoribbons (GNRs) was investigated by ab initio density functional theory calculations with both the local density approximation (LDA) and the generalized gradient approximation (GGA). Remarkably, two bare 8-GNRs with zigzag-shaped edges are predicted to form an (8, 8) armchair single-wall carbon nanotube (SWCNT) without any obvious activation barrier. The formation of a (10, 0) zigzag SWCNT from two bare 10-GNRs with armchair-shaped edges has activation barriers of 0.23 and 0.61 eV using the LDA and the revised PBE exchange-correlation functional, respectively. Our results suggest a possible route to controlling the growth of specific types of SWCNTs via the interaction of GNRs.
Abstract:
Several approaches to active noise control (ANC) systems have been introduced in the literature. Since the FxLMS algorithm appears to be the best choice for the controller filter, researchers tend to improve the performance of ANC systems by enhancing and modifying this algorithm. This paper proposes a new version of the FxLMS algorithm. In many ANC applications, an online secondary-path modelling method that uses white noise as a training signal is required to ensure convergence of the system. This paper therefore also proposes a new approach for online secondary-path modelling in feedforward ANC systems. The proposed algorithm stops the injection of white noise at the optimum point and reactivates the injection during operation, if needed, to maintain the performance of the system. The combination of the new version of the FxLMS algorithm and the avoidance of continuous white-noise injection makes the system more desirable and improves its noise-attenuation performance. Comparative simulation results indicate the effectiveness of the proposed approach.
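For orientation, here is a minimal Python sketch of the standard FxLMS update that the proposed variant builds on (the paper's contributions — the new FxLMS version and the stop/reactivate logic for the training noise — are not reproduced; the signal choices and the reuse of the secondary-path estimate as the true path are illustrative assumptions):

```python
import numpy as np

def fxlms(x, d, s_hat, L=32, mu=0.01):
    """Textbook filtered-x LMS for feedforward ANC. x: reference signal,
    d: disturbance at the error microphone, s_hat: FIR estimate of the
    secondary path S(z), also used here as the true S(z) for simulation."""
    w = np.zeros(L)                  # adaptive controller weights
    x_buf = np.zeros(L)              # reference samples (newest first)
    fx_buf = np.zeros(L)             # filtered-reference samples
    s_in = np.zeros(len(s_hat))      # reference history for filtering through s_hat
    y_hist = np.zeros(len(s_hat))    # controller-output history for the secondary path
    e = np.zeros(len(x))
    for n in range(len(x)):
        x_buf = np.roll(x_buf, 1); x_buf[0] = x[n]
        y = w @ x_buf                              # anti-noise sample
        y_hist = np.roll(y_hist, 1); y_hist[0] = y
        e[n] = d[n] - s_hat @ y_hist               # residual at the error mic
        s_in = np.roll(s_in, 1); s_in[0] = x[n]
        fx_buf = np.roll(fx_buf, 1); fx_buf[0] = s_hat @ s_in
        w += mu * e[n] * fx_buf                    # FxLMS weight update
    return w, e

# Illustrative run: sinusoidal noise, short delay-like secondary path.
t = np.arange(8000) / 8000.0
x = np.sin(2 * np.pi * 200 * t)
w, e = fxlms(x, 0.8 * x, s_hat=np.array([0.0, 0.9, 0.1]))
```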
Abstract:
Objective: To determine the impact of a free-choice diet on the nutritional intake and body condition of feral horses. Animals: Cadavers of 41 feral horses from 5 Australian locations. Procedures: Body condition score (BCS) was determined (scale of 1 to 9), and the stomach was removed from horses during postmortem examination. Stomach contents were analyzed for nutritional variables and macroelement and microelement concentrations. Data were compared among the locations and also compared with recommended daily intakes for horses. Results: Mean BCS varied by location; all horses were judged to be moderately thin. The BCS for males was 1 to 3 points higher than that of females. The amount of protein in the stomach contents varied from 4.3% to 14.9% and was significantly associated with BCS. Amounts of water-soluble carbohydrate and ethanol-soluble carbohydrate in the stomach contents of feral horses from all 5 locations were higher than those expected for horses eating high-quality forage. Some macroelement and microelement concentrations were grossly excessive, whereas others were grossly deficient. There was no evidence of ill health among the horses. Conclusions and Clinical Relevance: Results suggested that the diet of several populations of feral horses in Australia was less than optimal. However, neither low BCS nor trace-mineral deficiency appeared to affect the survival of the horses. Additional studies on food sources in these regions, including analysis of water-soluble carbohydrate, ethanol-soluble carbohydrate, and mineral concentrations, are warranted to determine the provenance of such rich sources of nutrients. Recommendations for the optimal diet for horses may need revision.
Abstract:
Speaker diarization is the process of annotating an input audio stream with information that attributes temporal regions of the audio signal to their respective sources, which may include both speech and non-speech events. For speech regions, the diarization system also specifies the locations of speaker boundaries and assigns relative speaker labels to each homogeneous segment of speech. In short, speaker diarization systems effectively answer the question of 'who spoke when'. There are several important applications for speaker diarization technology, such as facilitating speaker indexing systems that allow users to directly access the relevant segments of interest within a given audio recording, and assisting other downstream processes such as summarizing and parsing. When combined with automatic speech recognition (ASR) systems, the metadata extracted by a speaker diarization system can provide complementary information for ASR transcripts, including the location of speaker turns and relative speaker segment labels, making the transcripts more readable. Speaker diarization output can also be used to localize the instances of specific speakers so that data can be pooled for model adaptation, which in turn boosts transcription accuracy. Speaker diarization therefore plays an important role as a preliminary step in the automatic transcription of audio data.

The aim of this work is to improve the usefulness and practicality of speaker diarization technology through the reduction of diarization error rates. In particular, this research is focused on the segmentation and clustering stages within a diarization system. Although particular emphasis is placed on the broadcast news audio domain, and the systems developed throughout this work are trained and tested on broadcast news data, the techniques proposed in this dissertation are also applicable to other domains, including telephone conversations and meeting audio.

Three main research themes were pursued: heuristic rules for speaker segmentation, modelling uncertainty in speaker model estimates, and modelling uncertainty in eigenvoice speaker modelling. The use of heuristic approaches for the speaker segmentation task was investigated first, with emphasis placed on minimizing missed boundary detections. A set of heuristic rules was proposed to govern the detection and heuristic selection of candidate speaker segment boundaries. A second pass, using the same heuristic algorithm with a smaller window, was also proposed, with the aim of improving the detection of boundaries around short speaker segments. Compared to single-threshold-based methods, the proposed heuristic approach was shown to provide improved segmentation performance, leading to a reduction in the overall diarization error rate. Methods to model the uncertainty in speaker model estimates were then developed, to address the difficulties associated with making segmentation and clustering decisions with limited data in the speaker segments. The Bayes factor, derived specifically for multivariate Gaussian speaker modelling, was introduced to account for the uncertainty of the speaker model estimates. The use of the Bayes factor also enabled the incorporation of prior information regarding the audio to aid segmentation and clustering decisions. The idea of modelling uncertainty in speaker model estimates was also extended to the eigenvoice speaker modelling framework for the speaker clustering task.
Building on the application of Bayesian approaches to the speaker diarization problem, the proposed approach takes into account the uncertainty associated with the explicit estimation of the speaker factors. The proposed decision criteria, based on Bayesian theory, were shown to generally outperform their non-Bayesian counterparts.
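For context, here is a minimal Python sketch of the ΔBIC merge criterion for full-covariance Gaussian segment models — the widely used non-Bayesian baseline against which Bayes-factor criteria like the thesis's are compared (the penalty form and λ follow the standard BIC recipe; this is not the thesis's derivation):

```python
import numpy as np

def gauss_loglik(X):
    """Log-likelihood of the frames X (n x d) under their own ML Gaussian."""
    n, d = X.shape
    cov = np.cov(X, rowvar=False, bias=True) + 1e-6 * np.eye(d)  # ML covariance
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * n * (d * np.log(2 * np.pi) + logdet + d)

def delta_bic(X1, X2, lam=1.0):
    """Delta-BIC merge criterion for two segments: a positive value means
    the complexity penalty outweighs the likelihood loss from merging,
    i.e. the two segments are judged to come from the same speaker."""
    X = np.vstack([X1, X2])
    n, d = X.shape
    penalty = lam * 0.5 * (d + 0.5 * d * (d + 1)) * np.log(n)
    return gauss_loglik(X) - (gauss_loglik(X1) + gauss_loglik(X2)) + penalty

# Usage: merge = delta_bic(segment_a_features, segment_b_features) > 0
```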
Abstract:
The deformation of rocks is commonly intimately associated with metamorphic reactions. This paper is a step towards understanding the behaviour of fully coupled, deforming, chemically reacting systems, considering a simple example of the problem: a single-layer system with elastic-power-law viscous constitutive behaviour, in which the deformation is controlled by the diffusion of a single chemical component produced during a metamorphic reaction. Analysis of the problem using the principles of non-equilibrium thermodynamics allows the energy dissipated by the chemical reaction-diffusion processes to be coupled with the energy dissipated during deformation of the layers. This leads to strain-rate-softening behaviour and the resultant development of localised deformation, which in turn nucleates buckles in the layer. All such diffusion processes, in leading to Herring-Nabarro, Coble or "pressure solution" behaviour, are capable of producing mechanical weakening through the development of a "chemical viscosity", with the potential for instability in the deformation. For geologically realistic strain rates these chemical feedback instabilities occur at the centimetre to micron scales, and so produce structures at these scales, as opposed to thermal feedback instabilities, which become important at the 100-1000 m scale.
Abstract:
The aim of this work is to develop software that is capable of back-projecting primary fluence images obtained from EPID measurements through phantom and patient geometries in order to calculate 3D dose distributions. In the first instance, we aim to develop a tool for pre-treatment verification in IMRT. In our approach, a Geant4 application is used to back-project primary fluence values from each EPID pixel towards the source. Each beam is considered to be polyenergetic, with a spectrum obtained from Monte Carlo calculations for the LINAC in question. At each step of the ray-tracing process, the energy-differential fluence is corrected for attenuation and beam divergence. Subsequently, the TERMA is calculated and accumulated into an energy-differential 3D TERMA distribution. This distribution is then convolved with monoenergetic point-spread kernels, thus generating energy-differential 3D dose distributions. The resulting dose distributions are accumulated to yield the total dose distribution, which can then be used for pre-treatment verification of IMRT plans. Preliminary results were obtained for a test EPID image comprising 100 × 100 pixels of unity fluence. Back projection of this field into a 30 cm × 30 cm × 30 cm water phantom was performed, with TERMA distributions obtained in approximately 10 min (running on a single core of a 3 GHz processor). Point-spread kernels for monoenergetic photons in water were calculated using a separate Geant4 application. Following convolution and summation, the resulting 3D dose distribution produced familiar build-up and penumbral features. In order to validate the dose model, we will use EPID images recorded without any attenuating material in the beam for a number of MLC-defined square fields. The dose distributions in water will be calculated and compared to TPS predictions.
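The per-ray TERMA accumulation described above reduces, for a single energy bin, to attenuating the fluence and scaling by the mass energy-absorption characteristics; the Python sketch below shows this for a 1D depth profile, followed by a toy convolution with a stand-in point-spread kernel (all beam data and kernel shapes here are illustrative assumptions, not the authors' Monte Carlo values):

```python
import numpy as np
from scipy.ndimage import convolve

def terma_1d(fluence0, mu, mu_over_rho, energy, z, sad=100.0):
    """TERMA along one back-projected ray for a single energy bin:
    T(z) = (mu/rho) * E * Psi0 * exp(-mu z) * (SAD/(SAD+z))^2,
    i.e. exponential attenuation plus inverse-square divergence."""
    return mu_over_rho * energy * fluence0 * np.exp(-mu * z) * (sad / (sad + z)) ** 2

z = np.linspace(0.0, 30.0, 301)                       # depth in water (cm)
terma = terma_1d(1.0, 0.05, 0.05, 2.0, z)             # ~2 MeV bin; mu in 1/cm (illustrative)
kernel = np.exp(-np.abs(np.linspace(-3.0, 3.0, 61)))  # stand-in point-spread kernel
kernel /= kernel.sum()
dose = convolve(terma, kernel, mode="nearest")        # superposition step (1D analogue)
```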
Abstract:
The double-stranded conformation of cellular DNA is a central aspect of DNA stabilisation and protection. The helix preserves the genetic code against chemical and enzymatic degradation, metabolic activation, and the formation of secondary structures. However, there are various instances where single-stranded DNA is exposed, such as during replication or transcription, in the synthesis of chromosome ends, and following DNA damage. In these instances, single-stranded DNA-binding proteins are essential for the sequestration and processing of single-stranded DNA. In order to bind single-stranded DNA, these proteins utilise a characteristic and evolutionarily conserved single-stranded DNA-binding domain, the oligonucleotide/oligosaccharide-binding (OB) fold. In the current review we discuss a subset of these proteins involved in the direct maintenance of genomic stability, an important cellular process in the conservation of cellular viability and the prevention of malignant transformation. We discuss the central roles of single-stranded DNA-binding proteins from the OB-fold domain family in DNA replication, the restart of stalled replication forks, DNA damage repair, cell-cycle-checkpoint activation, and telomere maintenance.
Abstract:
Authenticated Encryption (AE) is the cryptographic process of providing simultaneous confidentiality and integrity protection for messages. This approach is more efficient than applying a two-step process of providing confidentiality for a message by encrypting it and, in a separate pass, providing integrity protection by generating a Message Authentication Code (MAC). AE using symmetric ciphers can be provided by either stream ciphers with built-in authentication mechanisms or block ciphers using appropriate modes of operation. However, stream ciphers have the potential for higher performance and a smaller footprint in hardware and/or software than block ciphers. This property makes stream ciphers suitable for resource-constrained environments, where storage and computational power are limited. There have been several recent stream cipher proposals that claim to provide AE. These ciphers can be analysed using existing techniques that consider confidentiality or integrity separately; however, there is currently no framework for the analysis of AE stream ciphers that considers these two properties simultaneously. This thesis introduces a novel framework for the analysis of AE using stream cipher algorithms. It analyses the mechanisms for providing confidentiality and for providing integrity in AE algorithms using stream ciphers. There is a greater emphasis on the analysis of the integrity mechanisms, as there is little in the public literature on this in the context of authenticated encryption. The thesis has four main contributions, as follows. The first contribution is the design of a framework that can be used to classify AE stream ciphers based on three characteristics. The first classification applies Bellare and Namprempre's work on the order in which the encryption and authentication processes take place. The second classification is based on the method used for accumulating the input message (either directly or indirectly) into the internal states of the cipher to generate a MAC. The third classification is based on whether the sequence that is used to provide encryption and authentication is generated using a single key and initial vector, or two keys and two initial vectors. The second contribution is the application of an existing algebraic method to analyse the confidentiality algorithms of two AE stream ciphers, namely SSS and ZUC. The algebraic method is based on considering the nonlinear filter (NLF) of these ciphers as a combiner with memory. This method enables us to construct equations for the NLF that relate the inputs, outputs and memory of the combiner to the output keystream. We show that both of these ciphers are secure against this type of algebraic attack. We conclude that using a key-dependent S-box in the NLF twice, and using two different S-boxes in the NLF of ZUC, prevents this type of algebraic attack. The third contribution is a new general matrix-based model for MAC generation where the input message is injected directly into the internal state. This model describes the accumulation process when the input message is injected directly into the internal state of a nonlinear filter generator. We show that three recently proposed AE stream ciphers can be considered as instances of this model, namely SSS, NLSv2 and SOBER-128. Our model is more general than previous investigations into direct injection. Possible forgery attacks against this model are investigated.
It is shown that using a nonlinear filter in the accumulation process of the input message, when either the input message or the initial state of the register is unknown, prevents forgery attacks based on collisions. The last contribution is a new general matrix-based model for MAC generation where the input message is injected indirectly into the internal state. This model uses the input message as a controller to accumulate a keystream sequence into an accumulation register. We show that three current AE stream ciphers can be considered as instances of this model, namely ZUC, Grain-128a and Sfinks. We establish the conditions under which the model is susceptible to forgery and side-channel attacks.
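To make the direct-injection model concrete, here is a toy Python instance with a purely linear register update; its linearity is exactly what makes collision-based forgeries possible, which is why the accumulation process needs a nonlinear filter as discussed above (the matrix, injection vector and masking below are illustrative choices, not any real cipher):

```python
import numpy as np

def mac_direct_injection(message_bits, A, state0, key_bits):
    """Toy matrix-based direct-injection MAC over GF(2):
    s_{t+1} = A s_t + m_t v (mod 2), each message bit injected straight
    into the register state; the tag is the key-masked final state."""
    s = np.array(state0, dtype=np.uint8)
    v = np.zeros(len(s), dtype=np.uint8); v[0] = 1     # injection vector
    for m in message_bits:
        s = (A @ s + m * v) % 2                        # linear update + injection
    return (s + np.array(key_bits, dtype=np.uint8)) % 2

# Companion matrix of x^4 + x + 1 as the register update (illustrative).
A = np.array([[0, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 0, 0],
              [0, 0, 1, 0]], dtype=np.uint8)
tag = mac_direct_injection([1, 0, 1, 1, 0, 0, 1], A, [0, 0, 0, 1], [1, 0, 1, 0])
```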
Abstract:
Bovine intestine samples were dried in a heat-pump fluidized-bed dryer equipped with a continuous monitoring system, at atmospheric pressure and at temperatures below and above the material's freezing point. The investigation of the drying characteristics was conducted over the temperature range −10 to 25 °C and airflow velocities in the range 1.5 to 2.5 m/s. Some experiments were conducted as single-temperature drying experiments, and others as two-stage drying experiments employing two temperatures. An Arrhenius-type equation was used to interpret the influence of the drying-air parameters on the effective diffusivity, calculated with the method of slopes, in terms of activation energy, which was found to be sensitive to temperature. The effective diffusion coefficient of moisture transfer was determined by the Fickian method, assuming one-dimensional moisture movement, for both moisture removal by evaporation and by combined sublimation and evaporation. Correlations expressing the effective moisture diffusivity as a function of drying temperature are reported.
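The two fitting steps mentioned above — the method of slopes for the effective diffusivity and the Arrhenius interpretation of its temperature dependence — can be sketched in Python as follows (slab geometry and the first-term Fick series are standard assumptions; this is not the authors' exact procedure):

```python
import numpy as np

def d_eff_method_of_slopes(t, moisture_ratio, half_thickness):
    """Method of slopes for an infinite slab: from the first Fick-series term
    MR = (8/pi^2) exp(-pi^2 D_eff t / (4 L^2)), the slope k of ln(MR)
    versus t gives D_eff = -4 k L^2 / pi^2."""
    k = np.polyfit(t, np.log(moisture_ratio), 1)[0]
    return -4.0 * k * half_thickness ** 2 / np.pi ** 2

def arrhenius_fit(T_kelvin, d_eff):
    """Arrhenius-type equation D_eff = D0 exp(-Ea / (R T)):
    a linear fit of ln(D_eff) against 1/T yields Ea and D0."""
    R = 8.314  # J/(mol K)
    slope, intercept = np.polyfit(1.0 / np.asarray(T_kelvin), np.log(d_eff), 1)
    return -slope * R, np.exp(intercept)   # (activation energy Ea, D0)
```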