209 results for Simplified adhesive
Abstract:
The mean action time is the mean of a probability density function that can be interpreted as a critical time, which is a finite estimate of the time taken for the transient solution of a reaction-diffusion equation to effectively reach steady state. For high-variance distributions, the mean action time under-approximates the critical time since it neglects to account for the spread about the mean. We can improve our estimate of the critical time by calculating the higher moments of the probability density function, called the moments of action, which provide additional information regarding the spread about the mean. Existing methods for calculating the nth moment of action require the solution of n nonhomogeneous boundary value problems which can be difficult and tedious to solve exactly. Here we present a simplified approach using Laplace transforms which allows us to calculate the nth moment of action without solving this family of boundary value problems and also without solving for the transient solution of the underlying reaction-diffusion problem. We demonstrate the generality of our method by calculating exact expressions for the moments of action for three problems from the biophysics literature. While the first problem we consider can be solved using existing methods, the second problem, which is readily solved using our approach, is intractable using previous techniques. The third problem illustrates how the Laplace transform approach can be used to study coupled linear reaction-diffusion equations.
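The Laplace-transform shortcut described above rests on a standard identity, sketched here with assumed notation (not quoted from the paper). If $f(x,t)$ denotes the probability density function whose mean is the mean action time, the $n$th moment of action follows from the transform $\bar{f}(x,s)$ by differentiation at $s=0$:

```latex
M_n(x) \;=\; \int_0^{\infty} t^{\,n} f(x,t)\,\mathrm{d}t
       \;=\; (-1)^n \left. \frac{\partial^n \bar{f}(x,s)}{\partial s^n} \right|_{s=0},
\qquad
\bar{f}(x,s) \;=\; \int_0^{\infty} \mathrm{e}^{-st} f(x,t)\,\mathrm{d}t .
```

Because the Laplace transform of a linear reaction-diffusion equation satisfies an ordinary differential equation in $x$ with $s$ as a parameter, expanding $\bar{f}(x,s)$ about $s=0$ yields the moments without solving the family of $n$ nonhomogeneous boundary value problems or the transient problem itself.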
Abstract:
The biosafety of carbon nanomaterials needs to be critically evaluated with both experimental and theoretical validation before extensive biomedical application. In this letter, we present an analysis of the binding ability of two-dimensional monolayer carbon nanomaterials to actin by molecular simulation, to understand their adhesive characteristics on the F-actin cytoskeleton. The modelling results indicate that positively charged carbon nanomaterials have higher binding stability on actin. Compared to crystalline graphene, graphene oxide shows a stronger binding influence on actin when carrying positive surface charge. This theoretical investigation provides insights into the sensitivity of actin-related cellular activities to carbon nanomaterials.
Abstract:
A business process is often modeled using some kind of directed flow graph, which we call a workflow graph. The Refined Process Structure Tree (RPST) is a technique for workflow graph parsing, i.e., for discovering the structure of a workflow graph, which has various applications. In this paper, we provide two improvements to the RPST. First, we propose an alternative way to compute the RPST that is simpler than the one developed originally. In particular, the computation reduces to constructing the tree of the triconnected components of a workflow graph in the special case when every node has at most one incoming or at most one outgoing edge. Such graphs occur frequently in applications. Second, we extend the applicability of the RPST. Originally, the RPST was applicable only to graphs with a single source and single sink such that the completed version of the graph is biconnected. We lift both restrictions. As a result, the RPST is applicable to arbitrary directed graphs in which every node is on a path from some source to some sink. This includes graphs with multiple sources and/or sinks and disconnected graphs.
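The special case mentioned in the abstract is easy to test mechanically. A minimal sketch (graph representation and function name are illustrative, not from the paper) checks whether every node of a directed workflow graph has at most one incoming edge or at most one outgoing edge:

```python
# Check the special case in which, per the abstract, computing the RPST
# reduces to building the tree of triconnected components: every node must
# have at most one incoming edge or at most one outgoing edge.
from collections import Counter

def special_case(edges):
    indeg = Counter(v for _, v in edges)    # incoming-edge counts
    outdeg = Counter(u for u, _ in edges)   # outgoing-edge counts
    nodes = {n for e in edges for n in e}
    return all(indeg[n] <= 1 or outdeg[n] <= 1 for n in nodes)

# A simple split/join workflow: s -> a, s -> b, a -> t, b -> t
print(special_case([("s", "a"), ("s", "b"), ("a", "t"), ("b", "t")]))  # True
# A node with two incoming AND two outgoing edges violates the condition:
print(special_case([("a", "x"), ("b", "x"), ("x", "c"), ("x", "d")]))  # False
```

Split nodes (one in, many out) and join nodes (many in, one out) both pass the test; only nodes that are simultaneously splits and joins fail it.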
Abstract:
In this paper we analyse the role of some of the building blocks of SHA-256. We show that the disturbance-correction strategy is applicable to the SHA-256 architecture, and we prove that the functions Σ and σ are vital for the security of SHA-256 by showing that for a variant without them it is possible to find collisions with a complexity of 2^64 hash operations. As a step towards an analysis of the full function, we present the results of our experiments on the Hamming weights of expanded messages for different variants of the message expansion, and show that low-weight expanded messages exist for XOR-linearised variants.
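The XOR-linearised message expansion mentioned above can be sketched directly: the modular additions in the standard SHA-256 expansion are replaced by XOR, giving a linear map whose low-weight outputs are the objects of interest. The rotation and shift constants below are the standard SHA-256 ones; the helper names are ours.

```python
# XOR-linearised SHA-256 message expansion: additions mod 2^32 replaced by
# XOR. Low-weight expanded messages of this linear code matter for
# collision search, as discussed in the abstract.
MASK = 0xFFFFFFFF

def rotr(x, n):
    return ((x >> n) | (x << (32 - n))) & MASK

def sigma0(x):  # standard SHA-256 small sigma_0
    return rotr(x, 7) ^ rotr(x, 18) ^ (x >> 3)

def sigma1(x):  # standard SHA-256 small sigma_1
    return rotr(x, 17) ^ rotr(x, 19) ^ (x >> 10)

def expand_xor(words):
    """Expand 16 message words to 64, using XOR instead of addition."""
    w = list(words)
    for t in range(16, 64):
        w.append(sigma1(w[t - 2]) ^ w[t - 7] ^ sigma0(w[t - 15]) ^ w[t - 16])
    return w

def hamming_weight(words):
    return sum(bin(x).count("1") for x in words)

# A single-bit difference in the message spreads through the expansion:
w = expand_xor([1] + [0] * 15)
print(hamming_weight(w))
```

Because the expansion is linear over GF(2), the weight of an expanded difference depends only on the input difference, which is why the search for low-weight expanded messages reduces to a coding-theoretic problem.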
Abstract:
A nonlinear interface element modelling method is formulated for predicting the deformation and failure of high-adhesive, thin-layer polymer-mortared masonry exhibiting failure of both units and mortar. Plastic flow vectors are explicitly integrated within the implicit finite element framework instead of relying on predictor–corrector-like approaches. The method is calibrated using experimental data from uniaxial compression, shear triplet and flexural beam tests. The model is validated against a thin-layer mortared masonry shear wall whose experimental datasets are reported in the literature, and is then used to examine the behaviour of thin-layer mortared masonry under biaxial loading.
Abstract:
Objective: To prospectively test two simplified peer review processes, estimate the agreement between the simplified and official processes, and compare the costs of peer review. Design, participants and setting: A prospective parallel study of Project Grant proposals submitted in 2013 to the National Health and Medical Research Council (NHMRC) of Australia. The official funding outcomes were compared with two simplified processes using proposals in Public Health and Basic Science. The two simplified processes were: panels of 7 reviewers who met face-to-face and reviewed only the nine-page research proposal and track record (simplified panel); and 2 reviewers who independently reviewed only the nine-page research proposal (journal panel). The official process used panels of 12 reviewers who met face-to-face and reviewed longer proposals of around 100 pages. We compared the funding outcomes of 72 proposals that were peer reviewed by the simplified and official processes. Main outcome measures: Agreement in funding outcomes; costs of peer review based on reviewers’ time and travel costs. Results: The agreement between the simplified and official panels (72%, 95% CI 61% to 82%), and the journal and official panels (74%, 62% to 83%), was just below the acceptable threshold of 75%. Using the simplified processes would save $A2.1–$A4.9 million per year in peer review costs. Conclusions: Using shorter applications and simpler peer review processes gave reasonable agreement with the more complex official process. Simplified processes save time and money that could be reallocated to actual research. Funding agencies should consider streamlining their application processes.
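The agreement statistic reported above is straightforward to reproduce in outline. A small sketch (not the study's code; the agreement count of 52/72 is an assumed value chosen to match the reported 72%) computes percent agreement with a normal-approximation 95% confidence interval, which roughly reproduces the reported 72% (61% to 82%):

```python
# Percent agreement between two sets of funding decisions with a
# normal-approximation 95% CI. The count agree=52 out of n=72 is an
# assumption for illustration, consistent with the reported 72%.
import math

def agreement_ci(agree, n, z=1.96):
    p = agree / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of a proportion
    return p, p - z * se, p + z * se

p, ci_lo, ci_hi = agreement_ci(agree=52, n=72)
print(f"{p:.0%} (95% CI {ci_lo:.0%} to {ci_hi:.0%})")
```

With only 72 proposals the interval is wide, which is why the observed agreement sits "just below" rather than decisively under the 75% threshold.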
Abstract:
Raman spectroscopic analyses of fragmented wall-painting specimens from a Romano-British villa dating from ca. 200 AD are reported. The predominant pigment is red haematite, to which carbon, chalk and sand have been added to produce colour variations, applied to a typical Roman limewash putty composition. Other pigment colours are identified as white chalk, yellow (goethite), grey (soot/chalk mixture) and violet. The latter pigment is ascribed to caput mortuum, a rare form of haematite, to which kaolinite (possibly from Cornwall) has been added, presumably in an effort to increase the adhesive properties of the pigment to the substratum. This is the first time that kaolinite has been reported in this context and could indicate the successful application of an ancient technology discovered by the Romano-British artists. Supporting evidence for the Raman data is provided by X-ray diffraction and SEM-EDAX analyses of the purple pigment.
Abstract:
Experiments were undertaken to study the drying kinetics of moist cylindrical food particulates during fluidised bed drying. Cylindrical particles were prepared from green beans with three different length:diameter ratios: 3:1, 2:1 and 1:1. A batch fluidised bed dryer connected to a heat pump system was used for the experiments; the heat pump and fluid bed combination was used to increase overall energy efficiency and achieve higher drying rates. Drying kinetics were evaluated with non-dimensional moisture at three drying temperatures of 30, 40 and 50 °C. Numerous mathematical models can be used to describe drying kinetics, ranging from analytical models with simplified assumptions to empirical models built by regression on experimental data. Empirical models are commonly used for various food materials because of their simpler approach; however, problems with accuracy limit their application. Some of the limitations of empirical models can be reduced by using semi-empirical models based on the heat and mass transfer of the drying operation; one such method is the quasi-stationary approach. In this study, a modified quasi-stationary approach was used to model the drying kinetics of the cylindrical food particles at the three drying temperatures.
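To make the modelling idea concrete, here is a minimal semi-empirical drying-kinetics sketch. It fits the simple exponential (Newton/Lewis) model MR = exp(-k t), not the paper's modified quasi-stationary approach, and the data below are synthetic, for illustration only:

```python
# Fit the drying constant k of the Newton/Lewis model MR = exp(-k t)
# from non-dimensional moisture-ratio data, by least squares on ln(MR)
# through the origin. Synthetic data, illustrative only.
import math

def fit_drying_constant(times, moisture_ratios):
    """Least-squares slope of ln(MR) vs t through the origin gives -k."""
    num = sum(t * math.log(mr) for t, mr in zip(times, moisture_ratios))
    den = sum(t * t for t in times)
    return -num / den

times = [0.0, 10.0, 20.0, 30.0, 40.0]       # minutes (synthetic)
mr = [math.exp(-0.05 * t) for t in times]   # synthetic data with k = 0.05
print(fit_drying_constant(times, mr))       # recovers ~0.05
```

Semi-empirical approaches such as the quasi-stationary method go further by tying parameters like k to the heat- and mass-transfer conditions of the dryer, rather than treating them as pure regression coefficients.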
Abstract:
This paper presents a detailed description of the influence of the critical parameters that govern the vulnerability of columns under lateral impact loads. Numerical simulations are conducted using the Finite Element program LS-DYNA, incorporating steel reinforcement, material models and strain rate effects. A simplified method based on the impact pulse generated from full-scale impact tests is used for impact reconstruction, and the effects of the various pulse loading parameters are investigated under low- to medium-velocity impacts. A constitutive material model that can simulate failure under a tri-axial state of stress is used for concrete. Confinement effects are also introduced into the numerical simulation, and columns of Grade 30 to 50 concrete under pure axial loading are analysed in detail. This research confirmed that the vulnerability of axially loaded columns can be mitigated by reducing the slenderness ratio and concrete grade, and by choosing the design option with a minimal amount of longitudinal steel. Additionally, it is evident that approximately a 50% increase in impact capacity can be gained for columns in medium-rise buildings by enhancing the confinement effects alone. Results also indicated that the ductility, as well as the mode of failure under impact, can be changed with the volumetric ratio of lateral steel. Moreover, to increase the impact capacity of vulnerable columns, a higher confining stress is required. The general provisions of current design codes do not sufficiently cover this aspect, and hence this research provides additional guidelines to overcome the inadequacies of code provisions.
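An impact pulse of the kind used for such reconstructions is often idealised as a simple force-time shape. The sketch below uses a triangular pulse with invented parameter values; it illustrates the parameterisation only and is not the paper's pulse model:

```python
# Idealised triangular impact pulse: force rises linearly to f_peak at
# t_rise, then decays linearly to zero at t_end. All values illustrative.
def triangular_pulse(t, f_peak, t_rise, t_end):
    if 0.0 <= t <= t_rise:
        return f_peak * t / t_rise
    if t_rise < t <= t_end:
        return f_peak * (t_end - t) / (t_end - t_rise)
    return 0.0

# The impulse (area under the pulse) is 0.5 * f_peak * t_end for a
# triangle; a numerical sum over the pulse should reproduce it:
dt = 1e-4
impulse = sum(triangular_pulse(i * dt, 100.0, 0.002, 0.010) * dt
              for i in range(int(0.012 / dt)))
print(impulse)  # ~0.5 (kN*s for a 100 kN peak over 10 ms)
```

Varying the peak force, rise time and duration while holding the impulse fixed is one way the influence of pulse loading parameters can be studied.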
Abstract:
Real-Time Kinematic (RTK) positioning is a technique used to provide precise positioning services at centimetre-level accuracy in the context of Global Navigation Satellite Systems (GNSS). While a Network-based RTK (NRTK) system involves multiple continuously operating reference stations (CORS), the simplest form of an NRTK system is a single-base RTK. In Australia there are several NRTK services operating in different states and over 1000 single-base RTK systems to support precise positioning applications for surveying, mining, agriculture, and civil construction in regional areas. Additionally, future-generation GNSS constellations, including modernised GPS, Galileo, GLONASS, and Compass, with multiple frequencies, have either been developed or will become fully operational in the next decade. A trend in the future development of RTK systems is to make use of the various isolated operating networks, single-base RTK systems and multiple GNSS constellations for extended service coverage and improved performance. Several computational challenges have been identified for future NRTK services, including:
• Multiple GNSS constellations and multiple frequencies
• Large-scale, wide-area NRTK services with a network of networks
• Complex computation algorithms and processes
• A greater part of the positioning processes shifting from the user end to the network centre, with the ability to cope with hundreds of simultaneous users' requests (reverse RTK)
There are two major requirements for NRTK data processing arising from these four challenges: expandable computing power and scalable data sharing/transferring capability. This research explores new approaches to addressing these future NRTK challenges and requirements using Grid Computing facilities, in particular for large data-processing burdens and complex computation algorithms.
A Grid Computing based NRTK framework is proposed in this research: a layered framework consisting of 1) a client layer in the form of a Grid portal; 2) a service layer; and 3) an execution layer. The user's request is passed through these layers and scheduled to different Grid nodes in the network infrastructure. A proof-of-concept demonstration of the proposed framework is performed in a five-node Grid environment at QUT and also on Grid Australia. The Networked Transport of RTCM via Internet Protocol (Ntrip) open-source software is adopted to download real-time RTCM data from multiple reference stations through the Internet, followed by job scheduling and simplified RTK computing. The system performance has been analysed, and the results preliminarily demonstrate the concepts and functionality of the new NRTK framework based on Grid Computing, while some aspects of the system's performance are yet to be improved in future work.
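The scheduling step in the execution layer can be sketched in miniature. This is a toy illustration of the idea only: node and job names are hypothetical, and the real framework dispatches through Grid middleware rather than a round-robin loop:

```python
# Toy sketch: distribute incoming NRTK computation jobs round-robin over
# the available Grid nodes. Node/job names are hypothetical.
from itertools import cycle

def schedule(jobs, nodes):
    assignment = {}
    ring = cycle(nodes)          # endlessly cycle through the node list
    for job in jobs:
        assignment[job] = next(ring)
    return assignment

jobs = [f"rtk-job-{i}" for i in range(5)]
nodes = ["node-1", "node-2", "node-3", "node-4", "node-5"]
print(schedule(jobs, nodes))
```

In a real deployment the scheduler would also weigh node load and data locality (which reference stations a node already streams), rather than assigning purely by turn.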
Abstract:
Research has noted a 'pronounced pattern of increase with increasing remoteness' in death rates from road crashes. However, crash characteristics by remoteness are not commonly or consistently reported, with definitions of rural and urban often relying on proxy representations such as the prevailing speed limit. The current paper seeks to evaluate the efficacy of the Accessibility/Remoteness Index of Australia (ARIA+) for identifying trends in road crashes. ARIA+ does not rely on road-specific measures; it uses distances to populated centres to attribute a score to an area, and these scores can in turn be grouped into five classifications of increasing remoteness. The current paper applies these classifications at the broad level of the Australian Bureau of Statistics' Statistical Local Areas, thus avoiding precise crash locating or dedicated mapping software. Analyses used Queensland road crash database details for all 31,346 crashes resulting in a fatality or hospitalisation between 1 July 2001 and 30 June 2006 inclusive. Results showed that this simplified application of ARIA+ aligned with previous definitions such as speed limit, while also providing further delineation. Differences in crash contributing factors were noted with increasing remoteness, such as a greater representation of alcohol and 'excessive speed for the circumstances'. Other factors, such as the predominance of younger drivers in crashes, differed little by remoteness classification. The results are discussed in terms of the utility of remoteness as a graduated rather than binary (rural/urban) construct and the potential for combining ARIA crash data with census and hospital datasets.
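The grouping of ARIA+ scores into five remoteness classes is a simple binning step. The cut-off values below are the commonly cited ABS thresholds, quoted from memory and not taken from the paper, so treat them as approximate:

```python
# Illustrative binning of ARIA+ scores into the five remoteness classes.
# Cut-offs are approximate ABS values, not quoted from the paper.
CUTOFFS = [
    (0.20, "Major Cities"),
    (2.40, "Inner Regional"),
    (5.92, "Outer Regional"),
    (10.53, "Remote"),
    (float("inf"), "Very Remote"),
]

def remoteness_class(aria_score):
    for upper, label in CUTOFFS:
        if aria_score <= upper:
            return label

print(remoteness_class(0.1))   # Major Cities
print(remoteness_class(7.0))   # Remote
```

Because each Statistical Local Area carries one class label, crash records can be tallied by class without any precise crash locating, which is exactly the simplification the paper exploits.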
Abstract:
The application of object-based approaches to the problem of extracting vegetation information from images requires accurate delineation of individual tree crowns. This paper presents an automated method for individual tree crown detection and delineation that applies a simplified PCNN model in spectral feature space, followed by post-processing using morphological reconstruction. The algorithm was tested on high-resolution multi-spectral aerial images, and the results are compared with those of two existing image segmentation algorithms. The results demonstrate that our algorithm outperforms the other two solutions, with an average accuracy of 81.8%.
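A pulse-coupled neural network (PCNN) can be sketched in a few lines. The following minimal simplified-PCNN is pure Python with invented parameters, operating on raw intensities rather than the spectral feature space of the paper; it shows the characteristic behaviour that brighter pixels pulse in earlier iterations:

```python
# Minimal simplified PCNN: each neuron's activity is its stimulus
# modulated by its 4-neighbours' previous pulses; a decaying threshold
# with a large post-fire boost makes bright pixels pulse first.
def pcnn(stim, beta=0.2, theta0=0.5, decay=0.5, v_theta=5.0, steps=3):
    h, w = len(stim), len(stim[0])
    y = [[0] * w for _ in range(h)]              # pulses from previous step
    theta = [[theta0] * w for _ in range(h)]     # dynamic thresholds
    fired_at = [[None] * w for _ in range(h)]    # first-pulse iteration
    for n in range(1, steps + 1):
        link = [[sum(y[i + di][j + dj]
                     for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                     if 0 <= i + di < h and 0 <= j + dj < w)
                 for j in range(w)] for i in range(h)]
        new_y = [[0] * w for _ in range(h)]
        for i in range(h):
            for j in range(w):
                u = stim[i][j] * (1.0 + beta * link[i][j])  # modulated activity
                theta[i][j] = decay * theta[i][j] + v_theta * y[i][j]
                if u > theta[i][j]:
                    new_y[i][j] = 1
                    if fired_at[i][j] is None:
                        fired_at[i][j] = n
        y = new_y
    return fired_at

stim = [[0.9, 0.9, 0.1],
        [0.9, 0.9, 0.1],
        [0.1, 0.1, 0.1]]
print(pcnn(stim))  # the bright 2x2 block pulses at step 1, dark pixels later
```

Pixels that pulse in the same iteration form candidate segments; in the paper's pipeline such pulse maps are then cleaned up by morphological reconstruction.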
Abstract:
"For every complex problem there is a solution that is simple, neat and wrong" (H.L. Mencken, US writer and social commentator). Nowhere is this quote more apt than when applied to finding over-simplified solutions to the complex problem of looking after the safety and well-being of vulnerable children. The easiest formula is, of course, to 'rescue children from dysfunctional families', a line taken recently in the monograph by the right-wing think tank, the Centre for Independent Studies (Sammut & O'Brien 2009). It is reasoning with fatal flaws. This commentary provides a timely reminder of the strong arguments behind the national and international shift towards supporting children and families through universal and specialist community-based services, rather than weighting all resources towards statutory child protection interventions. A brief outline of the value of developing resources to support children in their families, and of the problems with 'rescuing' children through the child protection system, is provided.
Abstract:
Grid music systems provide discrete geometric methods for simplified music-making by providing spatialised input to construct patterned music on a 2D matrix layout. While they are conceptually simple, grid systems may be layered to enable complex and satisfying musical results. Grid music systems have been implemented on a range of platforms, from small portable devices up to larger installations. In this paper we discuss the use of grid music systems in general and present an overview of the HarmonyGrid system we have developed as a new interactive performance system. We discuss a range of issues related to the design and use of larger-scale grid-based interactive performance systems such as the HarmonyGrid.
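The core grid-to-music mapping can be illustrated in a few lines. This toy sketch is not the HarmonyGrid code: each row of a 2D matrix is a pitch, each column a time step, and filled cells become note events; the C-major row-to-pitch assignment is our assumption:

```python
# Toy step-sequencer reading of a 2D grid: active cells become
# (time_step, pitch) events, column by column. Pitch mapping assumed.
PITCHES = ["C4", "D4", "E4", "G4"]  # one pitch per grid row (assumption)

def grid_to_events(grid):
    """Return (time_step, pitch) events for every active cell."""
    events = []
    for col in range(len(grid[0])):
        for row, pitch in enumerate(PITCHES):
            if grid[row][col]:
                events.append((col, pitch))
    return events

pattern = [
    [1, 0, 0, 1],   # C4
    [0, 1, 0, 0],   # D4
    [0, 0, 1, 0],   # E4
    [0, 0, 0, 1],   # G4
]
print(grid_to_events(pattern))
# [(0, 'C4'), (1, 'D4'), (2, 'E4'), (3, 'C4'), (3, 'G4')]
```

Layering several such grids, each driving a different instrument or harmonic role, is one way conceptually simple matrices yield the richer results the abstract describes.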
Abstract:
The population Monte Carlo algorithm is an iterative importance sampling scheme for solving static problems. We examine the algorithm in a simplified setting, a single step of the general algorithm, and study a fundamental problem that occurs when applying importance sampling to high-dimensional problems. The precision of the estimate computed in this simplified setting is measured by its asymptotic variance, under conditions on the importance function. We demonstrate the exponential growth of the asymptotic variance with the dimension, and show that the optimal covariance matrix for the importance function can be estimated in special cases.
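The exponential growth of the variance with dimension can be seen in a closed-form example (our own illustration, not the paper's analysis): for a d-dimensional standard-normal target and an N(0, s²I) proposal, the relative variance of the importance weights is c(s)^d - 1 with c(s) > 1 whenever s ≠ 1:

```python
# Relative variance of importance weights w = p/q for p = N(0, I_d) and
# q = N(0, s^2 I_d): per dimension the second moment of w under q is
# c(s) = s / sqrt(2 - 1/s^2)  (finite only for s^2 > 1/2), so the
# relative variance c(s)^d - 1 grows exponentially with the dimension d.
import math

def relative_variance(d, s):
    c = s / math.sqrt(2.0 - 1.0 / s**2)   # per-dimension factor
    return c**d - 1.0

for d in (1, 10, 100):
    print(d, relative_variance(d, s=math.sqrt(2.0)))
```

Even a mildly mismatched proposal (here s² = 2, so c ≈ 1.155) is catastrophic at d = 100, which is the high-dimensional difficulty the abstract studies and which motivates adapting the proposal covariance.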