Abstract:
Acid hydrolysis is a popular pretreatment for removing hemicellulose from lignocelluloses in order to produce a digestible substrate for enzymatic saccharification. In this work, a novel model for the dilute acid hydrolysis of hemicellulose within sugarcane bagasse is presented and calibrated against experimental oligomer profiles. The efficacy of mathematical models as hydrolysis yield predictors and as vehicles for investigating the mechanisms of acid hydrolysis is also examined. Experimental xylose, oligomer (degree of polymerisation 2 to 6) and furfural yield profiles were obtained for bagasse under dilute acid hydrolysis conditions at temperatures ranging from 110°C to 170°C. Population balance kinetics, diffusion and porosity evolution were incorporated into a mathematical model of the acid hydrolysis of sugarcane bagasse. This model was able to produce a good fit to experimental xylose yield data with only three unknown kinetic parameters, $k_a$, $k_b$ and $k_d$. However, fitting this same model to an expanded data set of oligomeric and furfural yield profiles did not successfully reproduce the experimental results. It was found that a ``hard-to-hydrolyse'' parameter, $\alpha$, was required in the model to reproduce the experimental oligomer profiles at 110°C, 125°C and 140°C. The parameters obtained through the fitting exercises at lower temperatures could then be used to predict the oligomer profiles at 155°C and 170°C with promising results. The interpretation of kinetic parameters obtained by fitting a model to only a single set of data may be ambiguous. Although these parameters may correctly reproduce the data, they may not be indicative of the actual rate parameters unless some care has been taken to ensure that the model describes the true mechanisms of acid hydrolysis. The robustness of the model can be challenged by expanding the experimental data set and hence limiting the parameter space for the fitting parameters.
The novel combination of ``hard-to-hydrolyse'' and population balance dynamics in the model presented here appears to stand up to such rigorous fitting constraints.
Abstract:
Underground transport tunnels are vulnerable to blast events. This paper develops and applies a fully coupled technique combining the Smoothed Particle Hydrodynamics (SPH) and Finite Element (FE) methods to investigate the blast response of segmented bored tunnels. Findings indicate that several bolts failed in the longitudinal direction due to redistribution of blast loading to adjacent tunnel rings. The tunnel segments responded as arch mechanisms in the transverse direction and suffered damage mainly due to high bending stresses. The novel information from the present study will enable safer designs of buried tunnels and provide a benchmark reference for future developments in this area.
Abstract:
Producers, technicians, performers, audiences and critics are all critical components of the performing arts ecology – critical components of an ecosystem that have to come together into some sort of productive relationship if the performing arts are to be vital, viable and successful. Different performance practices developed in different times, spaces and places do, of course, connect these players in different ways as part of their attempt to achieve their own definition of success, be it based on entertainment, education, expression, empowerment, or something else. In some contemporary performance practices, social media platforms, applications and processes are seen to have significant potential to restore balance to the relationship between performer and audience, providing audiences with more power to participate in a performance event. In this paper, I investigate prevailing assumptions about social media’s power to democratise performance practice, or, at least, to develop more co-creative performance practices in which producers, performers and audiences participate actively before, during and after the event. I focus, in particular, on the use of social media as a means of developing a participatory aesthetic in which an audience member is asked to contribute to the cast of characters, plot or progression of a performance. Although diverse – from performances streamed online, to performances that offer transmedia components the audience can use to learn more about character, context and plot online, to performances that incorporate online voting, liking or linking, to performances that unfold fully online on websites, blogs, microblogs or other social media platforms – what many uses of social media in contemporary performance share is a desire to encourage audiences to reflect on their role in making, and making meaning of, the event.
In this paper I interrogate if, and if so how, this democratises or develops deeper levels of co-creativity in the relationship between producers, performers and audiences.
Abstract:
For wind farm optimizations with lands belonging to different owners, the traditional penalty method is highly dependent on the type of wind farm land division, and its application can be cumbersome if the divisions are complex. To overcome this disadvantage, a new method is proposed in this paper. Unlike the penalty method, which requires adding a penalizing term when evaluating the fitness function, the new method repairs infeasible solutions before fitness evaluation. To assess the effectiveness of the proposed method on the optimization of wind farms, the optimization results of the different methods are compared for three different types of wind farm division. Different wind scenarios are also incorporated during optimization: (i) constant wind speed and wind direction; (ii) variable wind speed and wind direction; and (iii) the more realistic Weibull distribution. Results show that the performance of the new method varies for different land plots in the tested cases. Nevertheless, optimum, or at least close-to-optimum, results can be obtained with sequential land plot study using the new method in all cases. It is concluded that satisfactory results can be achieved using the proposed method. In addition, it offers flexibility in managing the wind farm design: it not only frees users from defining the penalty parameter but also imposes no limitations on the wind farm division.
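The abstract does not specify the repair operator itself; as a purely illustrative sketch (the function names and the clamp-to-nearest-plot repair are assumptions, not the paper's actual operator), repairing an infeasible turbine position before fitness evaluation might look like this:

```python
def in_owned_land(pos, plots):
    """Check whether a position falls inside any permitted land plot.
    Each plot is an axis-aligned rectangle (x0, y0, x1, y1)."""
    return any(x0 <= pos[0] <= x1 and y0 <= pos[1] <= y1
               for (x0, y0, x1, y1) in plots)

def repair(pos, plots):
    """Move an infeasible turbine position to the nearest point inside a
    permitted plot (a simple clamp-based repair, assumed for illustration)."""
    if in_owned_land(pos, plots):
        return pos
    best, best_d = None, float("inf")
    for (x0, y0, x1, y1) in plots:
        cx = min(max(pos[0], x0), x1)  # clamp x into the plot
        cy = min(max(pos[1], y0), y1)  # clamp y into the plot
        d = (cx - pos[0]) ** 2 + (cy - pos[1]) ** 2
        if d < best_d:
            best, best_d = (cx, cy), d
    return best
```

Repaired positions are then passed to the fitness function unchanged, which is why no penalty term is needed.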
Abstract:
With the extensive use of rating systems on the web, and their significance in users' decision-making processes, the need for more accurate aggregation methods has emerged. The naïve aggregation method, using the simple mean, is no longer adequate for providing accurate reputation scores for items [6]; hence, several studies have been conducted to provide more accurate alternative aggregation methods. Most current reputation models do not consider the distribution of ratings across the different possible rating values. In this paper, we propose a novel reputation model which generates more accurate reputation scores for items by deploying the normal distribution over ratings. Experiments show promising results for our proposed model over state-of-the-art ones on both sparse and dense datasets.
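The abstract does not give the exact weighting scheme, so the following is only one plausible reading of "deploying the normal distribution over ratings" (an illustrative assumption, not the paper's model): weight each rating by a Gaussian centred on the mean, so that outlying ratings contribute less than in the naive mean.

```python
import math

def normal_weighted_reputation(ratings):
    """Aggregate ratings with weights drawn from a normal pdf centred on
    the simple mean; outlying ratings are down-weighted relative to the
    naive mean.  Illustrative sketch only."""
    n = len(ratings)
    mean = sum(ratings) / n
    var = sum((r - mean) ** 2 for r in ratings) / n or 1e-9  # avoid /0
    weights = [math.exp(-((r - mean) ** 2) / (2 * var)) for r in ratings]
    return sum(w * r for w, r in zip(weights, ratings)) / sum(weights)
```

For unanimous ratings this reduces to the simple mean; for a majority of high ratings with one low outlier, the score is pulled toward the majority.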
Abstract:
Not a lot is known about most mental illness. Its triggers can rarely be established, nor can its aetiological dynamics, so it is hardly surprising that the accepted treatments for most mental illnesses are really strategies to manage the most overt symptoms. But with such a dearth of knowledge, how can sound decisions be made about psychiatric interventions, especially given time and budgetary restrictions? This paper introduces a method extrapolated from Salutogenics, the psycho-social theory of health introduced by Antonovsky in 1987. The method takes a normative stance (that psychiatric health care is for the betterment of psychiatric patients) and applies it to any context where there is a dearth of workable knowledge. In lieu of guiding evidence, the method identifies reasonable alternatives on the fly, enabling rational decisions to be made quickly with limited resources.
Abstract:
This paper deals with a finite element modelling method for thin layer mortared masonry systems. In this method, the mortar layers, including the interfaces, are represented using a zero-thickness interface element, and the masonry units are modelled using an elasto-plastic, damaging solid element. The interface element is formulated using two regimes: (i) shear-tension and (ii) shear-compression. In the shear-tension regime, failure of the joint is considered through an elliptical failure criterion, and in shear-compression it is considered through a Mohr-Coulomb type failure criterion. An explicit integration scheme is used within an implicit finite element framework for the formulation of the interface element. The model is calibrated against an experimental dataset from a thin layer mortared masonry prism subjected to uniaxial compression, a triplet subjected to shear loads and a beam subjected to flexural loads, and is then used to predict the response of thin layer mortared masonry wallettes under orthotropic loading. The model is found to simulate the behaviour of a thin layer mortared masonry shear wall tested under pre-compression and in-plane shear quite adequately. The model is also shown to reproduce the failure of masonry panels under uniform biaxial states of stress.
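The two-regime joint failure check can be sketched as follows; the material constants, the shape of the elliptical cap, and the sign convention are illustrative assumptions, not the paper's calibrated values.

```python
import math

def joint_fails(sigma_n, tau, c=0.3, phi=0.6, f_t=0.2):
    """Two-regime failure check for a zero-thickness mortar interface.

    sigma_n: normal stress (tension positive), tau: shear stress.
    Shear-tension (sigma_n > 0): elliptical criterion in (sigma_n, tau).
    Shear-compression (sigma_n <= 0): Mohr-Coulomb,
        |tau| >= c - sigma_n * tan(phi)  means failure.
    c (cohesion), phi (friction angle) and f_t (tensile strength)
    are illustrative values only."""
    if sigma_n > 0:
        tau_0 = c  # shear strength at zero normal stress (assumed)
        return (sigma_n / f_t) ** 2 + (tau / tau_0) ** 2 >= 1.0
    return abs(tau) >= c - sigma_n * math.tan(phi)
```

In a real implementation this check would drive the plastic/damage update of the interface element rather than return a bare boolean.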
Abstract:
Particle swarm optimization (PSO), a population-based algorithm, has recently been applied to multi-robot systems. Although the algorithm is used to solve many optimization problems, including in multi-robot systems, it has some drawbacks when applied to multi-robot search for a target in a space containing large static obstacles. One of these defects is premature convergence: a property of basic PSO is that particles spread across a search space tend, over time, to converge into a small area. This shortcoming is also evident in a multi-robot search system, particularly when large static obstacles prevent the robots from finding the target easily; as time increases, the robots converge to a small area that may not contain the target and become entrapped there. Another shortcoming is that basic PSO cannot guarantee global convergence: initially particles explore different areas, but in some cases they are poor at exploiting promising areas, which increases the search time. This study proposes a PSO-based method for a multi-robot system searching for a target in a space containing large static obstacles. The method not only overcomes the premature convergence problem but also establishes an efficient balance between exploration and exploitation and guarantees global convergence, reducing the search time by combining PSO with a local search method such as A-star. To validate the effectiveness and usefulness of the algorithms, a simulation environment has been developed for conducting simulation-based experiments in different scenarios and reporting experimental results. These results demonstrate that the proposed method overcomes the premature convergence problem and guarantees global convergence.
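The basic PSO update whose premature-convergence behaviour the abstract describes can be sketched in its standard textbook form (parameter values are typical choices, not the paper's modified algorithm):

```python
import random

def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One iteration of basic PSO.  Each particle is attracted toward its
    personal best (pbest) and the swarm's global best (gbest).  The shared
    pull toward gbest is what can collapse the whole swarm into a small
    region (premature convergence), e.g. behind a large obstacle."""
    for i in range(len(positions)):
        for d in range(len(positions[i])):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - positions[i][d])
                                + c2 * r2 * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
    return positions, velocities
```

A particle already sitting at both its personal and the global best receives zero velocity update, which illustrates why the swarm stagnates once it has collapsed.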
Abstract:
Background and purpose: There are no published studies on the parameterisation and reliability of the single-leg stance (SLS) test with inertial sensors in stroke patients. Purpose: to analyse the reliability (intra-observer/inter-observer) and sensitivity of inertial sensors used for the SLS test in stroke patients. Secondary objective: to compare the records of the two inertial sensors (trunk and lumbar) to detect any significant differences in the kinematic data obtained in the SLS test. Methods: Design: cross-sectional study. While performing the SLS test, two inertial sensors were placed at the lumbar (L5-S1) and trunk (T7-T8) regions. Setting: Laboratory of Biomechanics (Health Science Faculty, University of Málaga). Participants: four chronic stroke survivors (over 65 years old). Measurements: displacement and velocity; rotation (X-axis), flexion/extension (Y-axis) and inclination (Z-axis); resultant displacement and velocity, $RV = \sqrt{V_x^2 + V_y^2 + V_z^2}$. Along with the SLS kinematic variables, descriptive analyses, differences between sensor locations, and intra-observer and inter-observer reliability were also calculated. Results: Differences between the sensors were significant only for left inclination velocity (p = 0.036) and extension displacement in the non-affected leg with eyes open (p = 0.038). Intra-observer reliability of the trunk sensor ranged from 0.889 to 0.921 for displacement and 0.849 to 0.892 for velocity. Intra-observer reliability of the lumbar sensor was between 0.896 and 0.949 for displacement and 0.873 and 0.894 for velocity. Inter-observer reliability of the trunk sensor was between 0.878 and 0.917 for displacement and 0.847 and 0.884 for velocity. Inter-observer reliability of the lumbar sensor ranged from 0.870 to 0.940 for displacement and 0.863 to 0.884 for velocity.
Conclusion: There were no significant differences between the kinematic records of the two inertial sensors placed at the lumbar and thoracic regions during the SLS test. In addition, inertial sensors have the potential to be reliable, valid and sensitive instruments for kinematic measurements during SLS testing, but further research is needed.
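The resultant velocity used in the measurements above is simply the Euclidean norm of the three axis components:

```python
import math

def resultant_velocity(vx, vy, vz):
    """RV = sqrt(Vx^2 + Vy^2 + Vz^2), combining the rotation (X),
    flexion/extension (Y) and inclination (Z) velocity components."""
    return math.sqrt(vx ** 2 + vy ** 2 + vz ** 2)
```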
Abstract:
Dorsiflexion (DF) of the foot plays an essential role in both balance control and human gait. Electromyography (EMG) and sonomyography (SMG) can provide information on several aspects of muscle function. The aim was to describe a new method for real-time monitoring of muscular activity, as measured using EMG; muscular architecture, as measured using SMG; force, as measured using dynamometry; and kinematic parameters, as measured using inertial sensors (IS), during isometric and isotonic contractions of the foot DF. The present methodology may be clinically relevant because it involves a reproducible procedure which allows the function and structure of the foot DF to be monitored.
Abstract:
We incorporated a new Riemannian fluid registration algorithm into a general MRI analysis method called tensor-based morphometry to map the heritability of brain morphology in MR images from 23 monozygotic and 23 dizygotic twin pairs. All 92 3D scans were fluidly registered to a common template. Voxelwise Jacobian determinants were computed from the deformation fields to assess local volumetric differences across subjects. Heritability maps were computed from the intraclass correlations and their significance was assessed using voxelwise permutation tests. Lobar volume heritability was also studied using the ACE genetic model. The performance of this Riemannian algorithm was compared to a more standard fluid registration algorithm: 3D maps from both registration techniques displayed similar heritability patterns throughout the brain. Power improvements were quantified by comparing the cumulative distribution functions of the p-values generated from both competing methods. The Riemannian algorithm outperformed the standard fluid registration.
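As a simpler companion to the ACE genetic model and intraclass-correlation maps mentioned above (a generic illustration, not the paper's actual fit), heritability can be estimated from monozygotic and dizygotic twin correlations with Falconer's formula:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate of narrow-sense heritability from monozygotic
    and dizygotic twin intraclass correlations: h^2 = 2 * (r_MZ - r_DZ).
    Clipped to [0, 1] because sample correlations are noisy."""
    return max(0.0, min(1.0, 2.0 * (r_mz - r_dz)))
```

Applied voxelwise to Jacobian-determinant maps, this kind of estimate yields a heritability map analogous to the intraclass-correlation maps described in the abstract.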
Abstract:
In structural brain MRI, group differences or changes in brain structures can be detected using Tensor-Based Morphometry (TBM). This method consists of two steps: (1) a non-linear registration step that aligns all of the images to a common template, and (2) a subsequent statistical analysis. The numerous registration methods that have recently been developed differ in their detection sensitivity when used for TBM, and detection power is paramount in epidemiological studies or drug trials. We therefore developed a new fluid registration method that computes the mappings and performs statistics on them in a consistent way, providing a bridge between TBM registration and statistics. We used the Log-Euclidean framework to define a new regularizer that is a fluid extension of the Riemannian elasticity, which assures diffeomorphic transformations. This regularizer constrains the symmetrized Jacobian matrix, also called the deformation tensor. We applied our method to an MRI dataset from 40 fraternal and identical twins to reveal voxelwise measures of average volumetric differences in brain structure for subjects with different degrees of genetic resemblance.
Abstract:
3D registration of brain MRI data is vital for many medical imaging applications. However, purely intensity-based approaches for inter-subject matching of brain structure are generally inaccurate in cortical regions, due to the highly complex network of sulci and gyri, which vary widely across subjects. Here we combine a surface-based cortical registration with a 3D fluid registration for the first time, enabling precise matching of cortical folds while allowing large deformations in the enclosed brain volume that remain diffeomorphic. This greatly improves the matching of anatomy in cortical areas. The cortices are segmented and registered with the FreeSurfer software. The deformation field is first extended to the full 3D brain volume using a 3D harmonic mapping that preserves the matching between cortical surfaces. Finally, these deformation fields are used to initialize a 3D Riemannian fluid registration algorithm that improves the alignment of subcortical brain regions. We validate this method on an MRI dataset from 92 healthy adult twins. Results are compared to those based on volumetric registration without surface constraints; the resulting mean templates resolve consistent anatomical features both subcortically and at the cortex, suggesting that the approach is well suited for cross-subject integration of functional and anatomic data.
Abstract:
We propose in this paper a new method for the mapping of hippocampal (HC) surfaces to establish correspondences between points on HC surfaces and enable localized HC shape analysis. A novel geometric feature, the intrinsic shape context, is defined to capture the global characteristics of HC shapes. Based on this intrinsic feature, an automatic algorithm is developed to detect a set of landmark curves that are stable across the population. The direct map between a source and a target HC surface is then solved as the minimizer of a harmonic energy function defined on the source surface with landmark constraints. For numerical solutions, we compute the map by solving partial differential equations on implicit surfaces. The direct mapping method has two key properties: (1) it is automatic; (2) it is invariant to the pose of HC shapes. In our experiments, we apply the direct mapping method to study temporal changes of HC asymmetry in Alzheimer's disease (AD) using HC surfaces from 12 AD patients and 14 normal controls. Our results show that the AD group has a different trend in temporal changes of HC asymmetry from the group of normal controls. We also demonstrate the flexibility of the direct mapping method by applying it to construct spherical maps of HC surfaces. Spherical harmonics (SPHARM) analysis is then applied, confirming our results on temporal changes of HC asymmetry in AD.