948 results for soft computing methods


Relevance: 30.00%

Abstract:

Hypertrophic cardiomyopathy (HCM) is a cardiovascular disease in which the heart muscle is partially thickened and blood flow is, potentially fatally, obstructed. It is one of the leading causes of sudden cardiac death in young people. Electrocardiography (ECG) and echocardiography (Echo) are the standard tests for identifying HCM and other cardiac abnormalities. The American Heart Association has recommended using a pre-participation questionnaire for young athletes instead of ECG or Echo tests because of the cost and time involved in having an expert cardiologist interpret the results of these tests. Initially we set out to develop a classifier for automated prediction of young athletes’ heart conditions based on the answers to the questionnaire. Classification results and further in-depth analysis using computational and statistical methods indicated significant shortcomings of the questionnaire in predicting cardiac abnormalities. Automated methods for analyzing ECG signals can help reduce cost and save time in the pre-participation screening process by detecting HCM and other cardiac abnormalities. Therefore, the main goal of this dissertation is to identify HCM through computational analysis of 12-lead ECG. ECG signals recorded on one or two leads have been analyzed in the past for classifying individual heartbeats into different types of arrhythmia, as annotated primarily in the MIT-BIH database. In contrast, we classify complete sequences of 12-lead ECGs to assign patients to two groups: HCM vs. non-HCM. The challenges we address include missing ECG waves in one or more leads and the high dimensionality of a large feature set; we address these by proposing imputation and feature-selection methods. We develop heartbeat classifiers employing Random Forests and Support Vector Machines, and propose a method to classify a full 12-lead ECG based on the proportion of its heartbeats classified as HCM. The results of our experiments show that the classifiers developed using our methods perform well in identifying HCM. Thus, the two contributions of this dissertation are the use of computational and statistical methods to discover shortcomings in a current screening procedure, and the development of methods to identify HCM through computational analysis of 12-lead ECG signals.
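As a rough illustration of the two-stage scheme described above, the sketch below trains a heartbeat-level Random Forest and labels a whole 12-lead recording by the proportion of its beats classified as HCM. The synthetic features, labels, helper name `classify_record`, and the 0.5 threshold are all placeholders, not the dissertation's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative data: one row per heartbeat, one label per heartbeat (1 = HCM).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 40))      # per-heartbeat feature vectors
y_train = rng.integers(0, 2, size=1000)    # per-heartbeat HCM labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

def classify_record(heartbeat_features, threshold=0.5):
    """Label a full 12-lead ECG record as HCM if the proportion of its
    heartbeats classified as HCM exceeds the threshold (hypothetical value)."""
    beat_labels = clf.predict(heartbeat_features)
    return beat_labels.mean() > threshold

record = rng.normal(size=(120, 40))        # ~120 beats from one patient
print("HCM" if classify_record(record) else "non-HCM")
```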

Relevance: 30.00%

Abstract:

Non-cognitive skills have caught the attention of current education policy writers in Canada. Within the last 10 years, almost every province has produced a document stressing the importance of supporting non-cognitive skills in K-12 students in the classroom. Although often called different names (such as learning skills, cross-curricular competencies, and 21st Century Skills) and occasionally viewed through different lenses (such as emotional intelligence skills, character skills, and work habits), what unifies non-cognitive skills within the policy documents is the claim that students who are strong in these skills achieve more academically and are more successful in post-secondary endeavors. Though the interest from policy-makers and educators is clear, many questions about non-cognitive skills have yet to be answered. These include: Which skills are the most important for teachers to support in the classroom? What are these skills’ exact contributions to student success? How can teachers best support these skills? Are there currently reliable and valid measures of these skills? These questions are worth answering if Canadian teachers, with already burdened workloads, are expected to support non-cognitive skills in their classrooms; answering them can also begin to untangle the plethora of research within the non-cognitive realm. Without a critical look at the current literature, it is impossible to ensure that these policies are effective in Canadian classrooms, or to see an alignment between research and policy. Upon analysis of Canadian curricula, five non-cognitive skills were found to be the most prevalent across the provinces: Self-Regulation, Collaboration, Initiative, Responsibility, and Creativity. The available research literature was then examined to determine the utility of teaching these skills in the classroom: whether students can improve these skills, whether the skills affect other aspects of students’ lives, and whether there are valid and reliable methods to assess them. Self-Regulation and Initiative were found to have the strongest basis for being implemented in the classroom. Creativity, on the other hand, still requires considerably more justification, both in terms of its impact on students’ lives and of the ability to assess it in the classroom.

Relevance: 30.00%

Abstract:

Major food adulteration and contamination events occur with alarming regularity and are known to be episodic, the question being not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can leave them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This makes the task of deciding which analytical methods are most suitable for collecting and analysing (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable, ideally handheld and/or remote sensor devices, that can be taken to, or positioned on- or at-line at, points of vulnerability along complex food supply networks, and should require a minimum amount of background training to acquire information-rich data rapidly (ergo, point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a future perspective on how this growing sensor portfolio, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is, after all, a problem of systems, and therefore requires systems-level solutions and thinking.

Relevance: 30.00%

Abstract:

This paper describes the implementation of a method for integrating parametric, feature-based CAD models built in commercial software (CATIA) with the SU2 software framework. To exploit adjoint-based methods for aerodynamic optimisation within SU2, a formulation is introduced for obtaining geometric sensitivities directly from the commercial CAD parameterisation, enabling the calculation of gradients with respect to CAD-based design variables. To assess the accuracy and efficiency of the alternative approach, two aerodynamic optimisation problems are investigated: an inviscid 3D problem with multiple constraints, and a viscous 2D high-lift aerofoil problem without constraints. Initial results show the new parameterisation obtaining reliable optima, with performance similar to that of the software's native parameterisations. In the final paper, details of computing the CAD sensitivities will be provided, including their accuracy and the linking of geometric sensitivities to aerodynamic objective functions and constraints; the impact on the robustness of the overall method will be assessed, and alternative parameterisations will be included.
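A minimal sketch of how CAD-based gradients can be assembled by the chain rule: geometric sensitivities of the surface with respect to the design variables are combined with an adjoint surface sensitivity. The `cad_surface` function below is a toy stand-in for the CATIA parameterisation, and finite differences stand in for however the paper actually computes the geometric sensitivities; the adjoint vector is a dummy for SU2's output.

```python
import numpy as np

def cad_surface(params):
    """Stand-in for a parametric CAD evaluation: maps design variables
    to flattened surface mesh coordinates (toy aerofoil-like shape)."""
    t, c = params                               # illustrative: twist, camber
    x = np.linspace(0.0, 1.0, 50)
    z = c * x * (1 - x) + t * x
    return np.column_stack([x, z]).ravel()

def geometric_sensitivities(params, h=1e-6):
    """dX/dp by central finite differences, one column per design variable."""
    p = np.asarray(params, dtype=float)
    cols = []
    for i in range(p.size):
        dp = np.zeros_like(p); dp[i] = h
        cols.append((cad_surface(p + dp) - cad_surface(p - dp)) / (2 * h))
    return np.column_stack(cols)                # (n_mesh_dof, n_params)

# Chain rule: dJ/dp = (dX/dp)^T dJ/dX, with dJ/dX a dummy adjoint surface
# sensitivity standing in for the flow solver's output.
params = np.array([0.05, 0.10])
dJ_dX = np.ones(cad_surface(params).size)
dJ_dp = geometric_sensitivities(params).T @ dJ_dX
print(dJ_dp)
```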

Relevance: 30.00%

Abstract:

Resource selection (or query routing) is an important step in P2P IR. Though analogous to document retrieval in the sense of choosing a relevant subset of resources, resource selection methods have evolved independently from those for document retrieval. Among the reasons for this divergence is that document retrieval targets scenarios where the underlying resources are semantically homogeneous, whereas peers may manage diverse content. We observe that clustering mitigates semantic heterogeneity at the resource selection layer of the clustered two-tier P2P IR architecture, and posit that this warrants a fresh look at the applicability of document retrieval methods for resource selection within such a framework. This paper empirically benchmarks document retrieval models against state-of-the-art resource selection models for the problem of resource selection in the clustered P2P IR architecture, using classical IR evaluation metrics. Our benchmarking study shows that document retrieval models significantly outperform the other methods for the task of resource selection in this architecture. This suggests that the clustered P2P IR framework can exploit advances in document retrieval to deliver corresponding improvements in resource selection, pointing to a potential convergence of the two fields within this architecture.
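To make the benchmarking idea concrete, the sketch below scores resources with a standard document retrieval model (Jelinek-Mercer smoothed query likelihood), treating each clustered resource as one large pseudo-document. The corpus, smoothing weight, and resource names are illustrative, not the paper's experimental setup.

```python
import math
from collections import Counter

# Each resource (peer cluster) is represented by the concatenation of its
# documents and scored like a single document.
resources = {
    "r1": "skiing biomechanics simulation muscle dynamics",
    "r2": "peer to peer retrieval routing query resource selection",
    "r3": "cardiac ecg heartbeat classification arrhythmia",
}

def score(query, text, collection, lam=0.5):
    """Jelinek-Mercer smoothed query likelihood of `text` for `query`."""
    tf, dlen = Counter(text.split()), len(text.split())
    ctf, clen = Counter(collection.split()), len(collection.split())
    s = 0.0
    for w in query.split():
        p_doc = tf[w] / dlen
        p_col = ctf[w] / clen or 1e-9          # avoid log(0) for unseen terms
        s += math.log(lam * p_doc + (1 - lam) * p_col)
    return s

collection = " ".join(resources.values())
query = "resource selection retrieval"
ranking = sorted(resources, reverse=True,
                 key=lambda r: score(query, resources[r], collection))
print(ranking)                                  # route the query to the top resources
```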

Relevance: 30.00%

Abstract:

This Licentiate Thesis presents and discusses new contributions in applied mathematics directed towards scientific computing in sports engineering. It considers inverse problems of biomechanical simulations with rigid-body musculoskeletal systems, especially in cross-country skiing. This contrasts with the bulk of research on cross-country skiing biomechanics, which relies on experimental testing alone. The thesis consists of an introduction and five papers. The introduction motivates the context of the papers and puts them into a more general framework. Two papers (D and E) model and simulate real questions in cross-country skiing. The results give some interesting indications concerning these challenging questions, which can be used as a basis for further research; however, the measurements are not accurate enough to give final answers. Paper C is a simulation study, more extensive than papers D and E, whose results are compared to electromyography measurements in the literature. Validation in biomechanical simulations is difficult, and reducing mathematical errors is one way of coming closer to realistic results. Paper A examines well-posedness for forward dynamics with full muscle dynamics, and paper B is a technical report describing the problem formulation, mathematical models, and simulations of paper A in more detail. Our new modelling, together with the simulations, opens new possibilities; as with simulations in other engineering fields, these must be handled with care in order to achieve reliable results. The results in this thesis indicate that mathematical modelling and numerical simulation can be very useful in describing cross-country skiing biomechanics. Hence, this thesis contributes to the introduction and development of such modelling and simulation techniques in this context.
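As a toy illustration of the inverse-dynamics problems mentioned above, the sketch below recovers the net joint torque of a single rigid segment from "measured" kinematics via Newton-Euler; the thesis's full musculoskeletal models and skiing-specific questions are of course far richer, and all parameter values here are invented.

```python
import numpy as np

# Inverse dynamics for one rigid segment rotating about a joint (e.g. a
# crudely simplified poling arm): given a measured angle trajectory,
# recover the net joint torque.
m, L, g = 3.0, 0.6, 9.81            # segment mass [kg], length [m]
I = m * L**2 / 3.0                  # inertia of a thin rod about the joint

t = np.linspace(0.0, 1.0, 200)
theta = 0.4 * np.sin(2 * np.pi * t)             # "measured" joint angle [rad]
omega = np.gradient(theta, t)                   # angular velocity
alpha = np.gradient(omega, t)                   # angular acceleration

# Newton-Euler about the joint: torque balances inertia plus the gravity
# moment acting at the segment's centre of mass.
tau = I * alpha + m * g * (L / 2) * np.sin(theta)
print(f"peak joint torque: {np.abs(tau).max():.2f} N m")
```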

Relevance: 30.00%

Abstract:

This dissertation covers two separate topics in statistical physics. The first part of the dissertation focuses on computational methods of obtaining the free energies (or partition functions) of crystalline solids. We describe a method to compute the Helmholtz free energy of a crystalline solid by direct evaluation of the partition function. In the many-dimensional conformation space of all possible arrangements of N particles inside a periodic box, the energy landscape consists of localized islands corresponding to different solid phases. Calculating the partition function for a specific phase involves integrating over the corresponding island. Introducing a natural order parameter that quantifies the net displacement of particles from lattice sites, we write the partition function in terms of a one-dimensional integral along the order parameter, and evaluate this integral using umbrella sampling. We validate the method by computing free energies of both face-centered cubic (FCC) and hexagonal close-packed (HCP) hard-sphere crystals with a precision of $10^{-5}k_BT$ per particle. In developing the numerical method, we find several scaling properties of crystalline solids in the thermodynamic limit. Using these scaling properties, we derive an explicit asymptotic formula for the free energy per particle in the thermodynamic limit. In addition, we describe several changes of coordinates that can be used to separate internal degrees of freedom from external, translational degrees of freedom. The second part of the dissertation focuses on engineering idealized physical devices that work as Maxwell's demons. We describe two autonomous mechanical devices that extract energy from a single heat bath and convert it into work while writing information onto memory registers. Both devices can also operate as Landauer's eraser: they can erase information from a memory register while energy is dissipated into the heat bath. The phase diagrams and efficiencies of the two models are solved and analyzed. These two models provide concrete physical illustrations of the thermodynamic consequences of information processing.
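A simplified, self-contained illustration of the umbrella-sampling idea described above, applied to a one-dimensional toy double-well rather than the many-dimensional crystal conformation space: each window biases the order parameter harmonically, histograms are unbiased, and neighbouring windows are stitched by matching their overlap. For this toy potential the true barrier is 1 kT, which the estimate should roughly recover.

```python
import numpy as np

kT, k_bias = 1.0, 50.0
U = lambda x: (x**2 - 1.0)**2                   # toy double-well landscape

def sample(center, n=20000, step=0.1, seed=0):
    """Metropolis sampling of U plus a harmonic umbrella bias at `center`."""
    rng = np.random.default_rng(seed)
    e = lambda x: U(x) + 0.5 * k_bias * (x - center)**2
    x, xs = center, np.empty(n)
    for i in range(n):
        xn = x + rng.normal(scale=step)
        if rng.random() < np.exp(min(0.0, -(e(xn) - e(x)) / kT)):
            x = xn
        xs[i] = x
    return xs

edges = np.linspace(-1.5, 1.5, 61)
mids = 0.5 * (edges[1:] + edges[:-1])
prev, curve = None, []
for c in np.linspace(-1.4, 1.4, 15):            # one window per bias centre
    hist, _ = np.histogram(sample(c), bins=edges, density=True)
    with np.errstate(divide="ignore"):
        F = -kT * np.log(hist) - 0.5 * k_bias * (mids - c)**2  # remove bias
    if prev is not None:                        # align additive constants
        ok = np.isfinite(F) & np.isfinite(prev)
        F += (prev - F)[ok].mean()
    curve.append(F)
    prev = F

allF = np.array(curve)
allF[~np.isfinite(allF)] = np.nan
profile = np.nanmean(allF, axis=0)              # stitched free-energy profile
barrier = profile[np.argmin(np.abs(mids))] - np.nanmin(profile)
print(f"estimated barrier height: {barrier:.2f} kT (exact: 1.00)")
```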

Relevance: 30.00%

Abstract:

Introduction: Prediction of soft tissue changes following orthognathic surgery has been frequently attempted in the past decades. It has gradually progressed from the classic “cut and paste” of photographs to computer-assisted 2D surgical prediction planning; finally, comprehensive 3D surgical planning was introduced to help surgeons and patients decide on the magnitude and direction of surgical movements, as well as the type of surgery to be considered for the correction of facial dysmorphology. A wealth of experience has been gained and a substantial published literature is available, which has augmented knowledge of facial soft tissue behaviour and improved the ability to closely simulate facial changes following orthognathic surgery, particularly since the introduction of three-dimensional imaging into medical research and clinical applications. Several approaches have been considered to mathematically predict soft tissue changes in three dimensions following orthognathic surgery. The most common are the finite element model and the mass tensor model, which have been developed into software packages currently used in clinical practice. In general, these methods produce an acceptable level of prediction accuracy of soft tissue changes following orthognathic surgery. Studies, however, have shown limited prediction accuracy at specific regions of the face, in particular the areas around the lips.
Aims: The aim of this project is to conduct a comprehensive assessment of hard and soft tissue changes following orthognathic surgery and to introduce a new method for prediction of facial soft tissue changes.
Methodology: The study was carried out on the pre- and post-operative CBCT images of 100 patients who received orthognathic surgery treatment at Glasgow Dental Hospital and School, Glasgow, UK. Three groups of patients were included in the analysis: patients who underwent Le Fort I maxillary advancement surgery, bilateral sagittal split mandibular advancement surgery, or bimaxillary advancement surgery. A generic facial mesh was used to standardise the information obtained from each patient’s facial image, and principal component analysis (PCA) was applied to interpolate the correlations between the skeletal surgical displacement and the resultant soft tissue changes. The identified relationship between hard tissue and soft tissue was then applied to a new set of preoperative 3D facial images, and the predicted results were compared to the actual surgical changes measured from the post-operative 3D facial images. A set of validation studies was conducted, including:
• Comparison between voxel-based registration and surface registration for analysing changes following orthognathic surgery. The results showed no statistically significant difference between the two methods. Voxel-based registration, however, proved more reliable, as it preserved the link between the soft tissue and skeletal structures of the face during the image registration process. Accordingly, voxel-based registration was the method of choice for superimposition of the pre- and post-operative images. The results of this study were published in a refereed journal.
• Direct DICOM slice landmarking, a novel technique to quantify the direction and magnitude of skeletal surgical movements. This method represents a new approach to quantifying maxillary and mandibular surgical displacement in three dimensions. The technique involves measuring the distance of corresponding landmarks, digitised directly on DICOM image slices, in relation to three-dimensional reference planes. The accuracy of the measurements was assessed against a set of “gold standard” measurements extracted from simulated model surgery; the results confirmed the accuracy of the method to within 0.34 mm, so the method was applied in this study. The results of this validation were published in a peer-refereed journal.
• The use of a generic mesh to assess soft tissue changes using stereophotogrammetry. The generic facial mesh played a major role in the soft tissue dense correspondence analysis. The conformed generic mesh represents the geometric information of the individual facial mesh onto which it is conformed (elastically deformed); the accuracy of the conformation is therefore essential to guarantee an accurate replica of the individual facial characteristics. The results showed an acceptable overall mean conformation error of 1 mm. The results of this study were accepted for publication in a peer-refereed scientific journal.
Skeletal tissue analysis was performed using the validated direct DICOM slice landmarking method, while soft tissue analysis was performed using dense correspondence analysis. The soft tissue analysis was novel and produced a comprehensive description of facial changes in response to orthognathic surgery; the results were accepted for publication in a refereed scientific journal. The main soft tissue changes associated with Le Fort I surgery were advancement at the midface region combined with widening of the paranasal, upper lip, and nostril regions; minor changes were noticed at the tip of the nose and the oral commissures. The main soft tissue changes associated with mandibular advancement surgery were advancement and downward displacement of the chin and lower lip regions, limited widening of the lower lip, and slight reversion of the lower lip vermilion, combined with minimal backward displacement of the upper lip; minimal changes were observed at the oral commissures. The main soft tissue changes associated with bimaxillary advancement surgery were generalised advancement of the middle and lower thirds of the face, combined with widening of the paranasal, upper lip, and nostril regions. In the Le Fort I cases, the correlation between the facial soft tissue changes and the skeletal surgical movements was assessed using PCA. Leave-one-out cross-validation was applied to the 30 cases that had undergone the Le Fort I osteotomy procedure to make effective use of the data for the prediction algorithm. The prediction error of soft tissue changes ranged from 0.0006 mm (±0.582) at the nose region to -0.0316 mm (±2.1996) at other facial regions.
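A minimal sketch of PCA-coupled prediction with leave-one-out cross-validation, in the spirit of the approach described above, assuming skeletal displacements and soft tissue mesh displacements have already been reduced to corresponding vectors per patient. The data here are synthetic and the component count is arbitrary; the conformed-mesh pipeline itself is not reproduced.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n_cases = 30                                # e.g. the Le Fort I group
X = rng.normal(size=(n_cases, 12))          # skeletal surgical displacements
Y = X @ rng.normal(size=(12, 300)) \
    + 0.1 * rng.normal(size=(n_cases, 300)) # soft tissue mesh displacements

# Principal component regression: PCA on the skeletal inputs, then a linear
# map to the soft tissue outputs, evaluated by leave-one-out CV.
model = make_pipeline(PCA(n_components=5), LinearRegression())
errors = []
for tr, te in LeaveOneOut().split(X):
    model.fit(X[tr], Y[tr])
    errors.append(np.abs(model.predict(X[te]) - Y[te]).mean())
print(f"mean LOO prediction error: {np.mean(errors):.4f} (arbitrary units)")
```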

Relevance: 30.00%

Abstract:

Aim of the study: To introduce and describe FlorNExT®, a free cloud computing application to estimate growth and yield of maritime pine (Pinus pinaster Ait.) even-aged stands in the Northeast of Portugal (NE Portugal). Area of study: NE Portugal. Material and methods: FlorNExT® implements a dynamic growth and yield modelling framework which integrates transition functions for dominant height (site index curves) and basal area, as well as output functions for tree and stand volume, biomass, and carbon content. Main results: FlorNExT® is freely available from any device with an Internet connection at: http://flornext.esa.ipb.pt/. Research highlights: This application has been designed to make it possible for any stakeholder to easily estimate standing volume, biomass, and carbon content in maritime pine stands from stand data, as well as to estimate growth and yield based on four stand variables: age, density, dominant height, and basal area. FlorNExT® allows planning thinning treatments. FlorNExT® is a fundamental tool to support forest mobilization at local and regional scales in NE Portugal.
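The abstract does not give FlorNExT®'s actual model equations, so the sketch below only illustrates the general idea of a dominant-height transition function, using a generic Chapman-Richards form in which the stand-specific exponent is recovered from one observed height-age pair. The function name and all parameter values are hypothetical.

```python
import math

def project_dominant_height(h1, t1, t2, asymptote=30.0, rate=0.05):
    """Project dominant height from age t1 to t2 with a Chapman-Richards
    curve H(t) = A * (1 - exp(-k*t))**b, solving for the stand's own b
    from the observed pair (h1, t1). Parameter values are illustrative."""
    b = math.log(h1 / asymptote) / math.log(1.0 - math.exp(-rate * t1))
    return asymptote * (1.0 - math.exp(-rate * t2)) ** b

# e.g. a maritime pine stand 12 m tall at age 25, projected to age 40
print(f"{project_dominant_height(12.0, 25, 40):.1f} m")
```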

Relevance: 30.00%

Abstract:

Virtual Screening (VS) methods can considerably aid clinical research by predicting how ligands interact with drug targets. However, the accuracy of most VS methods is constrained by limitations in the scoring function that describes biomolecular interactions, and even now these uncertainties are not completely understood. To improve the accuracy of the scoring functions used in most VS methods, we propose a novel hybrid approach in which neural networks (NNET) and support vector machines (SVM) are trained on databases of known active (drug) and inactive compounds; this information is then exploited to improve VS predictions.
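A minimal sketch of the hybrid rescoring idea, assuming simple numeric descriptors for actives and inactives. The descriptors, the `rescore` helper, the blending weight, and the network size are all illustrative, not the proposed method's actual configuration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Toy descriptors for known actives (1) and inactives (0); in practice these
# would come from a curated database of drugs and decoys.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(1.0, 1.0, (200, 16)),
               rng.normal(-1.0, 1.0, (200, 16))])
y = np.array([1] * 200 + [0] * 200)

nnet = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000,
                     random_state=0).fit(X, y)
svm = SVC(probability=True, random_state=0).fit(X, y)

def rescore(candidates, docking_scores, w=0.5):
    """Blend a conventional docking score with the learned activity
    probability (averaged over both models; weights are illustrative)."""
    p = 0.5 * (nnet.predict_proba(candidates)[:, 1]
               + svm.predict_proba(candidates)[:, 1])
    return w * docking_scores + (1 - w) * p

print(rescore(rng.normal(size=(3, 16)), np.array([0.7, 0.4, 0.9])))
```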

Relevance: 30.00%

Abstract:

In this article we consider the a posteriori error estimation and adaptive mesh refinement of discontinuous Galerkin finite element approximations of the hydrodynamic stability problem associated with the incompressible Navier-Stokes equations. Particular attention is given to the reliable error estimation of the eigenvalue problem in channel and pipe geometries. Here, computable a posteriori error bounds are derived by generalizing the standard Dual-Weighted-Residual approach, originally developed for the estimation of target functionals of the solution, to eigenvalue/stability problems. The underlying analysis consists of constructing both a dual eigenvalue problem and a dual problem for the original base solution. In this way, errors stemming from both the numerical approximation of the original nonlinear flow problem and the underlying linear eigenvalue problem are correctly controlled. Numerical experiments highlighting the practical performance of the proposed a posteriori error indicator on adaptively refined computational meshes are presented.
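The discrete flavour of the dual-weighted estimate can be illustrated on a plain matrix eigenproblem: the primal residual of an approximate eigenpair is weighted by the corresponding dual (left) eigenvector, which makes the first-order eigenvalue correction exact. This numpy toy is only an analogue of the DWR machinery, not the DG stability solver.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(50, 50))                   # toy nonsymmetric operator

lam, V = np.linalg.eig(A)
i = np.argmax(lam.real)                         # "least stable" mode
lam_exact, x = lam[i], V[:, i]

x_h = x + 1e-3 * rng.normal(size=50)            # inexact eigenvector
lam_h = (x_h @ A @ x_h) / (x_h @ x_h)           # bilinear Rayleigh quotient

mu, W = np.linalg.eig(A.T)                      # dual (left) eigenproblem
z = W[:, np.argmin(np.abs(mu - lam_exact))]     # dual weight for this mode

# Dual-weighted residual correction: eta recovers lam_exact - lam_h.
eta = (z @ (A @ x_h - lam_h * x_h)) / (z @ x_h)
print(abs(lam_exact - lam_h), "->", abs(lam_exact - (lam_h + eta)))
```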

Relevance: 30.00%

Abstract:

In this article we address the question of efficiently solving the algebraic linear system of equations arising from the discretization of a symmetric, elliptic boundary value problem using hp-version discontinuous Galerkin finite element methods. In particular, we introduce a class of domain decomposition preconditioners based on the Schwarz framework, and prove bounds on the condition number of the resulting iteration operators. Numerical results confirming the theoretical estimates are also presented.
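A toy analogue of the Schwarz idea on a 1D Poisson matrix: the preconditioner sums independent solves on overlapping index blocks and is handed to a Krylov solver. The hp-DG discretization, coarse spaces, and the paper's condition number analysis are not reproduced here; sizes and overlap are arbitrary.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, cg, splu

n, nsub, overlap = 200, 8, 4
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# Overlapping index blocks, each with a factorization of its local problem.
size = n // nsub
blocks = []
for i in range(nsub):
    idx = np.arange(max(0, i * size - overlap),
                    min(n, (i + 1) * size + overlap))
    blocks.append((idx, splu(A[idx, :][:, idx].tocsc())))

def schwarz(v):
    """One-level additive Schwarz: sum of local subdomain solves."""
    out = np.zeros_like(v)
    for idx, lu in blocks:
        out[idx] += lu.solve(v[idx])
    return out

x, info = cg(A, b, M=LinearOperator((n, n), matvec=schwarz))
print("converged" if info == 0 else f"info={info}",
      np.linalg.norm(A @ x - b))
```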

Relevance: 30.00%

Abstract:

Purpose: To evaluate the possible use of soft contact lenses (CL) to improve the secretagogue role of diadenosine tetraphosphate (Ap4A) in promoting tear secretion.
Methods: Two conventional hydrogel CL (Omafilcon A and Ocufilcon D) and two silicone hydrogel (SiH) CL (Comfilcon A and Balafilcon A) were used. Ap4A was loaded into the lenses by soaking them in a 1 mM Ap4A solution for 12 h. In vitro experiments were performed by placing the lenses for 2 h in multi-well plates containing 1 ml of ultrapure water; 100 μl aliquots were taken at time zero, every minute for the first 10 min, and then every 15 min. In vivo experiments were performed in New Zealand rabbits, and both the dinucleotide release from the SiH lenses and tear secretion were measured by means of Schirmer strips and high-pressure liquid chromatography (HPLC) analysis.
Results: In vitro, Ap4A release from the hydrogel CL showed a release time 50 (RT50) of 3.9 ± 0.2 min and 3.1 ± 0.1 min for the non-ionic and ionic CL, respectively. The SiH CL also released Ap4A, with RT50 values of 5.1 ± 0.1 min for the non-ionic and 2.7 ± 0.1 min for the ionic CL. In vivo experiments with the SiH CL showed RT50 values of 9.3 ± 0.2 min and 8.5 ± 0.2 min for the non-ionic and ionic lenses, respectively. Ap4A released from the non-ionic lens induced tear secretion above baseline levels for almost 360 min.
Conclusion: The delivery of Ap4A is slower and its effect lasts longer with non-ionic lenses than with ionic lenses.
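As an illustration of how an RT50 can be extracted from cumulative-release samples, the sketch below fits a first-order release model; the data points are invented for illustration, not the paper's measurements, and the paper does not state that this particular model was used.

```python
import numpy as np
from scipy.optimize import curve_fit

def release(t, frac_max, k):
    """First-order cumulative release toward a plateau frac_max."""
    return frac_max * (1.0 - np.exp(-k * t))

t = np.array([0, 1, 2, 3, 4, 5, 6, 8, 10, 15, 25], float)   # minutes
y = np.array([0, .18, .32, .44, .52, .60, .66, .75, .82, .92, .99])

(frac_max, k), _ = curve_fit(release, t, y, p0=(1.0, 0.2))
rt50 = np.log(2.0) / k                 # time to reach half of the plateau
print(f"RT50 ≈ {rt50:.1f} min")
```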

Relevance: 30.00%

Abstract:

Purpose: To evaluate the antibacterial and cytotoxic activities of the secondary metabolites of Lobophytum sp.
Methods: Maceration with methanol:chloroform (1:1) was used to extract the coral material. Chromatographic and spectroscopic techniques were employed for fractionation, isolation, and elucidation of the pure compounds. Antibacterial activity was assessed by the well diffusion method against three Gram-positive and four Gram-negative bacteria. The brine shrimp lethality test was employed to predict toxicity, while antitumor activity was tested by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) method against Ehrlich carcinoma cells.
Results: Four sesquiterpenes, one cembranoid-type diterpene, and two steroids were isolated. Compound 1 exhibited significant antibacterial activity against four of the tested bacteria (P. aeruginosa, S. aureus, S. epidermidis, and S. pneumoniae), with an MIC value of 15 μg/mL; moreover, it showed large inhibition zones, ranging from 16 to 18 mm in diameter, against the test bacteria. Compounds 4 and 5 displayed moderate antibacterial activity against all test bacteria, with inhibition zone diameters (IZD) ranging from 11 to 15 mm and MIC values of 30 μg/mL. Compounds 2, 3, 6, and 7 exhibited weak antibacterial activity (IZD 7 - 11 mm; MIC ≥ 30 μg/mL). In addition, only the diterpene (4) showed high toxicity against A. salina and antitumor activity against Ehrlich carcinoma cells, with LD50 values of 25 and 50 μg/mL, respectively.
Conclusion: This study reveals the strong antibacterial activity of the sesquiterpene alismol (1) and the potential antibacterial and antitumor activity of the cembranoid-type diterpene cembrene A (4).

Relevance: 30.00%

Abstract:

The idea of spacecraft formations, flying in tight configurations with maximum baselines of a few hundred meters in low-Earth orbits, has generated widespread interest over the last several years. Nevertheless, controlling the movement of spacecraft in formation poses difficulties, such as high in-orbit computing demands and the need for collision avoidance capabilities, which escalate as the number of units in the formation increases, as complicated nonlinear effects are imposed on the dynamics, and as uncertainty arises from imperfect knowledge of system parameters. These requirements have led to the need for reliable linear and nonlinear controllers in terms of relative and absolute dynamics. The objective of this thesis is, therefore, to introduce new control methods that allow spacecraft in formation, with circular or elliptical reference orbits, to efficiently execute safe autonomous manoeuvres. These controllers are distinguished from the bulk of the literature in that they merge guidance laws never before applied to spacecraft formation flying with collision avoidance capabilities in a single control strategy. For this purpose, three control schemes are presented: linear optimal regulation, linear optimal estimation, and adaptive nonlinear control. In general terms, the proposed control approaches command the dynamics of one or several followers with respect to a leader so as to asymptotically track a time-varying nominal trajectory (TVNT), while the threat of collision between followers is reduced by repelling accelerations obtained from a collision avoidance scheme (CAS) during the periods of closest proximity. Linear optimal regulation is achieved through a Riccati-based tracking controller. Within this control strategy, the controller provides guidance and tracking toward a desired TVNT, optimizing fuel consumption via the Riccati procedure with a finite-horizon cost function defined in terms of the desired TVNT, while repelling accelerations generated by the CAS ensure evasive actions between the elements of the formation. The relative dynamics model, suitable for circular and eccentric low-Earth reference orbits, is based on the Tschauner and Hempel equations, and includes a control input and a nonlinear term corresponding to the CAS repelling accelerations. Linear optimal estimation is built on the forward-in-time separation principle. This controller encompasses two stages: regulation and estimation. The first stage requires the design of a full state feedback controller using the state vector reconstructed by the estimator. The second stage requires the design of an additional dynamical system, the estimator, to obtain the states which cannot be measured, so that the full state vector can be approximately reconstructed. The separation principle then states that an observer built for a known input can also be used to estimate the state of the system and to generate the control input, allowing the observer and the feedback to be designed independently while exploiting the advantages of linear quadratic regulator theory, in order to estimate the states of a dynamical system with model and sensor uncertainty. The relative dynamics is described by the linear system used in the previous controller, with a control input and nonlinearities entering via the repelling accelerations from the CAS during collision avoidance events. Moreover, sensor uncertainty is added to the control process by considering carrier-phase differential GPS (CDGPS) velocity measurement error.
Finally, an adaptive control law capable of delivering superior closed-loop performance compared to certainty-equivalence (CE) adaptive controllers is presented. A novel noncertainty-equivalence controller based on the Immersion and Invariance paradigm is introduced for close-manoeuvring spacecraft formation flying in both circular and elliptical low-Earth reference orbits. The proposed control scheme achieves stabilization by immersing the plant dynamics into a target dynamical system (or manifold) that captures the desired dynamical behaviour. The key feature of this methodology is the addition of a new term to the classical certainty-equivalence control approach that, in conjunction with the parameter update law, is designed to achieve adaptive stabilization; this term has the ultimate task of shaping the manifold into which the adaptive system is immersed. The stability of the controller is proven via a Lyapunov-based analysis and Barbalat’s lemma. In order to evaluate the design of the controllers, test cases based on the physical and orbital features of the Prototype Research Instruments and Space Mission Technology Advancement (PRISMA) mission are implemented, extending the number of elements in the formation to scenarios with reconfigurations and on-orbit position switching in elliptical low-Earth reference orbits. An extensive analysis and comparison of the performance of the controllers in terms of total Δv and fuel consumption, with and without the effects of the CAS, is presented. These results show that the three proposed controllers allow the followers to asymptotically track the desired nominal trajectory, and that simulations including the CAS show an effective decrease in collision risk during the manoeuvre.
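A small sketch of the Riccati-based tracking idea in the circular-orbit limit, where the Tschauner-Hempel equations reduce to the Clohessy-Wiltshire model: an infinite-horizon LQR gain (standing in for the thesis's finite-horizon Riccati tracker) tracks the nominal trajectory, and a CAS repelling acceleration can be superposed. The mean motion, weights, and states are illustrative only.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

n = 0.0011                                   # mean motion [rad/s], ~LEO
# Clohessy-Wiltshire relative dynamics: state [x, y, z, vx, vy, vz].
A = np.array([[0, 0, 0, 1, 0, 0],
              [0, 0, 0, 0, 1, 0],
              [0, 0, 0, 0, 0, 1],
              [3 * n**2, 0, 0, 0, 2 * n, 0],
              [0, 0, 0, -2 * n, 0, 0],
              [0, 0, -n**2, 0, 0, 0]])
B = np.vstack([np.zeros((3, 3)), np.eye(3)])  # thrust acceleration input

Q = np.diag([1, 1, 1, 1e3, 1e3, 1e3])         # illustrative weights
R = 1e6 * np.eye(3)
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)               # u = -K (x - x_ref)

def control(x, x_ref, u_avoid=np.zeros(3)):
    """Track the nominal trajectory; u_avoid is a repelling acceleration
    from a collision-avoidance scheme, superposed when needed."""
    return -K @ (x - x_ref) + u_avoid

x = np.array([100.0, 0, 0, 0, -0.2, 0])       # relative state [m, m/s]
print(control(x, np.zeros(6)))
```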