981 results for Halley's and Euler-Chebyshev's Methods


Relevance: 100.00%

Publisher:

Abstract:

There is still a lack of an engineering approach for building Web systems, and the field of measuring the Web is not yet mature. In particular, there is an uncertainty in the selection of evaluation methods, and there are risks of standardizing inadequate evaluation practices. It is important to know whether we are evaluating the Web or specific website(s). We need a new categorization system, a different focus on evaluation methods, and an in-depth analysis that reveals the strengths and weaknesses of each method. As a contribution to the field of Web evaluation, this study proposes a novel approach to view and select evaluation methods based on the purpose and platforms of the evaluation. It has been shown that the choice of the appropriate evaluation method(s) depends greatly on the purpose of the evaluation.


Relevance: 100.00%

Abstract:

We review our work on generalisations of the Becker-Doring model of cluster formation as applied to nucleation theory, polymer growth kinetics, and the formation of supramolecular structures in colloidal chemistry. One valuable tool in analysing mathematical models of these systems has been the coarse-graining approximation, which enables macroscopic models for observable quantities to be derived from microscopic ones. This permits assumptions about the detailed molecular mechanisms to be tested, and their influence on the large-scale kinetics of surfactant self-assembly to be elucidated. We also summarise our more recent results on Becker-Doring systems, notably demonstrating that cross-inhibition and autocatalysis can destabilise a uniform solution and lead to a competitive environment in which some species flourish at the expense of others, phenomena relevant to models of the origins of life.
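The truncated Becker-Doring system can be sketched numerically as follows; the constant aggregation/fragmentation rates, truncation size, and explicit Euler step below are illustrative assumptions, not the size-dependent coefficients of the reviewed models:

```python
# Minimal sketch of a truncated Becker-Doring system with constant
# aggregation rate a and fragmentation rate b (illustrative assumption).
# c[i] is the concentration of clusters of size i+1.
def bd_step(c, a=1.0, b=1.0, dt=1e-3):
    N = len(c)
    # Fluxes J_j = a*c_1*c_j - b*c_{j+1} between sizes j and j+1
    J = [a * c[0] * c[i] - b * c[i + 1] for i in range(N - 1)]
    dc = [0.0] * N
    dc[0] = -J[0] - sum(J)          # monomers feed every aggregation step
    for i in range(1, N - 1):
        dc[i] = J[i - 1] - J[i]
    dc[-1] = J[-1]                  # largest (truncation) cluster
    return [x + dt * d for x, d in zip(c, dc)]

c = [1.0] + [0.0] * 9               # start from pure monomer
for _ in range(2000):
    c = bd_step(c)
mass = sum((i + 1) * x for i, x in enumerate(c))   # total mass, conserved
```

The conserved total mass Σ j c_j is preserved exactly by the flux form of the equations, which makes it a convenient sanity check for any discretisation or coarse-graining of the model.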

Relevance: 100.00%

Abstract:

The idea of spacecraft formations, flying in tight configurations with maximum baselines of a few hundred meters in low-Earth orbits, has generated widespread interest over the last several years. Nevertheless, controlling the movement of spacecraft in formation poses difficulties, such as high in-orbit computing demands and collision avoidance requirements, which escalate as the number of units in the formation increases, complicated nonlinear effects are imposed on the dynamics, and uncertainty arises from imperfect knowledge of system parameters. These requirements have led to the need for reliable linear and nonlinear controllers formulated in terms of relative and absolute dynamics. The objective of this thesis is, therefore, to introduce new control methods that allow spacecraft in formation, with circular or elliptical reference orbits, to execute safe autonomous manoeuvres efficiently. These controllers are distinguished from the bulk of the literature in that they merge guidance laws never before applied to spacecraft formation flying with collision avoidance capabilities in a single control strategy. For this purpose, three control schemes are presented: linear optimal regulation, linear optimal estimation, and adaptive nonlinear control. In general terms, the proposed control approaches command the dynamical performance of one or several followers with respect to a leader to asymptotically track a time-varying nominal trajectory (TVNT), while the threat of collision between the followers is reduced by repelling accelerations obtained from a collision avoidance scheme (CAS) during the periods of closest proximity. Linear optimal regulation is achieved through a Riccati-based tracking controller.
Within this control strategy, the controller provides guidance and tracking toward a desired TVNT, optimizing fuel consumption through a Riccati procedure with a finite-horizon cost function defined in terms of the desired TVNT, while repelling accelerations generated by the CAS ensure evasive actions between the elements of the formation. The relative dynamics model, suitable for circular and eccentric low-Earth reference orbits, is based on the Tschauner-Hempel equations, and includes a control input and a nonlinear term corresponding to the CAS repelling accelerations. Linear optimal estimation is built on the forward-in-time separation principle. This controller encompasses two stages: regulation and estimation. The first stage requires the design of a full state feedback controller using the state vector reconstructed by the estimator. The second stage requires the design of an additional dynamical system, the estimator, to obtain the states which cannot be measured, in order to approximately reconstruct the full state vector. The separation principle then states that an observer built for a known input can also be used to estimate the state of the system and to generate the control input. This allows the observer and the feedback to be designed independently, exploiting the advantages of linear quadratic regulator theory, in order to estimate the states of a dynamical system with model and sensor uncertainty. The relative dynamics are described with the linear system used in the previous controller, with a control input and nonlinearities entering via the repelling accelerations from the CAS during collision avoidance events. Moreover, sensor uncertainty is added to the control process by considering carrier-phase differential GPS (CDGPS) velocity measurement error. Finally, an adaptive control law capable of delivering superior closed-loop performance compared to certainty-equivalence (CE) adaptive controllers is presented.
A novel noncertainty-equivalence controller based on the Immersion and Invariance paradigm for close-manoeuvring spacecraft formation flying in both circular and elliptical low-Earth reference orbits is introduced. The proposed control scheme achieves stabilization by immersing the plant dynamics into a target dynamical system (or manifold) that captures the desired dynamical behaviour. The key feature of this methodology is the addition of a new term to the classical certainty-equivalence control approach that, in conjunction with the parameter update law, is designed to achieve adaptive stabilization. This term has the ultimate task of shaping the manifold into which the adaptive system is immersed. The stability of the controller is proven via a Lyapunov-based analysis and Barbalat's lemma. In order to evaluate the design of the controllers, test cases based on the physical and orbital features of the Prototype Research Instruments and Space Mission Technology Advancement (PRISMA) mission are implemented, extending the number of elements in the formation to scenarios with reconfigurations and on-orbit position switching in elliptical low-Earth reference orbits. An extensive analysis and comparison of the performance of the controllers in terms of total Δv and fuel consumption, with and without the effects of the CAS, is presented. These results show that the three proposed controllers allow the followers to asymptotically track the desired nominal trajectory; additionally, the simulations including the CAS show an effective decrease in collision risk during the manoeuvre.
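As an illustration of the Riccati-based regulation idea, the sketch below computes an LQR gain for the circular-orbit special case of the Tschauner-Hempel model, i.e. the Clohessy-Wiltshire equations; the mean motion, weights, discretization, and iteration count are assumptions chosen for the example, not the thesis' actual design:

```python
import numpy as np

# Clohessy-Wiltshire relative dynamics (circular-orbit limit of the
# Tschauner-Hempel equations); state [x y z vx vy vz], thrust input.
n = 0.0011  # assumed mean motion of a LEO reference orbit [rad/s]
A = np.array([[0, 0, 0, 1, 0, 0],
              [0, 0, 0, 0, 1, 0],
              [0, 0, 0, 0, 0, 1],
              [3 * n**2, 0, 0, 0, 2 * n, 0],
              [0, 0, 0, -2 * n, 0, 0],
              [0, 0, -n**2, 0, 0, 0]], dtype=float)
B = np.vstack([np.zeros((3, 3)), np.eye(3)])   # acceleration inputs

dt = 1.0
Ad, Bd = np.eye(6) + A * dt, B * dt            # first-order discretization

Q, R = np.eye(6), np.eye(3)                    # illustrative LQR weights
P = Q.copy()
for _ in range(500):                           # Riccati value iteration
    K = np.linalg.solve(R + Bd.T @ P @ Bd, Bd.T @ P @ Ad)
    P = Q + Ad.T @ P @ (Ad - Bd @ K)

rho = max(abs(np.linalg.eigvals(Ad - Bd @ K))) # closed-loop spectral radius
```

A spectral radius below one confirms the feedback u = -K x stabilizes the discretized relative motion; tracking a TVNT and adding CAS repelling accelerations would be layered on top of this regulator.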

Relevance: 100.00%

Abstract:

In this study, we carried out a comparative analysis between two classical methodologies to prospect residue contacts in proteins: the traditional cutoff-dependent (CD) approach and the cutoff-free Delaunay tessellation (DT). In addition, two alternative coarse-grained forms to represent residues were tested: using the alpha carbon (CA) and the side-chain geometric center (GC). A database was built comprising three top classes: all alpha, all beta, and alpha/beta. We found that the cutoff value at about 7.0 Å emerges as an important distance parameter. Up to 7.0 Å, CD and DT properties are unified, which implies that at this distance all contacts are complete and legitimate (not occluded). We also have shown that DT has an intrinsic missing-edges problem when mapping the first layer of neighbors. In proteins, it may produce systematic errors affecting mainly the contact network in beta chains with CA. The almost-Delaunay (AD) approach has been proposed to solve this DT problem. We found that even AD may not be an advantageous solution. As a consequence, in the strict range up to 7.0 Å, the CD approach proved to be a simpler, more complete, and more reliable technique than DT or AD. Finally, we have shown that coarse-grained residue representations may introduce bias in the analysis of neighbors at cutoffs up to 6.8 Å, with CA favoring alpha proteins and GC favoring beta proteins. This provides an additional argument for the value of 7.0 Å as an important lower-bound cutoff to be used in contact analysis of proteins.
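The cutoff-dependent (CD) contact definition can be sketched on a toy CA trace; the helix geometry below is an assumed idealization, and the Delaunay-tessellation (DT) alternative would instead take the edges of a tessellation (e.g. via scipy.spatial.Delaunay) rather than thresholding distances:

```python
import math

def cd_contacts(coords, cutoff=7.0):
    """Cutoff-dependent (CD) contacts: pairs of CA positions closer than
    `cutoff` angstroms (7.0 A is the bound highlighted in the study)."""
    contacts = set()
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            if math.dist(coords[i], coords[j]) < cutoff:
                contacts.add((i, j))
    return contacts

# Toy CA trace: an ideal alpha helix (assumed geometry: 1.5 A rise,
# 100 degrees of twist per residue, 2.3 A radius).
helix = []
for k in range(12):
    t = math.radians(100 * k)
    helix.append((2.3 * math.cos(t), 2.3 * math.sin(t), 1.5 * k))

contacts = cd_contacts(helix)
```

With this geometry every residue contacts its i+1 through i+4 neighbors (the helical hydrogen-bonding pattern) while i+5 and beyond fall outside the 7.0 Å shell, illustrating how the cutoff choice directly shapes the contact network.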

Relevance: 100.00%

Abstract:

Cultural heritage is constituted by complex and heterogeneous materials, such as paintings but also ancient remains. All ancient materials are exposed to the external environment, and this interaction produces different changes due to chemical, physical, and biological phenomena. The organic fraction, especially the proteinaceous one, has a crucial role in all these materials: in archaeology, proteins reveal human habits; in artworks, they disclose techniques and guide correct restoration. For these reasons, the development of methods that preserve the sample as much as possible and deepen our knowledge of deterioration processes is fundamental. The research activities presented in this PhD thesis have focused on the development of new immunochemical and spectroscopic approaches to detect and identify organic substances in artistic and archaeological samples. Organic components can be present in different cultural heritage materials as constituent elements (e.g., binders in paintings, collagen in bones), and their knowledge is fundamental for a complete understanding of past life, degradation processes, and appropriate restoration approaches. The combination of an immunological approach with chemiluminescence detection and Laser Ablation-Inductively Coupled Plasma-Mass Spectrometry allowed a sensitive and selective localization of collagen and elements in ancient bones and teeth. Near-infrared spectroscopy and hyperspectral imaging have been applied in combination with chemometric data analysis as non-destructive methods for prescreening bones for the localization of collagen. Moreover, an investigation of amino acids in enamel has been proposed, in order to clarify the survival of tooth biomolecules over time through the optimization and application of High-Performance Liquid Chromatography on modern and ancient enamel powder.
New portable biosensors were also developed for ovalbumin identification in paintings, combining biocompatible gellan gel with electro-immunochemical sensors to extract and identify painting binders through contact alone, first between the gel and the painting and then between the gel and the electrodes.

Relevance: 100.00%

Abstract:

The discovery of new materials and their functions has always been a fundamental component of technological progress. Nowadays, the quest for new materials is stronger than ever: sustainability, medicine, robotics, and electronics are all key fields which depend on the ability to create specifically tailored materials. However, designing materials with desired properties is a difficult task, and the complexity of the discipline makes it difficult to identify general criteria. While scientists have developed a set of best practices (often based on experience and expertise), this is still a trial-and-error process. It becomes even more complex when dealing with advanced functional materials: their properties depend on structural and morphological features, which in turn depend on fabrication procedures and environment, and subtle alterations lead to dramatically different results. Because of this, materials modeling and design is one of the most prolific research fields. Many techniques and instruments are continuously developed to enable new possibilities, both in the experimental and computational realms, and scientists strive to apply cutting-edge technologies in order to make progress. However, the field is strongly affected by unorganized file management and a proliferation of custom data formats and storage procedures, in both experimental and computational research. Results are difficult to find, interpret, and re-use, and a huge amount of time is spent interpreting and re-organizing data. This also strongly limits the application of data-driven and machine learning techniques. This work introduces possible solutions to the problems described above.
Specifically, it covers: developing features for specific classes of advanced materials and using them to train machine learning models that accelerate computational predictions for molecular compounds; developing methods for organizing non-homogeneous materials data; automating the use of device simulations to train machine learning models; and dealing with scattered experimental data and using them to discover new patterns.
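A hypothetical sketch of the first two points: heterogeneous records with missing fields are mapped onto a fixed feature vector, and a simple ridge-regression surrogate is fitted via the normal equations. The descriptor names, data values, and regularization here are invented for illustration only:

```python
import numpy as np

# Assumed descriptors; real featurization would be far richer.
FEATURES = ("bandgap_eV", "density_gcm3", "n_atoms")

def featurize(record):
    """Missing fields default to 0.0 so non-homogeneous records line up."""
    return [float(record.get(f, 0.0)) for f in FEATURES]

records = [  # toy data, invented for the sketch
    {"bandgap_eV": 1.1, "density_gcm3": 2.33, "n_atoms": 2, "y": 0.8},
    {"bandgap_eV": 0.7, "density_gcm3": 5.32, "n_atoms": 2, "y": 0.5},
    {"bandgap_eV": 3.4, "density_gcm3": 6.15, "n_atoms": 4, "y": 1.9},
    {"bandgap_eV": 2.3, "n_atoms": 3, "y": 1.4},      # density missing
]
X = np.array([featurize(r) for r in records])
y = np.array([r["y"] for r in records])

# Ridge regression via the regularized normal equations.
lam = 1e-3
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
pred = X @ w
```

Once records share a common feature space like this, the same pipeline can feed any downstream learner, which is the organizational point the thesis makes about non-homogeneous materials data.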

Relevance: 100.00%

Abstract:

OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) to detect plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. CONCLUSION: The Turesky index and images with FC with disclosing present good reliability and discriminatory power in quantifying dental plaque.

Relevance: 100.00%

Abstract:

Esophageal ulcer (EU) represents an important comorbidity in AIDS. We evaluated the prevalence of EU, the accuracy of the endoscopic and histologic methods used to investigate viral EU in HIV-positive Brazilian patients, and the numerical relevance of tissue sampling. A total of 399 HIV-positive patients underwent upper gastrointestinal (UGI) endoscopy. HIV-positive patients with EU determined by UGI endoscopy followed by biopsies were analyzed by the hematoxylin-eosin (HE) and immunohistochemical (IH) methods. EU was detected in 41 patients (mean age, 39.2 years; 23 males), with a prevalence of 10.27%. The median CD4 count was 49 cells/mm(3) (range, 1-361 cells/mm(3)) and the viral load was 58,869 copies per milliliter (range, 50-773,290 copies per milliliter). UGI endoscopy detected 29 of 41 EU suggestive of cytomegalovirus (CMV) infection and 7 of 41 indicating herpes simplex virus (HSV) infection. HE histology confirmed 4 of 29 ulcers induced by CMV, 2 of 7 induced by HSV, and 1 of 7 induced by HSV plus CMV. IH for CMV and HSV confirmed the HE findings and detected one additional CMV-induced case. UGI endoscopy showed 100% sensitivity and 15% specificity for the diagnosis of EU due to CMV or HSV compared to HE and IH. HE proved to be an adequate method for etiologic evaluation, with 87% sensitivity and 100% specificity compared to IH. The number of samples did not influence the etiologic evaluation. The data support the importance of IH as a complementary method to HE in the diagnosis of EU of viral etiology.

Relevance: 100.00%

Abstract:

Grass reference evapotranspiration (ETo) is an important agrometeorological parameter for climatological and hydrological studies, as well as for irrigation planning and management. There are several methods to estimate ETo, but their performance in different environments is diverse, since all of them have some empirical background. The FAO Penman-Monteith (FAO PM) method has been considered a universal standard for estimating ETo for more than a decade. This method considers many parameters related to the evapotranspiration process: net radiation (Rn), air temperature (T), vapor pressure deficit (Δe), and wind speed (U); and it has presented very good results when compared to data from lysimeters populated with short grass or alfalfa. In some conditions, the use of the FAO PM method is restricted by the lack of input variables. In these cases, when data are missing, the option is to calculate ETo by the FAO PM method using estimated input variables, as recommended by FAO Irrigation and Drainage Paper 56. Based on that, the objective of this study was to evaluate the performance of the FAO PM method to estimate ETo when Rn, Δe, and U data are missing, in Southern Ontario, Canada. Other alternative methods were also tested for the region: Priestley-Taylor, Hargreaves, and Thornthwaite. Data from 12 locations across Southern Ontario, Canada, were used to compare ETo estimated by the FAO PM method with a complete data set and with missing data. The alternative ETo equations were also tested and calibrated for each location. When relative humidity (RH) and U data were missing, the FAO PM method was still a very good option for estimating ETo for Southern Ontario, with RMSE smaller than 0.53 mm day^-1. For these cases, U data were replaced by the normal values for the region and Δe was estimated from temperature data.
The Priestley-Taylor method was also a good option for estimating ETo when U and Δe data were missing, mainly when calibrated locally (RMSE = 0.40 mm day^-1). When Rn was missing, the FAO PM method was not good enough for estimating ETo, with RMSE increasing to 0.79 mm day^-1. When only T data were available, the adjusted Hargreaves and modified Thornthwaite methods were better options for estimating ETo than the FAO PM method, since the RMSEs from these methods, respectively 0.79 and 0.83 mm day^-1, were significantly smaller than that obtained by FAO PM (RMSE = 1.12 mm day^-1). (C) 2009 Elsevier B.V. All rights reserved.
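The FAO-56 Penman-Monteith combination equation referenced above can be sketched as follows; a near-sea-level psychrometric constant, daily inputs, and zero soil heat flux are assumed for the example:

```python
import math

def fao_pm_eto(T, Rn, u2, RH, G=0.0, gamma=0.066):
    """FAO-56 Penman-Monteith reference evapotranspiration [mm day^-1].

    T: mean air temperature [C]; Rn: net radiation [MJ m-2 day-1];
    u2: wind speed at 2 m [m s-1]; RH: relative humidity [%];
    G: soil heat flux [MJ m-2 day-1]; gamma: psychrometric constant
    [kPa C-1] (a typical near-sea-level value is assumed here).
    """
    es = 0.6108 * math.exp(17.27 * T / (T + 237.3))  # saturation vapour pressure [kPa]
    ea = es * RH / 100.0                             # actual vapour pressure [kPa]
    delta = 4098.0 * es / (T + 237.3) ** 2           # slope of the es curve [kPa C-1]
    num = (0.408 * delta * (Rn - G)
           + gamma * (900.0 / (T + 273.0)) * u2 * (es - ea))
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# A plausible mid-latitude summer day (values invented for illustration)
eto = fao_pm_eto(T=20.0, Rn=15.0, u2=2.0, RH=60.0)
```

When U is missing, a regional normal wind speed can simply be passed as u2, and when humidity is missing, ea can be approximated from the daily minimum temperature, which is exactly the substitution strategy the study evaluates.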

Relevance: 100.00%

Abstract:

A number of techniques have been developed to study the disposition of drugs in the head and, in particular, the role of the blood-brain barrier (BBB) in drug uptake. The techniques can be divided into three groups: in-vitro, in-vivo, and in-situ. The most suitable method depends on the purpose and requirements of the particular study being conducted. In-vitro techniques involve the isolation of cerebral endothelial cells so that direct investigations of these cells can be carried out. The most recent preparations are able to maintain the structural and functional characteristics of the BBB by culturing endothelial cells together with astrocytic cells. The main advantages of the in-vitro methods are the elimination of anaesthetics and surgery. In-vivo methods consist of a diverse range of techniques and include the traditional Brain Uptake Index and indicator diffusion methods, as well as microdialysis and positron emission tomography. In-vivo methods maintain the cells and vasculature of an organ in their normal physiological states and anatomical position within the animal. However, the shortcomings include renal and hepatic elimination of solutes as well as the inability to control blood flow. In-situ techniques, including the perfused head, are more technically demanding; however, these models allow the composition and flow rate of the artificial perfusate to be varied. This review is intended as a guide for selecting the most appropriate method for studying drug uptake in the brain.

Relevance: 100.00%

Abstract:

BACKGROUND CONTEXT: The vertebral spine angle in the frontal plane is an important parameter in the assessment of scoliosis and may be obtained from panoramic X-ray images. Technological advances have allowed for an increased use of digital X-ray images in clinical practice. PURPOSE: In this context, the objective of this study is to assess the reliability of computer-assisted Cobb angle measurements taken from digital X-ray images. STUDY DESIGN/SETTING: Clinical investigation quantifying scoliotic deformity with the Cobb method to evaluate the intra- and interobserver variability using manual and digital techniques. PATIENT SAMPLE: Forty-nine patients diagnosed with idiopathic scoliosis were chosen based on convenience, without predilection for gender, age, type, location, or magnitude of the curvature. OUTCOME MEASURES: Images were examined to evaluate Cobb angle variability, end plate selection, as well as intra- and interobserver errors. METHODS: Specific software was developed to digitally reproduce the Cobb method and semiautomatically calculate the degree of scoliotic deformity. During the study, three observers estimated the Cobb angle using both the digital and the traditional manual methods. RESULTS: The results showed that Cobb angle measurements may be reproduced in the computer as reliably as with the traditional manual method, in similar conditions to those found in clinical practice. CONCLUSIONS: The computer-assisted method (digital method) is clinically advantageous and appropriate to assess the scoliotic curvature in the frontal plane using the Cobb method. (C) 2010 Elsevier Inc. All rights reserved.
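The geometric core of a computer-assisted Cobb measurement, the angle between the two selected endplate lines, can be sketched as follows; the two-point endplate interface is a hypothetical simplification of the study's semiautomatic software:

```python
import math

def cobb_angle(endplate_a, endplate_b):
    """Cobb angle [degrees] between two endplate lines, each given as
    two (x, y) points picked on a digital radiograph (hypothetical
    interface; the study's software selects endplates semiautomatically).
    """
    def slope_angle(p, q):
        return math.atan2(q[1] - p[1], q[0] - p[0])

    a = math.degrees(abs(slope_angle(*endplate_a) - slope_angle(*endplate_b)))
    a %= 180.0
    return min(a, 180.0 - a)   # acute angle between the two lines

# Upper endplate horizontal, lower endplate tilted by ~30 degrees
angle = cobb_angle(((0, 0), (10, 0)), ((0, 0), (10, 5.77)))
```

Because the computation reduces to two line slopes, observer variability in a digital workflow comes almost entirely from endplate selection, not from the angle arithmetic itself.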

Relevance: 100.00%

Abstract:

Simple, rapid, and stable sperm evaluation methods optimized for the common marmoset (Callithrix jacchus) are critical for studies involving the collection and evaluation of sperm in the field. This is particularly important for little-studied groups such as the family Callitrichidae. Of this family, C. jacchus is the best known and has been chosen as a model species for other members of the genus Callithrix. The fundamental evaluation parameters for sperm of any species are viability and acrosomal status. Semen samples were collected by penile vibratory stimulation. To evaluate sperm plasma membrane integrity, Eosin-Nigrosin staining was tested here on common marmoset sperm for use under field conditions. Further, a non-fluorescent acrosome stain, the "Simple" stain, developed for domestic and wild cats, was tested on common marmoset sperm. This was compared with a fluorescent stain, fluorescein isothiocyanate-Pisum sativum agglutinin (FITC-PSA), routinely used and validated for the common marmoset at the German Primate Centre to evaluate acrosomal integrity. Results obtained with the "Simple" stain showed a marked differentiation between sperm with intact and non-intact acrosomes, both with and without ionophore treatment, and correlated closely with results obtained with FITC-PSA. Temperature had no effect on the results with the "Simple" stain, and the complete processing is simple enough to be carried out under field conditions. These findings indicate that the "Simple" stain and Eosin-Nigrosin provide rapid and accurate results for C. jacchus sperm and that these methods can be reliably used as field tools for sperm evaluation in this species. (c) 2008 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

In this paper we discuss implicit Taylor methods for stiff Ito stochastic differential equations. Based on the relationship between Ito stochastic integrals and backward stochastic integrals, we introduce three implicit Taylor methods: the implicit Euler-Taylor method with strong order 0.5, the implicit Milstein-Taylor method with strong order 1.0 and the implicit Taylor method with strong order 1.5. The mean-square stability properties of the implicit Euler-Taylor and Milstein-Taylor methods are much better than those of the corresponding semi-implicit Euler and Milstein methods and these two implicit methods can be used to solve stochastic differential equations which are stiff in both the deterministic and the stochastic components. Numerical results are reported to show the convergence properties and the stability properties of these three implicit Taylor methods. The stability analysis and numerical results show that the implicit Euler-Taylor and Milstein-Taylor methods are very promising methods for stiff stochastic differential equations.
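To illustrate why implicitness matters for stiff SDEs, the sketch below contrasts an explicit Euler-Maruyama step with a drift-implicit one on a stiff linear test equation; note this shows only the drift-implicit flavour, whereas the paper's implicit Taylor methods also treat the stochastic integrals implicitly via backward Ito integrals:

```python
import random

random.seed(7)

# Stiff linear test SDE: dX = lam*X dt + mu*X dW, lam = -50 (stiff drift);
# parameters and step size are illustrative assumptions.
lam, mu = -50.0, 1.0
h, N = 0.1, 50            # step deliberately too large for the explicit scheme
x_exp = x_imp = 1.0
for _ in range(N):
    dW = random.gauss(0.0, h ** 0.5)
    # Explicit Euler-Maruyama step (unstable here: |1 + lam*h| = 4 > 1)
    x_exp = x_exp * (1.0 + lam * h + mu * dW)
    # Drift-implicit Euler step: solve x' = x + lam*h*x' + mu*x*dW for x'
    x_imp = (x_imp + mu * x_imp * dW) / (1.0 - lam * h)
```

With this step size the explicit iterates blow up while the drift-implicit ones decay, mirroring the mean-square stability advantage the paper establishes for its implicit Euler-Taylor and Milstein-Taylor methods.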