70 results for Polynomial-time algorithm


Relevance:

30.00%

Publisher:

Abstract:

A novel technique for selecting the poles of orthonormal basis functions (OBF) in Volterra models of any order is presented. It is well-known that the usual large number of parameters required to describe the Volterra kernels can be significantly reduced by representing each kernel using an appropriate basis of orthonormal functions. Such a representation results in the so-called OBF Volterra model, which has a Wiener structure consisting of a linear dynamic part generated by the orthonormal basis followed by a nonlinear static mapping given by the Volterra polynomial series. Aiming at optimizing the poles that fully parameterize the orthonormal bases, the exact gradients of the outputs of the orthonormal filters with respect to their poles are computed analytically by using a back-propagation-through-time technique. The expressions relative to the Kautz basis and to generalized orthonormal bases of functions (GOBF) are addressed; the ones related to the Laguerre basis follow straightforwardly as a particular case. The main innovation here is that the dynamic nature of the OBF filters is fully considered in the gradient computations. These gradients provide exact search directions for optimizing the poles of a given orthonormal basis. Such search directions can, in turn, be used as part of an optimization procedure to locate the minimum of a cost function that takes into account the error of estimation of the system output. The Levenberg-Marquardt algorithm is adopted here as the optimization procedure. Unlike previous related work, the proposed approach relies solely on input-output data measured from the system to be modeled, i.e., no information about the Volterra kernels is required. Examples are presented to illustrate the application of this approach to the modeling of dynamic systems, including a real magnetic levitation system with nonlinear oscillatory behavior.
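The recursive gradient idea can be illustrated in the simplest setting, a first-order (single-pole, Laguerre-type) filter. This is a minimal sketch and not the paper's Kautz/GOBF derivation: the pole gradient is obtained by differentiating the filter recursion itself, step by step, and checked against a finite difference.

```python
import math, random

def laguerre_output_and_grad(u, p):
    """First-order Laguerre-type filter y[k] = p*y[k-1] + sqrt(1-p^2)*u[k],
    with the exact gradient s[k] = dy[k]/dp propagated by differentiating
    the recursion (a minimal analogue of back-propagation-through-time)."""
    g = math.sqrt(1.0 - p * p)
    y, s = 0.0, 0.0
    ys, ss = [], []
    for uk in u:
        s = y + p * s - (p / g) * uk   # d/dp of the recursion, using y[k-1]
        y = p * y + g * uk             # the filter recursion itself
        ys.append(y); ss.append(s)
    return ys, ss
```

Comparing the recursive gradient with a central finite difference shows they agree, which is the point of computing the gradient exactly rather than numerically.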

Relevance:

30.00%

Publisher:

Abstract:

A numerical algorithm for fully dynamical lubrication problems based on the Elrod-Adams formulation of the Reynolds equation with mass-conserving boundary conditions is described. A simple but effective relaxation scheme is used to update the solution maintaining the complementarity conditions on the variables that represent the pressure and fluid fraction. The equations of motion are discretized in time using Newmark's scheme, and the dynamical variables are updated within the same relaxation process just mentioned. The good behavior of the proposed algorithm is illustrated in two examples: an oscillatory squeeze flow (for which the exact solution is available) and a dynamically loaded journal bearing. This article is accompanied by the ready-to-compile source code with the implementation of the proposed algorithm. [DOI: 10.1115/1.3142903]
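The relaxation-with-complementarity idea can be sketched on a generic 1D obstacle-type problem. This is an illustration of projected Gauss-Seidel relaxation, not the Elrod-Adams cavitation discretization itself: each sweep updates a variable and then projects it back onto the admissible set, so the complementarity conditions hold at convergence.

```python
def projected_gauss_seidel(f, n, iters=5000):
    """Solve the 1D obstacle-type complementarity problem
        u >= 0,  -u'' - f >= 0,  u * (-u'' - f) = 0  on (0,1), u(0)=u(1)=0,
    by Gauss-Seidel relaxation with projection onto u >= 0 (the same kind
    of clamped update used for pressure/fluid-fraction variables)."""
    h = 1.0 / (n + 1)
    u = [0.0] * (n + 2)          # includes the two boundary values
    for _ in range(iters):
        for i in range(1, n + 1):
            # unconstrained Gauss-Seidel value, projected onto u >= 0
            u[i] = max(0.0, 0.5 * (u[i - 1] + u[i + 1] + h * h * f(i * h)))
    return u
```

At convergence the clamp guarantees the sign condition on the residual wherever u is held at zero, and a vanishing residual wherever u is free, which is exactly the complementarity structure.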

Relevance:

30.00%

Publisher:

Abstract:

The amount of textual information digitally stored is growing every day. However, our capability of processing and analyzing that information is not growing at the same pace. To overcome this limitation, it is important to develop semiautomatic processes to extract relevant knowledge from textual information, such as the text mining process. One of the main and most expensive stages of the text mining process is the text pre-processing stage, where the unstructured text must be transformed into a structured format such as an attribute-value table. The stemming process, i.e., linguistic normalization, is usually used to find the attributes of this table. However, the stemming process is strongly dependent on the language in which the original textual information is given. Furthermore, for most languages, the stemming algorithms proposed in the literature are computationally expensive. In this work, several improvements of the well-known Porter stemming algorithm for the Portuguese language, which exploit the characteristics of this language, are proposed. Experimental results show that the proposed algorithm executes in far less time without affecting the quality of the generated stems.
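The basic operation a Porter-style stemmer repeats over several ordered steps is longest-suffix stripping against a rule table. The sketch below uses a tiny made-up rule table for Portuguese, not the rule set proposed in the paper, purely to show the mechanism:

```python
# Illustrative (suffix, replacement) rules -- NOT the paper's rule set.
# A suffix is stripped only if at least 2 characters of stem remain.
RULES = [
    ("ações", "aç"), ("ção", "ç"), ("mente", ""), ("amento", "am"),
    ("idade", ""), ("os", "o"), ("as", "a"), ("es", "e"), ("s", ""),
]

def stem(word, rules=RULES):
    """One-pass longest-matching-suffix stripping, the core step that a
    Porter-style stemmer applies repeatedly in ordered rule groups."""
    for suffix, repl in sorted(rules, key=lambda r: -len(r[0])):
        if word.endswith(suffix) and len(word) - len(suffix) >= 2:
            return word[: len(word) - len(suffix)] + repl
    return word
```

A full stemmer chains several such passes with language-specific conditions; the paper's speed-ups come from how those checks are organized for Portuguese.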

Relevance:

30.00%

Publisher:

Abstract:

In 2006 the Route load balancing algorithm was proposed and compared to other techniques aiming at optimizing process allocation in grid environments. This algorithm schedules tasks of parallel applications considering computer neighborhoods (where the distance is defined by the network latency). Route presents good results for large environments, although there are cases where the neighbors have neither enough computational capacity nor a communication system capable of serving the application. In those situations Route migrates tasks until they stabilize in a grid area with enough resources. This migration may take a long time, which reduces the overall performance. In order to improve such stabilization time, this paper proposes RouteGA (Route with Genetic Algorithm support), which considers historical information on parallel application behavior as well as the computer capacities and loads to optimize the scheduling. This information is extracted by using monitors and summarized in a knowledge base used to quantify the occupation of tasks. Afterwards, such information is used to parameterize a genetic algorithm responsible for optimizing the task allocation. Results confirm that RouteGA outperforms the load balancing carried out by the original Route, which had previously outperformed other scheduling algorithms from the literature.
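A genetic algorithm for task allocation can be sketched in a few lines: a chromosome is a task-to-machine vector and the fitness is the makespan. This is a hedged toy, since RouteGA's real fitness is parameterized by monitored history, capacities and loads; here plain per-task costs stand in for all of that.

```python
import random

def makespan(assign, cost, n_machines):
    """Completion time of the busiest machine for a task->machine map."""
    load = [0.0] * n_machines
    for task, m in enumerate(assign):
        load[m] += cost[task]
    return max(load)

def ga_allocate(cost, n_machines, pop_size=40, gens=60, seed=1):
    """Tiny GA: tournament selection, one-point crossover, point mutation,
    and elitism so the best allocation never degrades across generations."""
    rnd = random.Random(seed)
    n = len(cost)
    fit = lambda a: makespan(a, cost, n_machines)
    pop = [[rnd.randrange(n_machines) for _ in range(n)] for _ in range(pop_size)]
    best = min(pop, key=fit)
    for _ in range(gens):
        nxt = [best[:]]                                  # elitism
        while len(nxt) < pop_size:
            a = min(rnd.sample(pop, 3), key=fit)         # tournament
            b = min(rnd.sample(pop, 3), key=fit)
            cut = rnd.randrange(1, n)
            child = a[:cut] + b[cut:]                    # one-point crossover
            if rnd.random() < 0.3:
                child[rnd.randrange(n)] = rnd.randrange(n_machines)  # mutation
            nxt.append(child)
        pop = nxt
        best = min(pop + [best], key=fit)
    return best
```

On a small instance the GA quickly drives the makespan down toward the load-balance lower bound (total cost divided by the number of machines).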

Relevance:

30.00%

Publisher:

Abstract:

Purpose - The purpose of this paper is to develop a novel unstructured simulation approach for injection molding processes described by the Hele-Shaw model. Design/methodology/approach - The scheme involves dual dynamic meshes with active and inactive cells determined from an initial background pointset. The quasi-static pressure solution in each timestep for this evolving unstructured mesh system is approximated using a control volume finite element method formulation coupled to a corresponding modified volume of fluid method. The flow is considered to be isothermal and non-Newtonian. Findings - Supporting numerical tests and performance studies for polystyrene described by Carreau, Cross, Ellis and Power-law fluid models are conducted. Results for the present method are shown to be comparable to those from other methods for both Newtonian fluid and polystyrene fluid injected in different mold geometries. Research limitations/implications - With respect to the methodology, the background pointset infers a mesh that is dynamically reconstructed here, and there are a number of efficiency issues and improvements that would be relevant to industrial applications. For instance, one can use the pointset to construct special bases and invoke a so-called "meshless" scheme using the basis. This would require some interesting strategies to deal with the dynamic point enrichment of the moving front that could benefit from the present front treatment strategy. There are also issues related to mass conservation and fill-time errors that might be addressed by introducing suitable projections. The general question of "rate of convergence" of these schemes requires analysis. Numerical results here suggest first-order accuracy and are consistent with the approximations made, but theoretical results are not available yet for these methods.
Originality/value - This novel unstructured simulation approach involves dual meshes with active and inactive cells determined from an initial background pointset: local active dual patches are constructed "on-the-fly" for each "active point" to form a dynamic virtual mesh of active elements that evolves with the moving interface.
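The active/inactive-cell bookkeeping can be caricatured as a front that activates neighboring cells layer by layer. The breadth-first sketch below only illustrates the dynamic activation idea; it stands in for, and is much simpler than, the pressure-driven CVFEM/VOF advance of the actual scheme.

```python
from collections import deque

def fill_times(nx, ny, inlet, blocked=frozenset()):
    """Cartoon of moving-front bookkeeping on an nx-by-ny cell grid:
    cells switch from inactive to active as the front reaches them,
    advancing one cell layer per step from the injection inlet."""
    INF = float("inf")
    t = {(i, j): INF for i in range(nx) for j in range(ny)
         if (i, j) not in blocked}
    t[inlet] = 0
    q = deque([inlet])
    while q:
        i, j = q.popleft()
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if (ni, nj) in t and t[(ni, nj)] == INF:
                t[(ni, nj)] = t[(i, j)] + 1   # activate neighbor cell
                q.append((ni, nj))
    return t
```

Blocked cells model mold inserts: the front simply routes around them, which is the qualitative behavior the dynamic virtual mesh must reproduce.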

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the formulation of a combinatorial optimization problem with the following characteristics: (i) the search space is the power set of a finite set structured as a Boolean lattice; (ii) the cost function forms a U-shaped curve when applied to any lattice chain. This formulation applies to feature selection in the context of pattern recognition. The known approaches for this problem are branch-and-bound algorithms and heuristics that explore the search space partially. Branch-and-bound algorithms are equivalent to the full search, while heuristics are not. This paper presents a branch-and-bound algorithm that differs from the others known by exploring the lattice structure and the U-shaped chain curves of the search space. The main contribution of this paper is the architecture of this algorithm, which is based on the representation and exploration of the search space by new lattice properties proven here. Several experiments, with well-known public data, indicate the superiority of the proposed method over the sequential floating forward selection (SFFS), which is a popular heuristic that gives good results in very short computational time. In all experiments, the proposed method achieved better or equal results in similar or even smaller computational time. (C) 2009 Elsevier Ltd. All rights reserved.
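The pruning principle rests on the U-shaped cost along chains: walking a chain of nested feature subsets, the first cost increase proves that no deeper subset on that chain can contain the minimum. The sketch below shows this on a single chain (the actual algorithm organizes the whole Boolean lattice into such chains and prunes many at once):

```python
def chain_minimum(chain, cost):
    """Walk a chain of nested subsets whose cost is U-shaped along the
    chain; stop at the first cost increase, since the U-shape guarantees
    no later element of the chain can improve on the best seen."""
    best_set, best_cost = chain[0], cost(chain[0])
    explored = 1
    for s in chain[1:]:
        c = cost(s)
        explored += 1
        if c > best_cost:
            break                 # U-shape: the rest of the chain is pruned
        best_set, best_cost = s, c
    return best_set, best_cost, explored
```

Here `explored` counts cost evaluations, so the saving over evaluating the whole chain is visible directly.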

Relevance:

30.00%

Publisher:

Abstract:

One of the key issues in e-learning environments is the possibility of creating and evaluating exercises. However, the lack of tools supporting the authoring and automatic checking of exercises for specific topics (e.g., geometry) drastically reduces advantages in the use of e-learning environments on a larger scale, as usually happens in Brazil. This paper describes an algorithm, and a tool based on it, designed for the authoring and automatic checking of geometry exercises. The algorithm dynamically compares the distances between the geometric objects of the student's solution and the template solution, provided by the author of the exercise. Each solution is a geometric construction which is considered a function receiving geometric objects (input) and returning other geometric objects (output). Thus, for a given problem, if we know one function (construction) that solves the problem, we can compare it to any other function to check whether they are equivalent or not. Two functions are equivalent if, and only if, they have the same output when the same input is applied. If the student's solution is equivalent to the template solution, then we consider the student's solution correct. Our software utility provides both authoring and checking tools that work directly on the Internet, together with learning management systems. These tools are implemented using the dynamic geometry software iGeom, which has been used in a geometry course since 2004 and has a successful track record in the classroom. Empowered with these new features, iGeom simplifies teachers' tasks, solves non-trivial problems in student solutions and helps to increase student motivation by providing feedback in real time. (c) 2008 Elsevier Ltd. All rights reserved.
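The equivalence criterion (same output for the same input) can be sketched with two different constructions of the midpoint of a segment, compared on sampled inputs. This numeric-tolerance version is only an illustration; the comparison of geometric objects in iGeom is more involved than this.

```python
import math, random

def equivalent(f, g, sample_inputs, tol=1e-9):
    """Declare two constructions equivalent when they return the same
    output point for every sampled input (sampling stands in for the
    'same output for the same input' criterion)."""
    return all(math.dist(f(*xs), g(*xs)) <= tol for xs in sample_inputs)

# Two different constructions of the midpoint of segment AB.
def midpoint_direct(ax, ay, bx, by):
    return ((ax + bx) / 2, (ay + by) / 2)

def midpoint_via_translation(ax, ay, bx, by):
    # translate A by half of the vector from A to B
    return (ax + (bx - ax) / 2, ay + (by - ay) / 2)
```

A wrong construction (say, one that returns the endpoint A itself) fails the check on almost every sampled input, which is how an incorrect student solution is detected.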

Relevance:

30.00%

Publisher:

Abstract:

Given two strings A and B of lengths n_a and n_b, n_a <= n_b, respectively, the all-substrings longest common subsequence (ALCS) problem obtains, for every substring B' of B, the length of the longest string that is a subsequence of both A and B'. The ALCS problem has many applications, such as finding approximate tandem repeats in strings, solving the circular alignment of two strings and finding the alignment of one string with several others that have a common substring. We present an algorithm to prepare the basic data structure for ALCS queries that takes O(n_a n_b) time and O(n_a + n_b) space. After this preparation, it is possible to build a matrix of size O(n_b^2) that allows any LCS length to be retrieved in constant time. Some trade-offs between the space required and the querying time are discussed. To our knowledge, this is the first algorithm in the literature for the ALCS problem. (C) 2007 Elsevier B.V. All rights reserved.
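For reference, this is the single-query LCS dynamic program that the ALCS structure generalizes: one O(n_a n_b)-time pass answers one pair (A, B), whereas after the ALCS preprocessing any substring B' of B can be queried in constant time. The ALCS preprocessing itself is more intricate than this standard DP.

```python
def lcs_length(a, b):
    """Classic O(|a|*|b|)-time, O(|b|)-space dynamic program for the
    length of the longest common subsequence of strings a and b."""
    prev = [0] * (len(b) + 1)
    for ca in a:
        cur = [0]
        for j, cb in enumerate(b, 1):
            # extend a match, or carry the best of dropping ca or cb
            cur.append(prev[j - 1] + 1 if ca == cb else max(prev[j], cur[j - 1]))
        prev = cur
    return prev[len(b)]
```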

Relevance:

30.00%

Publisher:

Abstract:

The authors' recent classification of trilinear operations includes, among other cases, a fourth family of operations with parameter q ∈ ℚ ∪ {∞}, and weakly commutative and weakly anticommutative operations. These operations satisfy polynomial identities in degree 3 and further identities in degree 5. For each operation, using the row canonical form of the expansion matrix E to find the identities in degree 5 gives extremely complicated results. We use lattice basis reduction to simplify these identities: we compute the Hermite normal form H of the transpose E^t, obtain a basis of the nullspace lattice from the last rows of a matrix U for which UE^t = H, and then use the LLL algorithm to reduce the basis. (C) 2008 Elsevier Inc. All rights reserved.
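Lattice basis reduction in two dimensions (Lagrange-Gauss reduction) conveys what LLL does in general: repeatedly subtract the nearest integer multiple of the shorter basis vector from the longer one, swapping as needed, until the basis cannot be shortened further. This sketch is the 2D special case only, not LLL itself.

```python
def gauss_reduce(b1, b2):
    """Lagrange-Gauss reduction of a 2D integer lattice basis: the
    two-dimensional special case of the lattice basis reduction used
    to shorten nullspace-lattice bases."""
    dot = lambda u, v: u[0] * v[0] + u[1] * v[1]
    while True:
        if dot(b2, b2) < dot(b1, b1):
            b1, b2 = b2, b1                      # keep b1 the shorter vector
        mu = round(dot(b1, b2) / dot(b1, b1))    # nearest-integer projection
        if mu == 0:
            return b1, b2                        # size-reduced: done
        b2 = (b2[0] - mu * b1[0], b2[1] - mu * b1[1])
```

The returned basis spans the same lattice but with much shorter vectors, which is exactly why reduction turns "extremely complicated" identity coefficients into small ones.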

Relevance:

30.00%

Publisher:

Abstract:

We simplify the results of Bremner and Hentzel [J. Algebra 231 (2000) 387-405] on polynomial identities of degree 9 in two variables satisfied by the ternary cyclic sum [a, b, c] = abc + bca + cab in every totally associative ternary algebra. We also obtain new identities of degree 9 in three variables which do not follow from the identities in two variables. Our results depend on (i) the LLL algorithm for lattice basis reduction, and (ii) linearization operators in the group algebra of the symmetric group which permit efficient computation of the representation matrices for a non-linear identity. Our computational methods can be applied to polynomial identities for other algebraic structures.
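The ternary cyclic sum itself is easy to state in code. The sketch below, on 2x2 integer matrices (an associative algebra, hence a totally associative ternary algebra under the triple product), verifies its invariance under cyclic permutation of the arguments; the degree-9 identity computations of the paper go far beyond this.

```python
def matmul(x, y):
    """2x2 integer matrix product."""
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def cyc(a, b, c):
    """Ternary cyclic sum [a, b, c] = abc + bca + cab."""
    terms = (matmul(matmul(a, b), c),
             matmul(matmul(b, c), a),
             matmul(matmul(c, a), b))
    return [[sum(t[i][j] for t in terms) for j in range(2)] for i in range(2)]
```

Cyclic invariance, [a, b, c] = [b, c, a] = [c, a, b], follows because the three summands are permuted among themselves, and the identity check below confirms it on concrete matrices.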

Relevance:

20.00%

Publisher:

Abstract:

This study investigated the effects of the cement type and the water storage time on the push-out bond strength of a glass fiber post. Glass fiber posts (Fibrekor, Jeneric Pentron) were luted to post spaces using a self-cured resin cement (C&B Cement [CB]), a glass ionomer cement (Ketac Cem [KC]) or a resin-modified glass ionomer cement (GC FujiCEM [FC]) according to the manufacturers' instructions. For each luting agent, the specimens were exposed to one of the following water storage times (n=5): 1 day (T1), 7 days (T7), 90 days (T90) and 180 days (T180). Push-out tests were performed after the storage times. Control specimens were not exposed to water storage, but subjected to the push-out test 10 min after post cementation. Data (in MPa) were analyzed by Kruskal-Wallis and Dunn's tests (α=0.05). Cement type and water storage time had a significant effect (p<0.05) on the push-out bond strength. CB showed significantly higher retention values (p<0.05) than KC and FC, irrespective of the water storage time. Water storage significantly increased the push-out bond strength at T7 and T90, regardless of the cement type (p<0.05). The results showed that fiber posts luted to post spaces with the self-cured resin cement exhibited the best bonding performance throughout the 180-day water storage period. All cements exhibited a tendency toward increased bond strength after 7 and 90 days of water storage, decreasing thereafter.

Relevance:

20.00%

Publisher:

Abstract:

This in vitro study evaluated the cytotoxicity of an experimental restorative composite resin subjected to different light-curing regimens. METHODS: Forty round-shaped specimens were prepared and randomly assigned to four experimental groups (n=10), as follows: in Group 1, no light-curing; in Groups 2, 3 and 4, the composite resin specimens were light-cured for 20, 40 or 60 s, respectively. In Group 5, filter paper discs soaked in 5 µL PBS were used as negative controls. The resin specimens and paper discs were placed in wells of 24-well plates in which the odontoblast-like cells MDPC-23 (30,000 cells/cm²) were plated and incubated in a humidified incubator with 5% CO2 and 95% air at 37°C for 72 h. Cytotoxicity was evaluated by cell metabolism (MTT assay) and cell morphology (SEM). The data were analyzed statistically by Kruskal-Wallis and Mann-Whitney tests (p<0.05). RESULTS: In G1, cell metabolism decreased by 86.2%, indicating severe cytotoxicity of the non-light-cured composite resin. On the other hand, cell metabolism decreased by only 13.3% and 13.5% in G2 and G3, respectively. No cytotoxic effects were observed in G4 and G5. In G1, only a few round-shaped cells with short processes on their cytoplasmic membrane were observed. In the other experimental groups, as well as in the control group, a number of spindle-shaped cells with long cytoplasmic processes were found. CONCLUSION: Regardless of the photoactivation time used in the present investigation, the experimental composite resin presented mild to no toxic effects on the odontoblast-like MDPC-23 cells. However, intense cytotoxic effects occurred when no light-curing was performed.

Relevance:

20.00%

Publisher:

Abstract:

This study evaluated the influence of a cola-type soft drink and a soy-based orange juice on the surface and subsurface erosion of primary enamel, as a function of the exposure time. Seventy-five primary incisors were divided between microhardness testing (n=45) and scanning electron microscopy (SEM) analysis (n=30). The specimens were randomly assigned to 3 groups: 1 - artificial saliva (control); 2 - cola-type soft drink; and 3 - soy-based orange juice. Immersion cycles in the beverages were undertaken under agitation for 5 min, 3 times a day, during 60 days. Surface microhardness was measured at 7, 15, 30, 45 and 60 days. After 60 days, specimens were bisected and subsurface microhardness was measured at 30, 60, 90, 120, 150 and 200 µm from the exposed surface. Data were analyzed by ANOVA and Tukey's test (α=0.05). Groups 2 and 3 presented a similar decrease in surface microhardness. Regarding subsurface microhardness, group 2 presented the lowest values. SEM images revealed that after 60 days the exposed surfaces clearly exhibited structural loss, unlike those immersed in artificial saliva. It may be concluded that erosion of the surfaces exposed to the cola-type soft drink was more accentuated and directly proportional to the exposure time.

Relevance:

20.00%

Publisher:

Abstract:

This study evaluated the effect of a surface sealant on the translucency of composite resin immersed in different solutions. The study involved the materials Charisma and Fortify, and the solutions coffee, Coca-Cola®, tea and artificial saliva. Sixty-four specimens (n = 8) were manufactured and immersed in artificial saliva at 37 ± 1 °C. The samples were immersed in the solutions three times a day and re-immersed in artificial saliva until the translucency readings. The measurements were carried out at nine time points: T1, 24 hours after specimen preparation; T2, 24 hours after immersion in the solutions; T3, 48 hours; and T4 to T9, 7, 14, 21, 30, 60 and 90 days after immersion, respectively. The translucency values were measured using a JOUAN device. The results were subjected to ANOVA and Tukey's test at 5%. The surface sealant was not able to protect the composite resin against staining; coffee showed the strongest staining action, followed by tea; and, regarding immersion time, a significant alteration in the translucency of the composite resin was noted after 21 days.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to evaluate the quality of filling in main and lateral root canals performed with the McSpadden technique, regarding the time spent on the procedure and the type of gutta-percha employed. Fifty simulated root canals, made with six lateral canals placed two apiece in the cervical, middle and apical thirds of the root, were divided into 5 groups. Group A: McSpadden technique with conventional gutta-percha, performed with sufficient time for canal filling; Group B: McSpadden technique with conventional gutta-percha, performed in twice the mean time used in Group A; Group C: McSpadden technique with TP gutta-percha, performed with sufficient time for canal filling; Group D: McSpadden technique with TP gutta-percha, performed in twice the mean time used in Group C; Group E: lateral condensation technique. Images of the filled root canals were taken using a stereomicroscope and analyzed using the Leica QWIN Pro software for filling material flow, gutta-percha filling extension and sealer flow. Data were analyzed by analysis of variance (ANOVA) and Tukey test (p < 0.05). The best values of penetration in lateral canals in the middle third occurred in the groups where TP gutta-percha was used. However, in the apical third, group B showed the best values. Although a longer time of compactor use allows greater penetration of the filling material into the lateral canals, the presence of voids resulted in bad quality radiographic images, suggesting porosity. The best quality of filling material was observed in Group A (McSpadden technique with conventional Gutta-Percha, performed with sufficient time for root canal filling).