947 results for "Bio-inspired optimization techniques"
Abstract:
In tissue engineering of cartilage, polymeric scaffolds are implanted in the damaged tissue and subjected to repeated compression loading cycles. The possibility of failure due to mechanical fatigue has not been properly addressed in these scaffolds. Nevertheless, the macroporous scaffold is susceptible to failure after repeated loading-unloading cycles. This is related to inherent discontinuities in the material due to the micropore structure of the macro-pore walls, which act as stress concentration points. In this work, chondrogenic precursor cells were seeded in Poly-ε-caprolactone (PCL) scaffolds with fibrin, and some were submitted to free swelling culture while others underwent cyclic loading in a bioreactor. After cell culture, all samples were analyzed for fatigue behavior under repeated loading-unloading cycles, and some components of the extracellular matrix (ECM) were identified. No differences were observed between samples undergoing free swelling or bioreactor loading conditions, neither with respect to matrix components nor to fatigue performance. The ECM did not achieve the desired preponderance of collagen type II over collagen type I, which is considered the main characteristic of hyaline cartilage ECM. However, fatigue-life prediction in PCL-with-ECM constructs was possible up to 600 cycles, an enhanced performance when compared to previous works. PCL after cell culture presents improved fatigue resistance, even though the elastic modulus measured at the first cycle was similar to that of PCL with poly(vinyl alcohol) samples. This finding suggests that fatigue analysis of tissue engineering constructs can provide additional information missed by traditional mechanical measurements.
Abstract:
Polymeric scaffolds used in regenerative therapies are implanted in the damaged tissue and subjected to repeated loading cycles. In the case of articular cartilage engineering, an implanted scaffold is typically subjected to long-term dynamic compression. The evolution of the mechanical properties of the scaffold during bioresorption has been studied in depth in the past, but the possibility of failure due to mechanical fatigue has not been properly addressed. Nevertheless, the macroporous scaffold is susceptible to failure after repeated loading-unloading cycles. In this work, fatigue studies of polycaprolactone scaffolds were carried out by subjecting the scaffold to repeated compression cycles under conditions simulating a scaffold implanted in articular cartilage. The behaviour of the polycaprolactone sponge with the pores filled with a poly(vinyl alcohol) gel, simulating newly formed tissue within the pores, was compared with that of the material immersed in water. Results were analyzed with Morrow's criteria for failure, and accurate fits are obtained only up to 200 loading cycles. It is also shown that the presence of poly(vinyl alcohol) increases the elastic modulus of the scaffolds, the effect being more pronounced as the number of freeze/thawing cycles increases.
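To illustrate the kind of fitting such fatigue studies involve, here is a generic sketch: assuming (hypothetically) that the compressive modulus decays as a power law of the cycle number, E(N) = E1·N^(−b), the parameters can be estimated by linear regression in log-log space. This is an illustrative stand-in, not the exact Morrow-type criterion used in the paper, and the data below are synthetic.

```python
import numpy as np

def fit_power_law_fatigue(cycles, modulus):
    """Fit E(N) = E1 * N**(-b) by least squares in log-log coordinates."""
    slope, intercept = np.polyfit(np.log(cycles), np.log(modulus), 1)
    return np.exp(intercept), -slope  # (E1, b)

# synthetic data standing in for loading-unloading measurements
N = np.array([1, 10, 50, 100, 200])
E = 10.0 * N ** -0.05          # hypothetical modulus decay, MPa
E1, b = fit_power_law_fatigue(N, E)
```

The fitted curve can then be extrapolated only as far as the fit remains accurate, which is exactly the "up to 200 cycles" limit the abstract reports for its own criterion.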
Abstract:
Electrospun poly(vinylidene fluoride) (PVDF) fiber mats find applications in an increasing number of areas, such as battery separators and filtration and detection membranes, due to their excellent properties. However, there are limitations due to the hydrophobic nature and low surface energy of PVDF. In this work, oxygen plasma treatment has been applied in order to modify the surface wettability of PVDF fiber mats, and superhydrophilic PVDF electrospun membranes have been obtained. Further, plasma treatment does not significantly influence average fiber size (~400 ± 200 nm), morphology, electroactive β-phase content (~80-85%) or the degree of crystallinity (Xc of 42 ± 2%), preserving the excellent physical-chemical characteristics of PVDF. Plasma treatment mainly induces surface chemistry modifications, such as the introduction of oxygen and the release of fluorine atoms, that significantly change polymer membrane wettability through a reduction of the contact angle of the polymer fibers and an overall decrease of the surface tension of the membranes.
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Bacterial cellulose (BC) films from two distinct sources (obtained by static culture with Gluconacetobacter xylinus ATCC 53582 (BC1) and from a commercial source (BC2)) were modified by bovine lactoferrin (bLF) adsorption. The functionalized films (BC+bLF) were assessed as edible antimicrobial packaging, for use in direct contact with highly perishable foods, specifically fresh sausage as a model of meat products. BC+bLF films and sausage casings were characterized regarding their water vapour permeability (WVP), mechanical properties, and bactericidal efficiency against two food pathogens, Escherichia coli and Staphylococcus aureus. Considering their edibility, an in vitro gastrointestinal tract model was used to study the changes occurring in the BC films during passage through the gastrointestinal tract. Moreover, the cytotoxicity of the BC films against 3T3 mouse embryo fibroblasts was evaluated. BC1 and BC2 showed equivalent density, WVP and maximum tensile strength. The bactericidal efficiency of BC1 and BC2 with adsorbed bLF (BC1+bLF and BC2+bLF, respectively), in the standalone films and in inoculated fresh sausages, was similar against E. coli (mean reduction of 69 % in the films per se versus 94 % in the sausages) and S. aureus (mean reduction of 97 % in the films per se versus 36 % in the sausages). Moreover, the BC1+bLF and BC2+bLF films significantly hindered the specific growth rate of both bacteria. Finally, no relevant cytotoxicity against 3T3 fibroblasts was found for the films before and after the simulated digestion. BC films with adsorbed bLF may constitute an approach to the development of bio-based edible antimicrobial packaging systems.
Abstract:
Hybrid satellite-terrestrial networks offer connectivity to remote and isolated areas and make it possible to solve numerous communication problems. However, they present several challenges, since communication takes place over a terrestrial mobile channel and a contiguous satellite channel. One of these challenges is to find mechanisms to perform routing and flow control jointly and efficiently. The objective of this project is to simulate and study existing algorithms that solve these problems, as well as to propose new ones, by means of various convex optimization techniques. Based on the simulations carried out in this study, the various routing and flow-control problems have been analyzed extensively, and the results obtained and the performance of the algorithms employed have been evaluated. In particular, algorithms based on the dual decomposition method, the subgradient method, Newton's method and the logarithmic barrier method, among others, have been successfully implemented to solve the routing and flow-control problems posed.
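The dual-decomposition/subgradient approach mentioned above can be sketched on a toy network utility maximization problem: maximize the sum of log utilities of the flows subject to link capacities, with each flow reacting to link "prices" updated by a projected subgradient step. The two-flow, one-link network, step size and utilities below are illustrative assumptions, not the project's actual setup.

```python
import numpy as np

# Toy network utility maximization: maximize sum(log(x_f)) s.t. R @ x <= c,
# solved by dual decomposition with a projected subgradient price update.
R = np.array([[1.0, 1.0]])     # routing matrix: two flows share one link
c = np.array([1.0])            # link capacity
lam = np.ones(1)               # link price (dual variable)
step = 0.1
for _ in range(2000):
    q = R.T @ lam                        # price along each flow's route
    x = 1.0 / np.maximum(q, 1e-9)        # per-flow optimum for log utility
    lam = np.maximum(0.0, lam + step * (R @ x - c))  # subgradient ascent
# at the optimum each flow receives half the shared capacity
```

The appeal of the decomposition is that the price update uses only local link load, while each flow's rate update uses only the total price along its own route.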
Abstract:
In this paper we propose a metaheuristic to solve a new version of the Maximum Capture Problem. In the original MCP, market capture is obtained by lower traveling distances or lower traveling times; in this new version, not only the traveling time but also the waiting time affects the market share. This problem is hard to solve using standard optimization techniques. Metaheuristics are shown to offer accurate results within acceptable computing times.
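A minimal sketch of a metaheuristic for a capture problem of this kind is a swap-based local search: a customer is captured when the best of our facilities beats the competitor on total (travel + waiting) time, and we improve a starting set of sites by single-facility swaps. The instance, the fixed waiting times, and the competitor model below are invented for illustration and are simpler than the paper's setting.

```python
import itertools
import numpy as np

def captured(sites, travel, wait, comp_time, demand):
    """Demand captured: customer i is won when the best of our sites
    beats the competitor on total (travel + waiting) time."""
    best = np.min(travel[:, sites] + wait[sites], axis=1)
    return demand[best < comp_time].sum()

def swap_local_search(start, travel, wait, comp_time, demand):
    """First-improvement local search over single-facility swaps."""
    sites, n = list(start), travel.shape[1]
    improved = True
    while improved:
        improved = False
        for out, new in itertools.product(list(sites), range(n)):
            if new in sites:
                continue
            cand = [s for s in sites if s != out] + [new]
            if captured(cand, travel, wait, comp_time, demand) > \
                    captured(sites, travel, wait, comp_time, demand):
                sites, improved = cand, True
                break
    return sorted(sites)

travel = np.array([[1.0, 2.0, 3.0],
                   [2.0, 1.0, 3.0],
                   [3.0, 2.0, 1.0]])     # customers x candidate sites
wait = np.array([0.5, 0.2, 0.4])          # waiting time at each site
comp_time = np.array([2.0, 2.0, 2.0])     # competitor's total time
demand = np.array([1.0, 2.0, 3.0])
best = swap_local_search([0, 1], travel, wait, comp_time, demand)
```

Real metaheuristics for the MCP add diversification (restarts, perturbation, tabu memory) on top of such a neighborhood, but the capture evaluation with waiting time is the common core.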
Abstract:
The choice network revenue management (RM) model incorporates customer purchase behavior as customers purchasing products with certain probabilities that are a function of the offered assortment of products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The underlying stochastic dynamic program is intractable, and even its certainty-equivalence approximation, in the form of a linear program called the Choice Deterministic Linear Program (CDLP), is difficult to solve in most cases. The separation problem for CDLP is NP-complete for MNL with just two segments when their consideration sets overlap; the affine approximation of the dynamic program is NP-complete for even a single-segment MNL. This is in contrast to the independent-class (perfect-segmentation) case, where even the piecewise-linear approximation has been shown to be tractable. In this paper we investigate the piecewise-linear approximation for network RM under a general discrete-choice model of demand. We show that the gap between the CDLP and the piecewise-linear bounds is within a factor of at most 2. We then show that the piecewise-linear approximation is solvable in polynomial time for a fixed consideration set size, bringing it into the realm of tractability for small consideration sets; small consideration sets are a reasonable modeling tradeoff in many practical applications. Our solution relies on showing that for any discrete-choice model the separation problem for the linear program of the piecewise-linear approximation can be solved exactly by a Lagrangian relaxation. We give modeling extensions and show by numerical experiments the improvements from using piecewise-linear approximation functions.
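The discrete-choice ingredient behind these complexity results can be made concrete with the standard multinomial logit (MNL) model: given preference weights v_j and an offered assortment S, product j is purchased with probability v_j / (1 + Σ_{k∈S} v_k), the remaining mass being no-purchase. The weights below are arbitrary illustrative values.

```python
import numpy as np

def mnl_probs(v, S):
    """MNL choice probabilities for an offered assortment S:
    P_j(S) = v_j / (1 + sum_{k in S} v_k); the rest is no-purchase mass."""
    denom = 1.0 + sum(v[j] for j in S)
    return {j: v[j] / denom for j in S}

v = np.array([1.0, 2.0, 1.0])        # arbitrary preference weights
probs = mnl_probs(v, [0, 1])         # offer products 0 and 1
# no-purchase probability is 1 / (1 + 1 + 2) = 0.25
```

The hardness results in the abstract concern optimizing over assortments S under mixtures of such segments, not evaluating the probabilities themselves, which is trivial as shown here.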
Abstract:
Geophysical techniques can help to bridge the inherent gap in spatial resolution and range of coverage that plagues classical hydrological methods. This has led to the emergence of the new and rapidly growing field of hydrogeophysics. Given the differing sensitivities of various geophysical techniques to hydrologically relevant parameters, and their inherent trade-off between resolution and range, the fundamental usefulness of multi-method hydrogeophysical surveys for reducing uncertainties in data analysis and interpretation is widely accepted. A major challenge arising from such endeavors is the quantitative integration of the resulting vast and diverse database in order to obtain a unified model of the probed subsurface region that is internally consistent with all available data. To address this problem, we have developed a strategy for hydrogeophysical data integration based on Monte-Carlo-type conditional stochastic simulation that we consider particularly suitable for local-scale studies characterized by high-resolution, high-quality datasets. Monte-Carlo-based optimization techniques are flexible and versatile, can accommodate a wide variety of data and constraints of differing resolution and hardness, and thus have the potential to provide, in a geostatistical sense, highly detailed and realistic models of the pertinent target parameter distributions. Compared to more conventional approaches of this kind, our approach provides significant advancements in the way that the larger-scale deterministic information resolved by the hydrogeophysical data is accounted for, which represents an inherently problematic, and as yet unresolved, aspect of Monte-Carlo-type conditional simulation techniques. We present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on pertinent synthetic data and then applied to corresponding field data collected at the Boise Hydrogeophysical Research Site near Boise, Idaho, USA.
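The core idea of conditional stochastic simulation can be sketched in one dimension: visit grid points in random order, draw each value from its kriging-based Gaussian conditional given the data and previously simulated points, and treat the draw as data thereafter. The exponential covariance, the grid, and the single conditioning "porosity log" value below are invented; the paper's actual algorithm is considerably more elaborate.

```python
import numpy as np

def cond_gaussian_sim(grid, obs_x, obs_v, corr_len, rng):
    """Sequential Gaussian simulation on a 1-D grid: visit points in random
    order, draw each from its simple-kriging conditional, then treat the
    drawn value as data for subsequent points (exponential covariance)."""
    cov = lambda a, b: np.exp(-np.abs(a[:, None] - b[None, :]) / corr_len)
    known_x, known_v = list(obs_x), list(obs_v)
    out = {}
    for xi in rng.permutation(grid):
        kx, kv = np.array(known_x), np.array(known_v)
        K = cov(kx, kx) + 1e-9 * np.eye(len(kx))   # tiny nugget for stability
        k = cov(np.array([xi]), kx)[0]
        w = np.linalg.solve(K, k)                   # simple-kriging weights
        mu, var = w @ kv, max(1.0 - w @ k, 0.0)
        out[xi] = mu + np.sqrt(var) * rng.standard_normal()
        known_x.append(xi)
        known_v.append(out[xi])
    return np.array([out[xi] for xi in grid])

grid = np.linspace(-1.0, 1.0, 11)
rng = np.random.default_rng(0)
sim = cond_gaussian_sim(grid, [0.0], [0.3], corr_len=0.5, rng=rng)
# the realization honors the conditioning value at x = 0
```

Because the kriging system interpolates the data exactly, every realization reproduces the conditioning values while remaining stochastic elsewhere, which is the property that makes such realizations "internally consistent with all available data."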
Abstract:
In this work we present a simulation of a recognition process using perimeter characterization of simple plant leaves as the sole discriminating parameter. Data coding that allows independence of leaf size and orientation may penalize recognition performance for some varieties. Border description sequences are then used to characterize the leaves. Independent Component Analysis (ICA) is applied in order to determine the best number of components to consider for the classification task, implemented by means of an Artificial Neural Network (ANN). The results obtained with ICA as a pre-processing tool are satisfactory and, compared with some references, our system improves the recognition success rate up to 80.8%, depending on the number of independent components considered.
Abstract:
In this paper, we present a comprehensive study of different Independent Component Analysis (ICA) algorithms for the calculation of coherency and sharpness of electroencephalogram (EEG) signals, in order to investigate the possibility of early detection of Alzheimer's disease (AD). We found that ICA algorithms can help in artifact rejection and noise reduction, improving the discriminative power of features in high-frequency bands (especially in the high-alpha and beta ranges). In addition to comparing different ICA algorithms, the optimum number of selected components is investigated, in order to inform decision processes in future work.
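As a compact illustration of what the ICA algorithms compared here do, the sketch below implements symmetric FastICA (tanh nonlinearity) in plain NumPy and unmixes two synthetic "sources" from their linear mixture; it stands in for the off-the-shelf ICA implementations the paper evaluates, and the signals are toy stand-ins for EEG channels.

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA (tanh nonlinearity): whiten the mixed
    signals, then iterate fixed-point updates with symmetric decorrelation."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    Xw = E @ np.diag(d ** -0.5) @ E.T @ X          # whitening
    W = rng.standard_normal((X.shape[0], X.shape[0]))
    for _ in range(n_iter):
        G = np.tanh(W @ Xw)
        W = (G @ Xw.T) / Xw.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt                                  # symmetric decorrelation
    return W @ Xw

t = np.linspace(0, 8, 2000)
S = np.vstack([np.sin(2 * t), np.sign(np.sin(3 * t))])   # toy "sources"
X = np.array([[1.0, 0.5], [0.5, 1.0]]) @ S               # observed mixtures
Y = fastica(X)
```

The recovered components match the sources up to the usual ICA ambiguities of sign, scale and ordering, which is why artifact components (e.g. eye blinks) can be identified and removed before computing features such as coherency.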
Abstract:
In this paper we present a quantitative comparison of different independent component analysis (ICA) algorithms in order to investigate their potential use in preprocessing (such as noise reduction and feature extraction) electroencephalogram (EEG) data for early detection of Alzheimer's disease (AD), or for discrimination between AD (or mild cognitive impairment, MCI) patients and age-matched control subjects.
Abstract:
Using combined emotional stimuli, pairing photos of faces with voice recordings, we investigated the neural dynamics of emotional judgment using scalp EEG recordings. Stimuli could be combined in either a congruent or a non-congruent way. As much evidence points to the major role of alpha in emotional processing, the alpha band was selected for analysis. Analysis was performed by computing the synchronization of the EEGs, and the congruent vs. non-congruent conditions were compared using statistical tools. The results obtained demonstrate that scalp EEG could be used as a tool to investigate the neural dynamics of emotional valence and to discriminate various emotions (angry, happy and neutral stimuli).
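One common way to quantify the synchronization referred to above is the phase-locking value (PLV) between two band-limited signals, computed from analytic-signal phases. The sketch below builds the analytic signal in the frequency domain and evaluates PLV on synthetic stand-ins for alpha-band EEG; the sampling rate and signals are illustrative assumptions, and the paper does not necessarily use this exact measure.

```python
import numpy as np

def analytic(x):
    """Analytic signal via the frequency domain (as scipy.signal.hilbert does)."""
    n = len(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(np.fft.fft(x) * h)

def plv(x, y):
    """Phase-locking value: 1 for a constant phase lag, near 0 for none."""
    dphi = np.angle(analytic(x)) - np.angle(analytic(y))
    return np.abs(np.exp(1j * dphi).mean())

fs, f = 250.0, 10.0                      # sampling rate, alpha-band frequency
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * f * t)
y = np.sin(2 * np.pi * f * t + 1.0)      # constant phase lag -> PLV near 1
noise = np.random.default_rng(0).standard_normal(t.size)
```

Comparing such synchronization values between the congruent and non-congruent conditions with standard statistical tests is the analysis pattern the abstract describes.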
Abstract:
In this work we propose a method to quantify written signatures from digitized images based on the use of Elliptical Fourier Descriptors (EFD). Since signatures are usually not represented as a closed contour, which is a necessary condition for applying EFD, we have developed a method that represents signatures by means of a set of closed contours. One advantage of this method is that it can reconstruct the original shape from all the coefficients, or an approximate shape from a reduced set of them, finding the appropriate number of EFD coefficients required to preserve the important information in each application. EFD provide accurate frequency information, so their use opens many possibilities. The method can be extended to represent other kinds of shapes.
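The truncation-and-reconstruction idea can be sketched with the simpler complex Fourier-descriptor formulation (the paper uses the four-coefficient Kuhl-Giardina elliptic form, but the principle is the same): a closed contour sampled as complex points z[k] = x[k] + i·y[k] is transformed, only the first few harmonics are kept, and the inverse transform yields the approximated shape. The test contour below is an invented ellipse, which one harmonic reconstructs exactly.

```python
import numpy as np

def fourier_descriptors(z, n_harm):
    """Keep only the DC term and the first n_harm harmonics (both signs)
    of a closed contour given as complex samples z[k] = x[k] + 1j*y[k]."""
    Z = np.fft.fft(z) / len(z)
    keep = np.zeros_like(Z)
    keep[0] = Z[0]
    keep[1:n_harm + 1] = Z[1:n_harm + 1]
    keep[-n_harm:] = Z[-n_harm:]
    return keep

def reconstruct(coeffs):
    """Rebuild the (approximated) contour from the kept coefficients."""
    return np.fft.ifft(coeffs * len(coeffs))

t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
ellipse = np.cos(t) + 0.5j * np.sin(t)           # a closed test contour
approx = reconstruct(fourier_descriptors(ellipse, 1))
```

Choosing `n_harm` is exactly the "appropriate number of EFD coefficients" trade-off the abstract mentions: few harmonics give a smooth approximation, more harmonics recover fine detail of the signature strokes.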