906 results for Normalization-based optimization
Abstract:
Ecological risk assessment (ERA) is a framework for monitoring the risks of exposure and adverse effects of environmental stressors on populations or communities of interest. One tool of ERA is the biomarker: a characteristic of an organism that reliably indicates exposure to, or effects of, a stressor such as chemical pollution. Traditional biomarkers, which rely on characteristics at the tissue level and higher, often detect only acute exposures to stressors. Sensitive molecular biomarkers may detect lower stressor levels than traditional biomarkers, which helps inform risk mitigation and restoration efforts before populations and communities are irreversibly affected. In this study I developed gene expression-based molecular biomarkers of exposure to metals and insecticides in the model toxicological freshwater amphipod Hyalella azteca. My goals were not only to create sensitive molecular biomarkers for these chemicals, but also to show the utility and versatility of H. azteca in molecular studies for toxicology and risk assessment. I sequenced and assembled the H. azteca transcriptome to identify reference and stress-response gene transcripts suitable for expression monitoring. I exposed H. azteca to sub-lethal concentrations of metals (cadmium and copper) and insecticides (DDT, permethrin, and imidacloprid). Reference genes used to create normalization factors were determined for each exposure using the programs BestKeeper, GeNorm, and NormFinder. Both metals increased expression of a nuclear transcription factor (Cnc), an ABC transporter (Mrp4), and a heat shock protein (Hsp90), giving evidence of a general metal exposure signature. Cadmium uniquely increased expression of a DNA repair protein (Rad51) and increased Mrp4 expression more than copper did (a 7-fold compared to a 2-fold increase). Together these may be unique biomarkers distinguishing cadmium and copper exposures. DDT increased expression of Hsp90, Mrp4, and the immune response gene Lgbp. Permethrin increased expression of a cytochrome P450 (Cyp2j2) and decreased expression of the immune response gene Lectin-1. Imidacloprid did not affect gene expression. Unique biomarkers were seen for DDT and permethrin, but the genes studied were not sensitive enough to detect imidacloprid at the levels used here. I demonstrated that gene expression in H. azteca detects specific chemical exposures at sub-lethal concentrations, making expression monitoring using this amphipod a useful and sensitive biomarker for risk assessment of chemical exposure.
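As a rough illustration of the normalization step described above, the sketch below computes a GeNorm-style normalization factor (the geometric mean of validated reference-gene levels) and a normalized fold change. The gene names and expression values are invented for the example, not the study's data.

```python
import math

def normalization_factor(reference_levels):
    """GeNorm-style normalization factor: the geometric mean of the
    expression levels of the validated reference genes in one sample."""
    product = math.prod(reference_levels)
    return product ** (1.0 / len(reference_levels))

# Hypothetical target and reference-gene levels for one exposed and one
# control sample (arbitrary relative units).
exposed = {"target_Mrp4": 14.2, "refs": [1.9, 2.2, 2.0]}
control = {"target_Mrp4": 2.0,  "refs": [2.0, 2.1, 1.9]}

norm_exposed = exposed["target_Mrp4"] / normalization_factor(exposed["refs"])
norm_control = control["target_Mrp4"] / normalization_factor(control["refs"])
print(f"fold change: {norm_exposed / norm_control:.1f}")  # ~7-fold, as for cadmium
```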
Abstract:
Interest in renewable energy has increased considerably in recent years due to concerns over the environmental impact of conventional energy sources and their price volatility. In particular, wind power has enjoyed dramatic global growth in installed capacity over the past few decades. Nowadays, the advancement of the wind turbine industry poses challenges for several engineering areas, including materials science, computer science, aerodynamics, analytical design and analysis methods, testing and monitoring, and power electronics. In particular, the technological improvement of wind turbines is currently tied to the use of advanced design methodologies that allow designers to develop new and more efficient design concepts. Integrating mathematical optimization techniques into the multidisciplinary design of wind turbines constitutes a promising way to enhance the profitability of these devices. In the literature, wind turbine design optimization is typically performed deterministically. Deterministic optimizations do not consider any degree of randomness affecting the inputs of the system under consideration and therefore result in a unique set of outputs. However, given the stochastic nature of the wind and the uncertainties associated, for instance, with wind turbine operating conditions or geometric tolerances, deterministically optimized designs may be inefficient. Therefore, one way to further improve the design of modern wind turbines is to take the aforementioned sources of uncertainty into account in the optimization process, achieving robust configurations with minimal performance sensitivity to the factors causing variability. The research work presented in this thesis deals with the development of a novel integrated multidisciplinary design framework for the robust aeroservoelastic design optimization of multi-megawatt horizontal axis wind turbine (HAWT) rotors, accounting for the stochastic variability of the input variables. The design system is based on a multidisciplinary analysis module integrating the several simulation tools needed to characterize the aeroservoelastic behavior of wind turbines and to determine their economic performance by means of the levelized cost of energy (LCOE). The reported design framework is portable and modular in that any of its analysis modules can be replaced with counterparts of user-selected fidelity. The presented technology is applied to the design of a 5-MW HAWT rotor to be used at sites of wind power density class from 3 to 7, where the mean wind speed at 50 m above the ground ranges from 6.4 to 11.9 m/s. Assuming the mean wind speed to vary stochastically in this range, the rotor design is optimized by minimizing the mean and standard deviation of the LCOE. Airfoil shapes, spanwise distributions of blade chord and twist, internal structural layup and rotor speed are optimized concurrently, subject to an extensive set of structural and aeroelastic constraints. The effectiveness of the multidisciplinary and robust design framework is demonstrated by showing that the probabilistically designed turbine achieves more favorable probabilistic performance than both the initial baseline turbine and a turbine designed deterministically.
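A minimal sketch of the robust objective described above, assuming a uniform distribution of the site mean wind speed over the stated 6.4-11.9 m/s range; the lcoe() function is a crude placeholder standing in for the framework's coupled aeroservoelastic and cost analysis, and the weighting of mean and standard deviation is an assumption.

```python
import random
import statistics

def lcoe(design, mean_wind_speed):
    """Placeholder cost model: returns a notional LCOE for a rotor design
    at a given site mean wind speed (the real framework couples several
    simulation tools; this toy function only stands in for that module)."""
    chord, tip_speed = design
    energy = mean_wind_speed ** 3 * tip_speed * 0.1
    cost = 50.0 + 5.0 * chord
    return cost / energy

def robust_objective(design, n_samples=1000, weight=0.5):
    """Robust design objective: sample the mean wind speed stochastically
    over the class 3-7 range and combine the mean and standard deviation
    of the resulting LCOE values into one scalar to minimize."""
    samples = [lcoe(design, random.uniform(6.4, 11.9)) for _ in range(n_samples)]
    return weight * statistics.mean(samples) + (1 - weight) * statistics.stdev(samples)

print(robust_objective((3.0, 80.0)))
```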
Abstract:
In this work, three different metallic metamaterial (MM) structures that support plasmonic resonances have been developed: asymmetric split ring resonators (A-SRRs), dipoles, and asymmetric split H-shaped (ASH) structures. The work aims at the optimization of photonic sensors based on plasmonic resonances and surface-enhanced infrared absorption (SEIRA) from the MM structures. The MM structures were designed to tune their plasmonic resonance peaks in the mid-infrared region. The plasmonic resonance peaks produced are highly dependent on the structural dimensions and on the polarisation of the electromagnetic (EM) source. The ASH structure in particular can produce the plasmonic resonance peak under dual polarisation of the EM source. The double resonance peaks produced due to the asymmetric nature of the structures were optimized by varying the fundamental parameters of the design. These peaks occur due to hybridization of the individual elements of the MM structure. The presence of a dip known as a trapped mode between the double plasmonic peaks helps to narrow the resonances. A periodicity greater than twice the length and diameter of the metallic structure was applied to produce narrow resonances for the designed MMs. A nanoscale gap in each structure, which broadens the trapped mode to narrow the plasmonic resonances, was also used. A gold thickness of 100 nm was used to experimentally produce a high quality factor of 18 in the mid-infrared region. The optimised plasmonic resonance peaks were used for detection of an analyte, 17β-estradiol. 17β-estradiol is largely responsible for the development of human sex organs and can be found naturally in the environment through human excreta. SEIRA was the method applied to the analysis of the analyte. The work is important for the monitoring of human biology and for water treatment. Applying this method to the developed nano-engineered structures, enhancement factors of 10^5 and a sensitivity of 2791 nm/RIU were obtained. With this high sensitivity, a figure of merit (FOM) of 9 was also achieved for the sensors. The experiments were verified using numerical simulations in which the vibrational resonances of the C-H stretch from 17β-estradiol were modelled. Lastly, A-SRRs and ASHs on waveguides were also designed and evaluated. These patterns are to be used as a basis for future work.
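Assuming the usual refractive-index-sensing definitions (which the abstract does not state explicitly), the reported sensitivity, FOM and quality factor are mutually consistent:

```latex
% Common definitions, assumed here rather than quoted from the thesis:
\mathrm{FOM} = \frac{S}{\mathrm{FWHM}}
\;\Rightarrow\;
\mathrm{FWHM} \approx \frac{2791\ \mathrm{nm/RIU}}{9\ \mathrm{RIU}^{-1}} \approx 310\ \mathrm{nm},
\qquad
Q = \frac{\lambda_{\mathrm{res}}}{\mathrm{FWHM}} = 18
\;\Rightarrow\;
\lambda_{\mathrm{res}} \approx 5.6\ \mu\mathrm{m}
```

which indeed falls in the mid-infrared region targeted by the designs.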
Abstract:
Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses a suitable scheduling rule from the set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network over the set of promising solutions and samples this network to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.
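As a rough illustration of the model-building and sampling steps, the sketch below learns a chain-structured model of rule choices from promising schedules and samples new rule sequences from it. The rule names and the chain structure are illustrative assumptions; the chapter's BOA learns a general Bayesian network over the rule positions.

```python
import random
from collections import defaultdict

RULES = ["highest_cover_need", "cheapest_shift", "human_scheduler_mimic"]

def learn_model(promising_solutions):
    """Estimate P(rule_i | rule_{i-1}) from the selected promising schedules
    (each schedule is the sequence of rules used, one per nurse)."""
    counts = defaultdict(lambda: defaultdict(int))
    for solution in promising_solutions:
        prev = None
        for rule in solution:
            counts[prev][rule] += 1
            prev = rule
    return counts

def sample_solution(counts, n_nurses):
    """Sample one new candidate schedule (a rule per nurse) from the model,
    falling back to a uniform choice for unseen contexts."""
    solution, prev = [], None
    for _ in range(n_nurses):
        options = counts[prev] or {r: 1 for r in RULES}
        rules, weights = zip(*options.items())
        prev = random.choices(rules, weights=weights)[0]
        solution.append(prev)
    return solution

promising = [["cheapest_shift", "highest_cover_need", "highest_cover_need"],
             ["cheapest_shift", "highest_cover_need", "human_scheduler_mimic"]]
print(sample_solution(learn_model(promising), n_nurses=3))
```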
Abstract:
Over the last decade, the success of social networks has significantly reshaped how people consume information. Recommendation of content based on user profiles is well received. However, as users become predominantly mobile, little has been done to consider the impact of the wireless environment, especially capacity constraints and the changing channel. In this dissertation, we investigate a centralized wireless content delivery system, aiming to optimize overall user experience given the capacity constraints of the wireless networks by deciding what content to deliver, when, and how. We propose a scheduling framework that incorporates content-based reward and deliverability. Our approach exploits the broadcast nature of wireless communication and the social nature of content through multicasting and precaching. Results indicate that this novel joint optimization approach outperforms existing layered systems that separate recommendation and delivery, especially when the wireless network is operating at maximum capacity. By utilizing a limited number of transmission modes, we significantly reduce the complexity of the optimization. We also introduce the design of a hybrid system that handles transmissions for both system-recommended content ('push') and active user requests ('pull'). Further, we extend the joint optimization framework to a wireless infrastructure with multiple base stations. The problem becomes much harder in that there are many more system configurations, including but not limited to power allocation and how resources are shared among the base stations ('out-of-band', in which base stations transmit with dedicated spectrum resources and thus without interference; and 'in-band', in which they share the spectrum and need to mitigate interference). We propose a scalable two-phase scheduling framework: 1) each base station obtains delivery decisions and resource allocation individually; 2) the system consolidates the decisions and allocations, reducing redundant transmissions. Additionally, if social network applications could provide predictions of how social content disseminates, the wireless networks could schedule the transmissions accordingly and significantly improve dissemination performance by reducing the delivery delay. We propose a novel method utilizing: 1) hybrid systems to handle active dissemination requests; and 2) predictions of dissemination dynamics from the social network applications. This method can mitigate the performance degradation of content dissemination due to wireless delivery delay. Results indicate that our proposed system design is both efficient and easy to implement.
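A toy sketch of the core trade-off, assuming each content item carries a bandwidth cost and a reward aggregated over its interested users (one multicast transmission serves them all): greedily pick contents by reward per unit capacity. The dissertation's framework solves this jointly with timing and precaching; this only illustrates the "what to deliver under a capacity constraint" decision.

```python
def schedule_multicast(contents, capacity):
    """Greedy knapsack-style selection of contents to multicast: rank by
    reward-to-cost ratio, then fill the capacity budget in that order."""
    ranked = sorted(contents, key=lambda c: c["reward"] / c["cost"], reverse=True)
    chosen, used = [], 0.0
    for c in ranked:
        if used + c["cost"] <= capacity:
            chosen.append(c["id"])
            used += c["cost"]
    return chosen

# Hypothetical catalog: reward already aggregated over interested users.
catalog = [
    {"id": "video_a", "cost": 4.0, "reward": 9.0},
    {"id": "post_b",  "cost": 1.0, "reward": 3.5},
    {"id": "clip_c",  "cost": 2.0, "reward": 2.0},
]
print(schedule_multicast(catalog, capacity=5.0))  # ['post_b', 'video_a']
```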
Abstract:
Two ideas taken from Bayesian optimization and classifier systems are presented for personnel scheduling based on choosing a suitable scheduling rule from a set for each person's assignment. Unlike our previous work using genetic algorithms, whose learning is implicit, the learning in both approaches is explicit, i.e. we are able to identify building blocks directly. To achieve this, the Bayesian optimization algorithm builds a Bayesian network of the joint probability distribution of the rules used to construct solutions, while the adapted classifier system assigns each rule a strength value that is constantly updated according to its usefulness in the current situation. Computational results from 52 real data instances of nurse scheduling demonstrate the success of both approaches. It is also suggested that the learning mechanisms in the proposed approaches might be suitable for other scheduling problems.
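The classifier-system half of the comparison hinges on the strength update. A minimal sketch, assuming a standard reinforcement-style update toward the observed payoff (the paper's exact credit-assignment scheme, rule names and payoff values are not reproduced here):

```python
def update_strength(strengths, rule, payoff, learning_rate=0.1):
    """Nudge a rule's strength toward the payoff it just earned in the
    current situation; stronger rules are then preferred at choice time."""
    strengths[rule] += learning_rate * (payoff - strengths[rule])
    return strengths

strengths = {"rule_coverage_first": 5.0, "rule_cost_first": 5.0}
update_strength(strengths, "rule_coverage_first", payoff=10.0)  # useful rule
update_strength(strengths, "rule_cost_first", payoff=1.0)       # poor rule
print(strengths)  # {'rule_coverage_first': 5.5, 'rule_cost_first': 4.6}
```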
Abstract:
Today, providing drinking water and process water is one of the major problems in most countries; surface water often needs to be treated to achieve the necessary quality, and technological as well as financial difficulties impose great restrictions on operating the treatment units. Although water supply by simple and cheap systems has been one of the important objectives of many scientific and research centers in the world, a great percentage of the population in developing countries, especially in rural areas, still does not benefit from good-quality water. One of the large and available sources for providing acceptable water is sea water. There are two main ways to treat sea water: evaporation and reverse osmosis (RO). Nowadays, RO systems are widely used for desalination because of their low cost and ease of operation and maintenance. The sea water should be pretreated before the RO plant, because raw sea water contains constituents that can degrade the performance of the membranes in the RO system. The subject of this research may be useful in this respect, and we hope to achieve complete success in the design and construction of useful pretreatment systems for RO plants. One of the most important units in a sea water pretreatment plant is filtration. The conventional method of filtration uses pressurized sand filters, and the subject of this research is a new filtration method called continuous backwash sand filtration (CBWSF). The CBWSF designed and tested in this research may be used more economically and with less difficulty. It consists of two main parts: the shell body, and a central part comprising an airlift pump, raw water feeding pipe, air supply hose, backwash chamber and sand washer, as well as inlet and outlet connections. The CBWSF is a continuously operating filter, i.e. the filter does not have to be taken out of operation for backwashing or cleaning. Inlet water is fed through the sand bed while the sand bed moves downwards. The water is filtered while the sand becomes dirty. Simultaneously, the dirty sand is cleaned in the sand washer and the suspended solids are discharged in the backwash water. We analyze the behavior of the CBWSF in the pretreatment of sea water in place of a pressurized sand filter. One important factor that is harmful to RO membranes is bio-fouling, which is quantified by the Silt Density Index (SDI). This research focused on decreasing SDI and turbidity (NTU). Based on this goal, a pretreatment prototype was designed and manufactured for testing. The system design was done mainly by using the design fundamentals of CBWSF. The automatic backwash sand filter can be used in small as well as large water supply schemes. In large water treatment plants, the filter units perform the filtration and backwash stages separately, and in small treatment plants the unit is usually compacted to achieve less energy consumption. The analysis of the system showed that it may be used feasibly for water treatment, especially for limited populations. Construction is rapid, simple and economical, and its performance is high enough because no mobile mechanical part is used in it, so it may be proposed as an effective method to improve water quality and consequently the hygiene level in remote places of the country.
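For reference, the SDI mentioned above is commonly computed from a timed membrane-filtration test (standard ASTM-style definition, assumed here rather than quoted from the thesis):

```latex
% Silt Density Index from a timed filtration test:
\mathrm{SDI} \;=\; \frac{1 - t_i/t_f}{T}\times 100
% t_i, t_f: times to filter a fixed volume (typically 500 mL through a
% 0.45 um membrane at 30 psi) at the start and end of the test;
% T: elapsed test time in minutes (typically 15).
% Worked example: t_i = 30 s, t_f = 45 s, T = 15 min
% => SDI = (1 - 30/45)/15 * 100 = 2.2
```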
Abstract:
Evolutionary algorithms alone cannot solve optimization problems very efficiently, since these algorithms make many random (not very rational) decisions. Combining evolutionary algorithms with other techniques has proven to be an efficient optimization methodology. In this talk, I will explain the basic ideas of our three algorithms along this line: (1) the orthogonal genetic algorithm, which treats crossover/mutation as an experimental design problem; (2) the multiobjective evolutionary algorithm based on decomposition (MOEA/D), which uses decomposition techniques from traditional mathematical programming in a multiobjective evolutionary algorithm; and (3) regularity model-based multiobjective estimation of distribution algorithms (RM-MEDA), which use the regularity property and machine learning methods to improve multiobjective evolutionary algorithms.
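To make the decomposition idea behind MOEA/D concrete, the sketch below scalarizes a bi-objective problem with the Tchebycheff approach: each weight vector defines one scalar subproblem, and different subproblems prefer different solutions along the Pareto front. The candidate points, weights and ideal point are invented for the example.

```python
def tchebycheff(objectives, weights, ideal):
    """Tchebycheff decomposition used by MOEA/D: one weight vector turns
    the multiobjective problem into a scalar subproblem to minimize."""
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

# Two candidate solutions for a bi-objective problem, ideal point z* = (0, 0):
f_a, f_b = (1.0, 4.0), (2.5, 2.5)
for weights in [(0.9, 0.1), (0.5, 0.5), (0.1, 0.9)]:
    best = min((f_a, f_b), key=lambda f: tchebycheff(f, weights, (0.0, 0.0)))
    print(weights, "->", best)
# Different weight vectors select different solutions, which is how
# MOEA/D approximates the whole Pareto front with one population.
```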
Abstract:
Tomato (Lycopersicon esculentum Mill.) is the second most important vegetable crop worldwide and a rich source of hydrophilic (H) and lipophilic (L) antioxidants. The H fraction is constituted mainly by ascorbic acid and soluble phenolic compounds, while the L fraction contains carotenoids (mostly lycopene), tocopherols, sterols and lipophilic phenolics [1,2]. To obtain these antioxidants it is necessary to follow appropriate extraction methods and processing conditions. In this regard, this study aimed at determining the optimal extraction conditions for H and L antioxidants from a tomato surplus. A 5-level full factorial design with 4 factors (extraction time (t, 0-20 min), temperature (T, 60-180 °C), ethanol percentage (Et, 0-100%) and solid/liquid ratio (S/L, 5-45 g/L)) was implemented and response surface methodology was used for analysis. Extractions were carried out in a Biotage Initiator Microwave apparatus. The concentration-time response methods of crocin and β-carotene bleaching were applied (using 96-well microplates), since they are suitable in vitro assays to evaluate the antioxidant activity of H and L matrices, respectively [3]. Measurements were carried out at intervals of 3, 5 and 10 min (initiation, propagation and asymptotic phases) over a time frame of 200 min. The parameters Pm (maximum protected substrate) and Vm (amount of protected substrate per g of extract) and the so-called IC50 were used to quantify the response. The optimum extraction conditions were as follows: t=2.25 min, T=149.2 °C, Et=99.1% and S/L=15.0 g/L for H antioxidants; and t=15.4 min, T=60.0 °C, Et=33.0% and S/L=15.0 g/L for L antioxidants. The proposed model was validated based on the high values of the adjusted coefficient of determination (R²adj>0.91) and on the non-significant differences between predicted and experimental values. It was also found that the antioxidant capacity of the H fraction was much higher than that of the L fraction.
Abstract:
Mass spectrometry (MS)-based proteomics has seen significant technical advances during the past two decades, and mass spectrometry has become a central tool in many biosciences. Despite the popularity of MS-based methods, the handling of systematic non-biological variation in the data remains a common problem. This biasing variation can arise from several sources, ranging from sample handling to differences caused by the instrumentation. Normalization is the procedure that aims to account for this biasing variation and make samples comparable. Many normalization methods commonly used in proteomics have been adapted from the DNA-microarray world. Studies comparing normalization methods on proteomics data sets using variability measures exist. However, a more thorough comparison, looking at the quantitative and qualitative differences in the performance of the different normalization methods and at their ability to preserve the true differential expression signal of proteins, has been lacking. In this thesis, several popular and widely used normalization methods (linear regression normalization, local regression normalization, variance stabilizing normalization, quantile normalization, median central tendency normalization, and variants of some of the aforementioned methods), representing different normalization strategies, are compared and evaluated on a benchmark spike-in proteomics data set. The normalization methods are evaluated in several ways. Their performance is assessed qualitatively and quantitatively on a global scale and in pairwise comparisons of sample groups. In addition, it is investigated whether performing the normalization globally on the whole data set or pairwise for the comparison pairs examined affects the performance of a normalization method in normalizing the data and preserving the true differential expression signal. In this thesis, both major and minor differences in the performance of the different normalization methods were found. The way in which the normalization was performed (global normalization of the whole data set or pairwise normalization of the comparison pair) also affected the performance of some of the methods in pairwise comparisons. Differences among variants of the same methods were observed as well.
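To make two of the compared strategies concrete, here is a minimal numpy sketch of median central tendency normalization and quantile normalization (illustrative only; the thesis' actual implementations, data and evaluation metrics are not reproduced, and the quantile version below ignores ties):

```python
import numpy as np

def median_normalize(data):
    """Median central-tendency normalization: shift each sample (column)
    so all samples share the same median intensity."""
    medians = np.median(data, axis=0)
    return data - medians + medians.mean()

def quantile_normalize(data):
    """Quantile normalization (adapted from DNA-microarray analysis):
    force every sample (column) to share the same empirical intensity
    distribution, the row-wise mean of the sorted columns."""
    ranks = np.argsort(np.argsort(data, axis=0), axis=0)
    mean_sorted = np.sort(data, axis=0).mean(axis=1)
    return mean_sorted[ranks]

# Hypothetical log-intensities: 4 proteins x 3 samples with a shift bias.
data = np.array([[10.0, 11.0, 12.0],
                 [11.0, 12.0, 13.0],
                 [14.0, 15.0, 16.0],
                 [12.0, 13.0, 14.0]])
print(np.median(median_normalize(data), axis=0))  # equal medians per sample
print(quantile_normalize(data))                   # identical column distributions
```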
Abstract:
Tomato (Lycopersicon esculentum Mill.), apart from being a functional food rich in carotenoids, vitamins and minerals, is also an important source of phenolic compounds [1,2]. As antioxidants, these functional molecules play an important role in the prevention of human pathologies and have many applications in the nutraceutical, pharmaceutical and cosmeceutical industries. Therefore, the recovery of added-value phenolic compounds from natural sources, such as tomato surplus or industrial by-products, is highly desirable. Herein, the microwave-assisted extraction of the main phenolic acids and flavonoids from tomato was optimized. A 5-level full factorial Box-Behnken design was implemented and response surface methodology was used for analysis. The extraction time (0-20 min), temperature (60-180 °C), ethanol percentage (0-100%), solid/liquid ratio (5-45 g/L) and microwave power (0-400 W) were studied as independent variables. The phenolic profile of the studied tomato variety was initially characterized by HPLC-DAD-ESI/MS [2]. Then, the effect of the different extraction conditions, as defined by the experimental design, on the target compounds was monitored by HPLC-DAD, using their UV spectra and retention times for identification and a series of calibrations based on external standards for quantification. The proposed model was successfully implemented and statistically validated. The microwave power had no effect on the extraction process. Compared with the optimal extraction conditions for flavonoids, which demanded a short processing time (2 min), a low temperature (60 °C), a low solid/liquid ratio (5 g/L) and pure ethanol, phenolic acids required a longer processing time (4.38 min), a higher temperature (145.6 °C), a higher solid/liquid ratio (45 g/L) and water as the extraction solvent. Additionally, the studied tomato variety was highlighted as a source of added-value phenolic acids and flavonoids.
Abstract:
The majority of research work carried out in the field of Operations Research uses methods and algorithms to optimize the pick-up and delivery problem. Most studies aim to solve the vehicle routing problem, to accommodate optimum delivery orders, vehicles, etc. This paper focuses on a green logistics approach, where the existing public transport infrastructure of a city is used for the delivery of small and medium-sized packaged goods, thus helping to reduce urban congestion and greenhouse gas emissions. A study was carried out to investigate the feasibility of the proposed multi-agent based simulation model in terms of cost, time and energy efficiency. A multimodal Dijkstra shortest-path algorithm and Nested Monte Carlo Search were employed in a two-phase algorithmic approach used to generate a time-based cost matrix. The quality of the tour depends on the efficiency of the search algorithm implemented for plan generation and route planning. The results reveal a definite advantage of using public transportation over existing delivery approaches in terms of energy efficiency.
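A minimal sketch of the first phase: Dijkstra over a multimodal network of stops, whose travel times (run from every origin) populate the time-based cost matrix that the Nested Monte Carlo Search then uses for tour planning. The network, stop names and travel times are invented for the example.

```python
import heapq

def dijkstra(graph, source):
    """Shortest travel times (minutes) from source over a multimodal
    network; edges can represent bus/tram legs plus transfer times."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, travel_time in graph.get(node, []):
            nd = d + travel_time
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

network = {
    "depot":  [("stop_a", 7.0), ("stop_b", 12.0)],
    "stop_a": [("stop_b", 4.0), ("customer", 10.0)],
    "stop_b": [("customer", 3.0)],
}
print(dijkstra(network, "depot"))
# {'depot': 0.0, 'stop_a': 7.0, 'stop_b': 11.0, 'customer': 14.0}
```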
Abstract:
When transporting timber from the forest to the mills, many unforeseen events can occur that disrupt the planned trips (for example, because of weather conditions, forest fires, the arrival of new loads, etc.). When such events only become known during a trip, the truck making that trip must be rerouted to an alternative path. Without information about such a path, the driver is likely to choose an alternative route that is unnecessarily long or, worse, one that is itself "closed" because of an unforeseen event. It is therefore essential to provide drivers with real-time information, in particular suggestions of alternative paths when a planned route turns out to be impassable. The recourse options in case of disruption depend on the characteristics of the supply chain under study, such as the presence of self-loading trucks and the transportation management policy. We present three articles dealing with different application contexts, together with models and solution methods suited to each context. In the first article, truck drivers have the entire weekly plan for the current week. In this context, every effort must be made to minimize changes to the initial plan. Although the truck fleet is homogeneous, there is a priority order among drivers: those with higher priority receive the largest workloads, and minimizing changes to their plans is also a priority. Since the consequences of unforeseen events on the transportation plan are essentially cancellations and/or delays of certain trips, the proposed approach first handles the cancellation and delay of a single trip and is then generalized to handle more complex events. In this approach, we try to reschedule the affected trips within the same week so that a loader is free when the truck arrives at both the forest site and the mill. In this way, the trips of the other trucks are not modified. This approach provides dispatchers with alternative plans within seconds. Better solutions could be obtained if the dispatcher were allowed to make more changes to the initial plan. In the second article, we consider a context where only one trip at a time is communicated to the drivers. The dispatcher waits until a driver finishes his trip before revealing the next one. This context is more flexible and offers more recourse options in case of disruption. Moreover, the weekly problem can be divided into daily problems, since demand is daily and the mills are open for limited periods during the day. We use a mathematical programming model based on a space-time network to react to disruptions. Although disruptions can have different effects on the initial transportation plan, a key feature of the proposed model is that it remains valid for handling any unforeseen event, whatever its nature. Indeed, the impact of these events is captured in the space-time network and in the input parameters rather than in the model itself. The model is re-solved for the current day each time an unforeseen event is revealed.
In the last article, the truck fleet is heterogeneous and includes trucks with on-board loaders. The route structure of these trucks differs from that of the regular trucks, since they do not have to be synchronized with the loaders. We use a mathematical model in which columns can be easily and naturally interpreted as truck routes. We solve this model using column generation. First, we relax the integrality of the decision variables and consider only a subset of the feasible routes. Routes with the potential to improve the current solution are added to the model iteratively. A space-time network is used both to represent the impacts of unforeseen events and to generate these routes. The solution obtained is generally fractional, and a branch-and-price algorithm is used to find integer solutions. Several disruption scenarios were developed to test the proposed approach on case studies from the Canadian forest industry, and numerical results are presented for all three contexts.
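For readers unfamiliar with the method, a route-based master problem solved by column generation typically has the following schematic form (the notation is assumed for illustration; the article's exact formulation may differ):

```latex
% Schematic route-based master problem (columns r are feasible truck routes):
\min \sum_{r \in R'} c_r x_r
\quad \text{s.t.} \quad
\sum_{r \in R'} a_{dr}\, x_r \ge q_d \;\; \forall d \in D,
\qquad x_r \in \{0,1\}
% R': routes generated so far; c_r: route cost; a_{dr}: volume of demand d
% carried by route r; q_d: demanded volume. A new route enters R' when its
% reduced cost  c_r - \sum_{d} \pi_d a_{dr}  is negative, where \pi_d are
% the duals of the demand constraints in the LP relaxation; branch-and-price
% then restores integrality.
```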
Abstract:
Optimization of Carnobacterium divergens V41 growth and bacteriocin activity in a culture medium deprived of animal protein, needed for food bioprotection, was performed using a statistical approach. In a screening experiment, twelve factors (pH, temperature, carbohydrates, NaCl, yeast extract, soy peptone, sodium acetate, ammonium citrate, magnesium sulphate, manganese sulphate, ascorbic acid and thiamine) were tested for their influence on maximal growth and bacteriocin activity using a two-level incomplete factorial design with 192 experiments performed in microtiter plate wells. Based on the results, a basic medium was developed and three variables (pH, temperature and carbohydrate concentration) were selected for a scale-up study in a bioreactor. A 2^3 complete factorial design was performed, allowing the estimation of the linear effects of the factors and all first-order interactions. The best conditions for cell production were obtained with a temperature of 15°C and a carbohydrate concentration of 20 g/l whatever the pH (in the range 6.5-8), and the best conditions for bacteriocin activity were obtained at 15°C and pH 6.5 whatever the carbohydrate concentration (in the range 2-20 g/l). The predicted final count of C. divergens V41 and the bacteriocin activity under the optimized conditions (15°C, pH 6.5, 20 g/l carbohydrates) were 2.4 x 10^10 CFU/ml and 819200 AU/ml, respectively. C. divergens V41 cells cultivated under the optimized conditions were able to grow in cold-smoked salmon and totally inhibited the growth of Listeria monocytogenes (< 50 CFU/g) during five weeks of vacuum storage at 4°C and 8°C.
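As an illustration of how effects are estimated in such two-level factorial designs, the sketch below computes main effects from coded runs; the 2^3 design layout is standard, but the response values are invented for the example, not the study's data.

```python
import itertools
import statistics

def main_effect(design, responses, factor):
    """Main-effect estimate in a two-level factorial design: mean response
    at the factor's high (+1) level minus mean response at its low (-1) level."""
    high = [y for x, y in zip(design, responses) if x[factor] == +1]
    low = [y for x, y in zip(design, responses) if x[factor] == -1]
    return statistics.mean(high) - statistics.mean(low)

# 2^3 design in coded units for (pH, temperature, carbohydrates).
design = list(itertools.product([-1, +1], repeat=3))
# Illustrative bacteriocin activities (log AU/ml) for the 8 runs:
responses = [4.1, 4.0, 5.2, 5.0, 4.3, 4.2, 5.6, 5.3]
for i, name in enumerate(["pH", "temperature", "carbohydrates"]):
    print(name, round(main_effect(design, responses, i), 2))
```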
Abstract:
The production of natural extracts requires suitable processing conditions to maximize the preservation of the bioactive ingredients. Herein, a microwave-assisted extraction (MAE) process was optimized by means of response surface methodology (RSM) to maximize the recovery of phenolic acids and flavonoids and obtain antioxidant ingredients from tomato. A 5-level full factorial Box-Behnken design was successfully implemented for MAE optimization, in which the processing time (t), temperature (T), ethanol concentration (Et) and solid/liquid ratio (S/L) were relevant independent variables. The proposed model was validated based on the high values of the adjusted coefficient of determination and on the non-significant differences between experimental and predicted values. The global optimum processing conditions (t=20 min; T=180 °C; Et=0%; and S/L=45 g/L) provided tomato extracts with high potential as nutraceuticals or as active ingredients in the design of functional foods. Additionally, the round tomato variety was highlighted as a source of added-value phenolic acids and flavonoids.
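To show the kind of model behind an RSM analysis like this, here is a minimal sketch fitting the usual second-order response surface by least squares; the coded design points and yield values are invented for illustration and do not reproduce the study's data.

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Least-squares fit of the second-order polynomial used in RSM:
    y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj)."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Hypothetical coded design points (t, T) and measured flavonoid yields:
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0],
              [-1, 0], [1, 0], [0, -1], [0, 1]], dtype=float)
y = np.array([3.1, 3.9, 4.2, 4.6, 5.0, 4.9, 4.4, 4.8, 4.0, 4.7])
print(fit_quadratic_surface(X, y).round(2))
```

The stationary point of the fitted surface (where both partial derivatives vanish) is what an RSM study reports as the optimum processing conditions.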