927 results for Analysis of Algorithms and Problem Complexity
Abstract:
This study examined variations in Fulton's condition factor, chemical composition, and stable isotopes of carbon and nitrogen in the Brazilian freshwater fish cachara (Pseudoplatystoma fasciatum), comparing farmed and wild fish in different seasons. Values for energy, protein, moisture, and Fulton's condition factor were higher for farmed than for wild fish in the rainy season, indicating better nutritional quality; however, these differences were not observed in the dry season. Likewise, we found significant enrichment of δ¹⁵N in farmed fish in the rainy season but not in the dry season, whereas enrichment of δ¹³C was observed in both seasons. The combined measurement of δ¹³C and δ¹⁵N provided traceability under all conditions. Our findings show that stable isotope analysis of C and N can be used to trace the origin of cachara, and that seasonal variations need to be considered when applying chemical and isotopic authentication to fish and fish products.
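Fulton's condition factor mentioned above follows the standard formula K = 100·W/L³, with weight W in grams and length L in centimetres. The values below are hypothetical and only illustrate the computation, not the study's data; a minimal Python sketch:

```python
def fulton_k(weight_g: float, length_cm: float) -> float:
    """Fulton's condition factor: K = 100 * W / L**3 (W in g, L in cm)."""
    return 100.0 * weight_g / length_cm ** 3

# Hypothetical cachara measurements (illustrative only, not from the study):
farmed = fulton_k(weight_g=1200.0, length_cm=52.0)
wild = fulton_k(weight_g=950.0, length_cm=51.0)
print(round(farmed, 3), round(wild, 3))
```

A higher K for the same length indicates a heavier, better-conditioned fish, which is the sense in which the abstract compares farmed and wild specimens.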
Abstract:
This dissertation presents a systematic and analytic overview of most of the information related to stones, minerals, and stone masonry found in the corpus of Plutarch of Chaeronea, combined with most of the information on metals and metalworking connected to the former. This survey is intended as a first step in the reconstruction of the full landscape of ‘chemical’ ideas occurring in Plutarch’s writings; accordingly, the exposition of the relevant passages, the assessment of their possible interpretations, the discussion of their implications, and their contextualization in the ancient traditions have been conducted with a special interest in the ‘mineralogical’ and ‘metallurgical’ themes developed within natural philosophy and meteorology. Although in this perspective physical etiology could have acquired central prominence, non-etiological information on Plutarch’s ideas on the nature and behaviour of stones and metals has been treated as equally relevant to reach a fuller understanding of how Plutarch conceptualized and visualized them in general, both inside and outside the frame of philosophical explanation. Such an extensive outline of Plutarch’s ideas on stones and metals is a prerequisite for an accurate inquiry into his use of the two in analogies, metaphors, and symbols: preparing the ground for this kind of research was another aim of the present survey, and this aim has contributed to shaping it. Moreover, special attention has been paid to the analysis of analogical and figurative language because of the very nature of a large part of Plutarch’s references to stones and metals, which are either metaphorical, presented in close association with metaphors, or framed in analogies. Much of the information used for the present overview has been extracted (always with supporting argumentation) from the implications of such metaphors and analogies.
Abstract:
Pancreatic cancer (PC) is the seventh leading cause of cancer death. Despite recent advances in therapy, 5-year survival is 11%. Resistance to therapy is common, and no predictive factors, except for BRCA1/2 and PALB2 mutations, can drive treatment selection. Based on the easy isolation of extracellular vesicles (EVs) from blood and the role of EV-borne miRNAs in chemoresistance, we analyzed EVs and their miRNA content in order to identify predictive factors. First, we analyzed samples from 28 PC patients and 7 healthy subjects in order to establish methods for the isolation and analysis of EVs and their miRNA content. We observed significantly different expression of 28 miRNAs, including oncogenic and tumor suppressor miRNAs, showing the ability of our approach to detect candidate biomarkers. Then, we analyzed samples from 21 advanced PC patients, collected before first-line treatment with gemcitabine + nab-paclitaxel, and compared findings in responders and non-responders. EVs were analyzed with nanoparticle tracking analysis, flow cytometry, and RNA-Seq; laboratory results were then matched with clinical data. Nanoparticle tracking analysis did not show any significant difference. Flow cytometry showed lower expression of SSE4 and CD81 in responders. Finally, miRNA analysis showed 25 upregulated and 19 downregulated miRNAs in responders. In particular, in responders we observed upregulation of miR-141-3p, miR-141-5p, miR-200a-3p, miR-200b-3p, miR-200c-3p, miR-375-3p, miR-429, and miR-545-5p. These miRNAs have targets with a previously reported role in PC. In conclusion, we show the feasibility of the proposed approach for identifying EV-derived biomarkers with predictive value for therapy with gemcitabine + nab-paclitaxel in PC. Our findings highlight the possibility of exploiting liquid biopsy for personalized treatment in PC, in order to maximize the chance of response and improve patient outcomes.
These findings are worthy of further investigation: in the same setting, with different chemotherapy schedules, and in different disease settings such as preoperative therapy.
Abstract:
In the food and beverage industry, packaging plays a crucial role in protecting food and beverages and maintaining their organoleptic properties. Its disposal, unfortunately, is still difficult, mainly because economically viable systems for separating composite and multilayer materials are lacking. It is therefore necessary not only to increase research in this area, but also to set up pilot plants and implement these technologies on an industrial scale. LCA (Life Cycle Assessment) can serve these purposes: it allows an assessment of the potential environmental impacts associated with a product, service, or process. The objective of this thesis is to analyze the environmental performance of six separation methods designed to separate the polymeric fraction from the aluminum fraction in multilayer packaging. The first four methods use chemical dissolution, with Biodiesel, Cyclohexane, 2-Methyltetrahydrofuran (2-MeTHF), and Cyclopentyl-methyl-ether (CPME) as solvents. The last two apply mechanical delamination with surfactant-activated water, using Ammonium laurate and Triethanolamine laurate as surfactants, respectively. For all six methods, the LCA methodology was applied and the corresponding models were built with the GaBi software, version 10.6.2.9, which is designed specifically for LCA analyses. Unfortunately, due to a lack of data, it was not possible to obtain results for the dissolution methods with the solvents 2-MeTHF and CPME; for the other methods, however, the individual environmental performances were calculated. Results revealed that the methods with the best environmental performance are method 2 among the dissolution methods and method 5 among the delamination methods. This result is confirmed both by the analysis of normalized and weighted results and by the analysis of 'original' results. A hotspot analysis was also conducted.
Abstract:
Although it has been studied for only a few years, Wire and Arc Additive Manufacturing (WAAM) is expected to become the predominant way of producing stainless-steel elements in the near future. The analysis and design of such elements has yet to be properly codified, but projects on this subject keep innovating thanks to new findings. This thesis focuses, in an initial stage, on the analysis of the mechanical and geometrical properties of stainless-steel elements produced by the MX3D laboratory in Amsterdam, and then on the calibration of design strength values by means of Annex D of Eurocode 0, which deals with the analysis of semi-probabilistic safety factors and hence with the definition of characteristic values. After testing the stainless-steel specimens by means of strain gauges and obtaining their mechanical and geometrical properties, a statistical analysis of these properties and an evaluation of characteristic values is performed. Finally, the calibration of the design strength values of WAAM inclined bars and intersections is carried out.
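Annex D of Eurocode 0 (EN 1990) estimates the characteristic value of a material property, under a normal model with unknown coefficient of variation, as X_k = m_X (1 - k_n V_X), where k_n is a fractile factor depending on the sample size. The sketch below uses hypothetical strength data, and the k_n value is quoted from memory of Table D.1 (verify against the standard before use):

```python
import statistics

def characteristic_value(samples, k_n):
    """EN 1990 Annex D normal model: X_k = m_X * (1 - k_n * V_X)."""
    m = statistics.mean(samples)
    v = statistics.stdev(samples) / m  # sample coefficient of variation V_X
    return m * (1 - k_n * v)

# Hypothetical WAAM tensile strengths in MPa (illustrative only):
strengths = [512, 498, 530, 505, 521, 493, 517, 509, 526, 501]
xk = characteristic_value(strengths, k_n=1.92)  # assumed k_n for n = 10, V_X unknown
print(round(xk, 1))
```

The characteristic value sits below the sample mean by a margin that grows with scatter and shrinks with sample size, which is exactly what the calibration of design strength values exploits.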
Abstract:
There has been increasing interest in the development of new methods using Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question is then how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose can jointly consider neither multiple performance measures nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend them by discovering conditional independences among measures to reduce the number of parameters of such models, as the number of studied cases is usually very small in such comparisons. Data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
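The multinomial-Dirichlet idea can be sketched as follows: each data set yields one of a small set of outcomes (say, algorithm A Pareto-dominates on both measures, B dominates, or the two are incomparable), the outcome counts receive a Dirichlet posterior, and posterior probabilities of interest are estimated by sampling. This is a simplified illustration with invented counts, not the paper's exact model:

```python
import random

random.seed(0)

def dirichlet_sample(alphas):
    """Draw one sample from a Dirichlet distribution via normalized gammas."""
    g = [random.gammavariate(a, 1.0) for a in alphas]
    s = sum(g)
    return [x / s for x in g]

# Outcome counts over data sets (hypothetical): A dominates, B dominates, incomparable.
counts = [11, 4, 5]
prior = [1.0, 1.0, 1.0]       # uniform Dirichlet prior
posterior = [c + a for c, a in zip(counts, prior)]

# Posterior probability that "A dominates" is more likely than "B dominates":
draws = 20000
p_a_better = sum(
    1 for _ in range(draws)
    if (t := dirichlet_sample(posterior))[0] > t[1]
) / draws
print(round(p_a_better, 2))
```

With these counts the posterior mass strongly favours A, so the estimated probability is close to 1; the same machinery extends to more outcome categories when more measures are considered jointly.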
Abstract:
The usefulness of the application of heuristic algorithms in the transportation model, first proposed by Garver, is analysed in relation to planning for the expansion of transmission systems. The formulation of the mathematical model and the solution techniques proposed in the specialised literature are analysed in detail. Starting with the constructive heuristic algorithm proposed by Garver, an extension is made to the problem of multistage planning for transmission systems. The quality of the solutions found by heuristic algorithms for the transportation model is analysed, as are applications in problems of planning transmission systems.
Abstract:
CONTEXT The necessity of specific intervention components for the successful treatment of patients with posttraumatic stress disorder is the subject of controversy. OBJECTIVE To investigate the complexity of clinical problems as a moderator of relative effects between specific and nonspecific psychological interventions. METHODS We included 18 randomized controlled trials, directly comparing specific and nonspecific psychological interventions. We conducted moderator analyses, including the complexity of clinical problems as predictor. RESULTS Our results have confirmed the moderate superiority of specific over nonspecific psychological interventions; however, the superiority was small in studies with complex clinical problems and large in studies with noncomplex clinical problems. CONCLUSIONS For patients with complex clinical problems, our results suggest that particular nonspecific psychological interventions may be offered as an alternative to specific psychological interventions. In contrast, for patients with noncomplex clinical problems, specific psychological interventions are the best treatment option.
Abstract:
The BR algorithm is a novel and efficient method for finding all eigenvalues of upper Hessenberg matrices and has never been applied to eigenanalysis for power system small-signal stability. This paper analyzes the differences between the BR and QR algorithms, with a performance comparison in terms of CPU time (based on stopping criteria) and storage requirements. The BR algorithm uses accelerating strategies to improve its performance when computing eigenvalues of narrowly banded, nearly tridiagonal upper Hessenberg matrices. These strategies significantly reduce computation time at a reasonable level of precision. Compared with the QR algorithm, the BR algorithm requires fewer iteration steps and less storage space without sacrificing appropriate precision in solving eigenvalue problems of large-scale power systems. Numerical examples demonstrate the efficiency of the BR algorithm on eigenanalysis tasks for 39-, 68-, 115-, 300-, and 600-bus systems. Experimental results suggest that the BR algorithm is the more efficient algorithm for large-scale power system small-signal stability eigenanalysis.
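The BR algorithm itself is not available in standard numeric libraries, but the setting can be illustrated with NumPy/SciPy: reduce a dense matrix to upper Hessenberg form, then run any eigenvalue iteration (QR here, via LAPACK) on the reduced matrix; the similarity transform preserves the spectrum. The matrix below is random, standing in for a power-system state matrix:

```python
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(42)
A = rng.standard_normal((8, 8))  # stand-in for a power-system state matrix

# Reduce to upper Hessenberg form: A = Q H Q^T with Q orthogonal.
H, Q = hessenberg(A, calc_q=True)
assert np.allclose(np.tril(H, -2), 0.0)   # zero below the first subdiagonal
assert np.allclose(Q @ H @ Q.T, A)        # similarity transform reconstructs A

# Eigenvalues of H are eigenvalues of A; check via the trace (eigenvalue sum).
eig_H = np.linalg.eigvals(H)
print(np.isclose(eig_H.sum().real, np.trace(A)))
```

Both BR and QR start from this Hessenberg form; the paper's point is that BR exploits the nearly tridiagonal, narrowly banded structure to save iterations and storage.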
Abstract:
Heat transfer and entropy generation for thermally developing forced convection in a porous-saturated duct of rectangular cross-section, with walls maintained at a constant and uniform heat flux, are investigated based on the Brinkman flow model. The classical Galerkin method is used to obtain the fully developed velocity distribution. To solve the thermal energy equation, with the effects of viscous dissipation included, the Extended Weighted Residuals Method (EWRM) is applied. The local (three-dimensional) temperature field is obtained using the Green’s function solution based on the EWRM, where symbolic algebra is used for convenience of presentation. Following the computation of the temperature field, expressions are presented for the local Nusselt number and the bulk temperature as functions of the dimensionless longitudinal coordinate, the aspect ratio, the Darcy number, the viscosity ratio, and the Brinkman number. With the velocity and temperature fields determined, the Second Law (of Thermodynamics) aspect of the problem is also investigated. Approximate closed-form solutions are presented for two limiting cases of MDa values. It is observed that decreasing the aspect ratio and MDa values increases the entropy generation rate.
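The abstract does not reproduce the entropy generation expression. For orientation only, the volumetric entropy generation rate commonly written for Brinkman-model porous convection decomposes into heat transfer and fluid friction irreversibilities; treat the following as a generic sketch of that standard form, not the paper's exact expression:

```latex
S_{\mathrm{gen}}''' \;=\;
\underbrace{\frac{k}{T_0^{2}}\,\lvert \nabla T \rvert^{2}}_{\text{heat transfer}}
\;+\;
\underbrace{\frac{\mu}{K\,T_0}\,\lvert \mathbf{v} \rvert^{2}}_{\text{Darcy dissipation}}
\;+\;
\underbrace{\frac{\mu_{\mathrm{eff}}}{T_0}\,\Phi}_{\text{Brinkman viscous dissipation}}
```

Here k is the effective thermal conductivity, K the permeability, T_0 a reference temperature, and Φ the viscous dissipation function; the reported trend (more entropy generation at smaller aspect ratio and smaller MDa) reflects the growth of the friction terms as flow resistance increases.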
Abstract:
The selection of resource systems plays an important role in the integration of Distributed/Agile/Virtual Enterprises (D/A/V Es). However, as this paper points out, resource-system selection remains a difficult problem in a D/A/VE. Globally, the selection problem has been formulated from different perspectives, giving rise to different kinds of models and algorithms to solve it. To support the development of an intelligent and flexible web prototype tool (broker tool) that integrates all the selection-model activities and tools and can adapt to each D/A/VE project or instance (the major goal of our final project), this paper presents a formulation of one kind of resource-selection problem and the limitations of the algorithms proposed to solve it. We formulate a particular case of the problem as an integer program, which is solved using simplex and branch-and-bound algorithms, and identify their performance limitations (in terms of processing time) based on simulation results. These limitations depend on the number of processing tasks and on the number of pre-selected resources per processing task, defining the domain of applicability of the algorithms for the problem studied. The limitations detected point to the need for other kinds of algorithms (approximate-solution algorithms) outside the domain of applicability found for the simulated algorithms. For a broker tool, however, knowledge of algorithm limitations is very important, so that the most suitable algorithm guaranteeing good performance can be developed and selected based on the features of the problem.
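The growth in processing time the paper measures can be seen directly: choosing one of R pre-selected resources for each of T tasks gives R^T candidate assignments, which is why exact enumeration (and eventually branch and bound) degrades as T and R grow. A toy sketch with an invented cost matrix and no coupling constraints follows; the real model is an integer program precisely because such constraints couple the tasks:

```python
from itertools import product

# Hypothetical cost matrix: cost[t][r] = cost of assigning resource r to task t.
cost = [
    [4.0, 2.5, 3.1],
    [1.9, 3.3, 2.2],
    [5.0, 4.1, 4.4],
]

def brute_force_select(cost):
    """Pick one resource per task minimising total cost (examines R**T candidates)."""
    candidates = product(*[range(len(row)) for row in cost])
    best = min(candidates,
               key=lambda choice: sum(row[r] for row, r in zip(cost, choice)))
    return best, sum(row[r] for row, r in zip(cost, best))

choice, total = brute_force_select(cost)
print(choice, total)
```

Here 3 tasks with 3 pre-selected resources each give only 27 candidates; at 20 tasks with 10 resources each there are 10^20, which is exactly the kind of domain-of-applicability boundary the simulations identify.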
Abstract:
A Work Project, presented as part of the requirements for the Award of a Masters Degree in Management from the NOVA – School of Business and Economics
Abstract:
The monitoring data collected during tunnel excavation can be used in inverse analysis procedures to identify more realistic geomechanical parameters and thus increase knowledge about the formations of interest. These more realistic parameters can be used in real time to adapt the project to the actual in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not specifically for the purpose of inverse analysis. In fact, there is a lack of knowledge about what types and quantities of measurements are needed to successfully identify the parameters of interest. The optimisation algorithm chosen for the identification procedure may also matter. In this work, this problem is addressed using a theoretical case, with which a thorough parametric study was carried out using two optimisation algorithms based on different calculation paradigms: a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters to identify and for several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and of the chosen algorithm for the success rate of the inverse analysis process.
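As a minimal illustration of the evolution-strategy side of such an identification procedure, the sketch below recovers a single stiffness-like parameter of a toy forward model from synthetic "measurements" using a (1+1)-ES. The forward model and all values are invented and far simpler than a tunnel simulation:

```python
import random

random.seed(1)

# Toy forward model: displacement of a monitored point as a function of one
# stiffness-like parameter E (hypothetical stand-in for the tunnel model).
def forward(E, loads):
    return [p / E for p in loads]

loads = [10.0, 25.0, 40.0]
E_true = 50.0
measured = forward(E_true, loads)   # synthetic monitoring data

def misfit(E):
    """Sum of squared differences between measured and simulated displacements."""
    return sum((m - s) ** 2 for m, s in zip(measured, forward(E, loads)))

# (1+1)-evolution strategy: mutate the current estimate, keep the child
# only if it reduces the misfit.
E, step = 10.0, 5.0
for _ in range(2000):
    child = E + random.gauss(0.0, step)
    if child > 1e-6 and misfit(child) < misfit(E):
        E = child
print(round(E, 2))
```

A gradient-based algorithm would exploit the smoothness of this misfit directly; the evolution strategy needs only function evaluations, which is why the two paradigms can behave very differently depending on the available monitoring data.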
Abstract:
Identification and characterization of the problem: The problem guiding this project seeks to answer questions such as: In what way do the types of activities that are designed become devices enabling students' understanding of the topics of each course? From this question arise the following: When solving the activities, which cognitive strategies do students bring into play, and which of them favour knowledge-construction processes? Hypotheses: - Courses whose activities are designed under the Problem-Based Learning and Case Study methodology foster meaningful learning by students. - Activities designed under the Problem-Based Learning and Case Study methodology require more complex cognitive processes than those implemented in traditional activities. Objective: - To identify the impact of traditional learning activities, and of those designed under the Problem-Based Learning and Case Study methodology, on student learning. Materials and methods: a) Analysis of the learning activities of the first and second year of the Law degree, in distance-learning mode. b) Interviews with both content-author lecturers and tutors. c) Surveys of and interviews with students. Expected results: We aim to confirm that learning activities designed under the Problem-Based Learning and Case Study methodology promote meaningful learning in students.
Importance and relevance of the project: The relevance of this project can be identified through two broad, interrelated variables: one related to the didactic device (strategies implemented by the students) and one related to the institution (the innovative character of the teaching proposal and the possibility of extending it to other chairs). This project aims to implement improvements in the design of learning activities in order to promote in students the generation of responsible ideas and solutions and the development of their analytical and reflective capacity.
Abstract:
The aim of this study was to propose a methodology allowing a detailed characterization of sit-to-stand/stand-to-sit postural transitions. Parameters characterizing the kinematics of the trunk movement during the sit-to-stand (Si-St) postural transition were calculated using a single inertial sensor system fixed on the trunk and a data logger. The dynamic complexity of these postural transitions was estimated by the fractal dimension of the acceleration-angular velocity plot. We conclude that this method provides a simple and accurate tool for monitoring frail elderly people and for objectively evaluating the efficacy of a rehabilitation program.
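A common way to estimate the fractal dimension of a planar curve such as the acceleration-angular velocity plot is box counting: cover the plot with boxes of side ε, count the occupied boxes N(ε), and take the slope of log N(ε) versus log(1/ε). The abstract does not specify its estimator, so treat this as one standard choice, sketched self-contained:

```python
import math

def box_counting_dimension(points, scales=(1 / 4, 1 / 8, 1 / 16, 1 / 32)):
    """Estimate the fractal dimension of a 2-D point set in [0, 1]^2 by box counting."""
    logs_n, logs_inv_eps = [], []
    for eps in scales:
        # Set of occupied grid cells at this box size.
        boxes = {(int(x / eps), int(y / eps)) for x, y in points}
        logs_n.append(math.log(len(boxes)))
        logs_inv_eps.append(math.log(1.0 / eps))
    # Least-squares slope of log N(eps) against log(1/eps).
    n = len(scales)
    mx = sum(logs_inv_eps) / n
    my = sum(logs_n) / n
    num = sum((x - mx) * (y - my) for x, y in zip(logs_inv_eps, logs_n))
    den = sum((x - mx) ** 2 for x in logs_inv_eps)
    return num / den

# Sanity check: a densely sampled straight line has dimension ~ 1.
line = [(t / 5000.0, t / 5000.0) for t in range(5000)]
print(round(box_counting_dimension(line), 2))  # prints 1.0
```

Applied to a normalized acceleration-angular velocity trace, a higher estimated dimension indicates a more space-filling, dynamically complex transition, which is the quantity the study uses to characterize Si-St movements.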