988 results for continuous nonlinear programming
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging many simple integrated cores to perform parallel computations. The main novelty of the MIC architecture, relative to GPUs, is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a gentler learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition, and inter-execution-flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for programming the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used not only to compare its performance to that of the existing framework when executing on the co-processor, but also to assess performance on the Xeon Phi versus a multi-GPU environment.
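The idea of a skeleton hiding decomposition and resource management, with Saxpy as the user-supplied computation, can be sketched in a few lines. This is only an illustrative sketch in Python, not Marrow's C++/OpenCL API; the names `map_skeleton` and `saxpy_chunk` are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def map_skeleton(fn, data, workers=4, chunk=1024):
    """A minimal map skeleton: the caller supplies only the per-chunk
    computation; decomposition and worker management stay hidden."""
    chunks = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(fn, chunks)
    return [y for part in results for y in part]

# Saxpy (y = a*x + y) expressed as the user-supplied kernel.
a = 2.0
def saxpy_chunk(pairs):
    return [a * x + y for x, y in pairs]

data = list(zip(range(8), range(8)))
print(map_skeleton(saxpy_chunk, data, chunk=3))
```

The point of the abstraction is that swapping the execution backend (threads, OpenCL devices, the Xeon Phi) would not change the user-facing call.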
Abstract:
This project, called the Improvement Logistics Project, aims to study an opportunity to expand the output of the Unilever warehouse at Sta. Iria by 80% via an increase in exports over the next two years. This has been done using the Distribuidora Luís Simões tariff rates as the basis of comparison for the as-is and to-be situations. To this end, an allocation of all the costs of the warehouse is prepared and described with the goal of comparing the differences with and without the expansion. The results show that a better outcome is achieved with the investment, but the warehouse is yet to prove its efficiency against the distribution company.
Abstract:
This Work Project studies the Continuous Improvement and Processes (CIP) department at TAP Maintenance & Engineering. The project's objective is to provide insights to align the activities of the department with the strategy of the organization. To this end, two focuses were adopted: (i) an internal analysis, which highlighted a need for transversal change to ensure the adoption of Continuous Improvement at TAP, and (ii) a process which outlined objectives and projects to be pursued in order to prioritize CIP's activities in accordance with the organization's goals. The outcome includes (a) important recommendations concerning strategic planning and competition evaluation and (b) a process output that reflects a balance among the factors influencing the priority of projects.
Abstract:
Machine ethics is an interdisciplinary field of inquiry that emerges from the need to imbue autonomous agents with the capacity for moral decision-making. While some approaches provide implementations in Logic Programming (LP) systems, they have not exploited LP-based reasoning features that appear essential for moral reasoning. This PhD thesis aims to further investigate the appropriateness of LP to machine ethics, notably through a combination of LP-based reasoning features, including techniques available in LP systems. Moral facets, as studied in moral philosophy and psychology, that are amenable to computational modeling are identified and mapped to appropriate LP concepts for representing and reasoning about them. The main contributions of the thesis are twofold. First, novel approaches are proposed for employing tabling in contextual abduction and updating, individually and combined, plus an LP approach to counterfactual reasoning; the latter is implemented on top of the aforementioned combined abduction and updating technique with tabling. They are all important for modeling various issues of the aforementioned moral facets. Second, a variety of LP-based reasoning features are applied to model the identified moral facets, through moral examples taken off the shelf from the morality literature.
These applications include: (1) Modeling moral permissibility according to the Doctrines of Double Effect (DDE) and Triple Effect (DTE), demonstrating deontological and utilitarian judgments via integrity constraints (in abduction) and preferences over abductive scenarios; (2) Modeling moral reasoning under uncertainty of actions, via abduction and probabilistic LP; (3) Modeling moral updating (that allows other – possibly overriding – moral rules to be adopted by an agent, on top of those it currently follows) via the integration of tabling in contextual abduction and updating; and (4) Modeling moral permissibility and its justification via counterfactuals, where counterfactuals are used for formulating DDE.
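The interplay in (1) between integrity constraints and preferences over abductive scenarios can be sketched, outside LP, as filter-then-rank. The sketch below is a toy in Python with hypothetical scenario fields and numbers; it is not the thesis's LP encoding:

```python
# Toy sketch of abductive scenario selection (hypothetical data): integrity
# constraints prune impermissible scenarios (deontological cut), then a
# preference relation ranks the survivors (utilitarian choice).
scenarios = [
    {"action": "divert", "harm_as_means": False, "lives_saved": 5},
    {"action": "push",   "harm_as_means": True,  "lives_saved": 5},
    {"action": "noop",   "harm_as_means": False, "lives_saved": 1},
]

def satisfies_integrity(s):
    # DDE-style constraint: harm may not be used as a means to the good end.
    return not s["harm_as_means"]

admissible = [s for s in scenarios if satisfies_integrity(s)]
best = max(admissible, key=lambda s: s["lives_saved"])
print(best["action"])  # prints "divert"
```

In the thesis's setting these two stages correspond to integrity constraints in abduction and preferences among abductive scenarios, respectively.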
Abstract:
The work described in this thesis was performed at the Laboratory for Intense Lasers (L2I) of Instituto Superior Técnico, University of Lisbon (IST-UL). Its main contribution is a feasibility study of the broadband dispersive stages for an optical parametric chirped pulse amplifier based on the nonlinear crystal yttrium calcium oxyborate (YCOB). In particular, the main goal of this work was the characterization and implementation of the several optical devices involved in the expansion and compression of the amplified pulses to durations of the order of a few optical cycles (20 fs). This type of laser system finds application in fields such as medicine, telecommunications, and machining, which require high-energy, ultrashort (sub-100 fs) pulses. The main challenge was the preliminary study of the performance of the broadband amplifier, which is essential for successfully handling pulses with bandwidths exceeding 100 nm when amplified from the µJ level to 20 mJ per pulse. In general, the control, manipulation, and characterization of optical phenomena on the scale of a few tens of fs, at powers that can reach the PW level, are extremely difficult and challenging due to the complexity and nonlinearity of radiation-matter interaction at this time scale and power level. For this purpose, the main dispersive components were characterized in detail, specifically addressing the demonstration of pulse expansion and compression. The tested bandwidths are narrower than the final ones, in order to confirm the parameters of these elements and predict the performance for the broadband pulses. The work performed led to additional tasks, such as a detailed characterization of the laser oscillator seeding the laser chain and the detection and cancellation of additional sources of dispersion.
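The effect of a dispersive stretcher on a short pulse can be illustrated with the standard Gaussian-pulse result: after group-delay dispersion GDD, the FWHM grows as τ = τ₀·√(1 + (4 ln 2 · GDD/τ₀²)²). The numbers below are illustrative, not taken from the thesis:

```python
import math

def stretched_duration(tau0_fs, gdd_fs2):
    """FWHM of a transform-limited Gaussian pulse after passing through
    group-delay dispersion GDD (standard Gaussian-pulse formula)."""
    a = 4 * math.log(2) * gdd_fs2 / tau0_fs ** 2
    return tau0_fs * math.sqrt(1 + a * a)

# An illustrative 20 fs pulse through 1e4 fs^2 of dispersion is stretched
# by roughly two orders of magnitude.
print(stretched_duration(20.0, 1e4))
```

The same formula, run in reverse, shows why compression back to 20 fs requires the compressor to cancel the accumulated dispersion very precisely.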
Abstract:
The theme of this dissertation is the finite element method applied to mechanical structures. A new finite element program is developed that, besides executing different types of structural analysis, also allows the calculation of derivatives of structural performances using the continuum method of design sensitivity analysis, with the purpose of allowing, in combination with the mathematical programming algorithms found in the commercial software MATLAB, the solution of structural optimization problems. The program is called EFFECT (Efficient Finite Element Code). The object-oriented programming paradigm, and specifically the C++ programming language, is used for program development. The main objective of this dissertation is to design EFFECT so that it can constitute, at this stage of development, the foundation for a program with analysis capabilities similar to other open-source finite element programs. In this first stage, six elements are implemented for linear analysis: 2-dimensional truss (Truss2D), 3-dimensional truss (Truss3D), 2-dimensional beam (Beam2D), 3-dimensional beam (Beam3D), triangular shell (Shell3Node), and quadrilateral shell (Shell4Node). The shell elements combine two distinct elements, one simulating the membrane behavior and the other the plate bending behavior. A nonlinear analysis capability is also developed, combining the corotational formulation with the Newton-Raphson iterative method, but at this stage it is only available for problems modeled with Beam2D elements subject to large displacements and rotations, called geometrically nonlinear problems. The design sensitivity analysis capability is implemented in two elements, Truss2D and Beam2D, including the procedures and the analytic expressions for calculating derivatives of displacement, stress, and volume performances with respect to five different types of design variables.
Finally, a set of test examples was created to validate the accuracy and consistency of the results obtained from EFFECT, by comparing them with results published in the literature or obtained with the ANSYS commercial finite element code.
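For readers unfamiliar with the element types above, the simplest one, the 2D truss, reduces to a textbook 4×4 stiffness matrix. The sketch below is a generic Python illustration of that matrix, not EFFECT's C++ implementation:

```python
import math

def truss2d_stiffness(E, A, x1, y1, x2, y2):
    """Global 4x4 stiffness matrix of a 2-node bar (Truss2D-type) element:
    k = EA/L scaled by direction cosines of the element axis."""
    L = math.hypot(x2 - x1, y2 - y1)
    c, s = (x2 - x1) / L, (y2 - y1) / L
    k = E * A / L
    cc, cs, ss = c * c, c * s, s * s
    return [[ k*cc,  k*cs, -k*cc, -k*cs],
            [ k*cs,  k*ss, -k*cs, -k*ss],
            [-k*cc, -k*cs,  k*cc,  k*cs],
            [-k*cs, -k*ss,  k*cs,  k*ss]]

# Horizontal steel bar, illustrative values: E = 210 GPa, A = 1 cm^2, L = 2 m.
K = truss2d_stiffness(E=210e9, A=1e-4, x1=0, y1=0, x2=2, y2=0)
print(K[0][0])  # prints 10500000.0, i.e. EA/L
```

Design sensitivities such as d(displacement)/dA follow by differentiating this matrix with respect to the design variable, which is what the continuum sensitivity expressions in Truss2D and Beam2D provide analytically.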
Abstract:
INTRODUCTION: With the ease provided by current computational programs, medical and scientific journals use bar graphs to describe continuous data. METHODS: This manuscript discusses the inadequacy of bar graphs for presenting continuous data. RESULTS: Simulated data show that box plots and dot plots are more feasible tools for describing continuous data. CONCLUSIONS: These plots are preferred for representing continuous variables since they effectively describe the range, shape, and variability of observations and clearly identify outliers. By contrast, bar graphs address only measures of central tendency. Bar graphs should be used only to describe qualitative data.
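The information a box plot conveys, and a bar graph hides, can be computed directly: quartiles, the interquartile range, and the 1.5·IQR fences that flag outliers. A minimal sketch with made-up sample data:

```python
import statistics

def box_plot_summary(data):
    """Compute what a box plot encodes: quartiles, IQR fences, outliers."""
    q1, median, q3 = statistics.quantiles(data, n=4)  # exclusive method
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = [x for x in data if x < lo or x > hi]
    return {"median": median, "q1": q1, "q3": q3, "outliers": outliers}

sample = [3, 4, 4, 5, 5, 5, 6, 6, 7, 25]  # 25 is an extreme observation
print(box_plot_summary(sample))
```

A bar of the mean of `sample` would be pulled up by the single value 25, while the summary above keeps the center, spread, and the outlier visible separately.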
Abstract:
PURPOSE: Enteral alimentation is the preferred modality of support in critical patients who have acceptable digestive function and are unable to eat orally, but the advantages of continuous versus intermittent administration are surrounded by controversy. With the purpose of identifying the benefits and complications of each technique, a prospective controlled study with matched subjects was conducted. PATIENTS AND METHODS: Twenty-eight consecutive candidates for enteral feeding were divided into 2 groups (n = 14 each) that were matched for diagnosis and APACHE II score. A commercial immune-stimulating polymeric diet was administered via nasogastric tube by electronic pump in the proportion of 25 kcal/kg/day, either as a 1-hour bolus every 3 hours (Group I), or continuously for 24 hours (Group II), over a 3-day period. Anthropometrics, biochemical measurements, recording of administered drugs and other therapies, thorax X-ray, measurement of abdominal circumference, monitoring of gastric residue, and clinical and nutritional assessments were performed at least once daily. The principal measured outcomes of this protocol were frequency of abdominal distention and pulmonary aspiration, and efficacy in supplying the desired amount of nutrients. RESULTS: Nearly half of the total population (46.4%) exhibited high gastric residues on at least 1 occasion, but only 1 confirmed episode of pulmonary aspiration occurred (3.6%). Both groups displayed a moderate number of complications, without differences. Food input during the first day was greater in Group II (approximately 20% difference), but by the third day, both groups displayed similarly small deficits in total furnished volume of about 10%, when compared with the prescribed diet. CONCLUSIONS: Both administration modalities permitted practical and effective administration of the diet with frequent registered abnormalities but few clinically significant problems. 
The two groups were similar in this regard, without statistical differences, probably because of meticulous technique, careful monitoring, strict patient matching, and conservative amounts of diet employed in both situations. Further studies with additional populations, diagnostic groups, and dietetic prescriptions should be performed in order to elucidate the differences between these commonly used feeding modalities.
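The two regimens compared above deliver the same daily total at different instantaneous rates. A worked sketch of that arithmetic (the 70 kg body weight and 1 kcal/ml density are hypothetical; only the 25 kcal/kg/day target and the 1-hour-bolus-every-3-hours vs. 24-hour schedules come from the study):

```python
def feeding_rates(weight_kg, kcal_per_kg=25.0, kcal_per_ml=1.0):
    """Split a daily enteral target into the two administration schedules:
    continuous 24-h infusion vs. a 1-h bolus every 3 h (8 boluses/day)."""
    daily_ml = weight_kg * kcal_per_kg / kcal_per_ml
    continuous_ml_h = daily_ml / 24   # Group II: constant pump rate
    bolus_ml_h = daily_ml / 8         # Group I: each bolus runs over 1 h
    return daily_ml, continuous_ml_h, bolus_ml_h

daily, cont, bolus = feeding_rates(70)  # hypothetical 70 kg patient
print(daily, cont, bolus)
```

The bolus pump rate is exactly three times the continuous rate, which is one plausible mechanism for the transiently higher gastric residues that intermittent schedules are suspected of causing.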
Abstract:
Despite the extensive literature on finding new models to replace the Markowitz model, or on trying to increase the accuracy of its input estimations, there are fewer studies on the impact that different optimization algorithms have on the results. This paper aims to add to this field by comparing the performance of two optimization algorithms in drawing the Markowitz efficient frontier and in real-world investment strategies. Second-order cone programming is a faster algorithm and appears to be more efficient, but it is impossible to assert which algorithm is better: quadratic programming often shows superior performance in real investment strategies.
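In the two-asset case the Markowitz frontier needs no solver at all, which makes it a useful sanity check for either algorithm: the target return fixes the weight, and the portfolio variance follows in closed form. The returns, volatilities, and correlation below are made-up illustrative inputs:

```python
def frontier_point(mu_target, mu, sigma, rho):
    """Two-asset Markowitz frontier in closed form: pick the weight that
    hits the target return, then evaluate the portfolio variance."""
    mu1, mu2 = mu
    s1, s2 = sigma
    w = (mu_target - mu2) / (mu1 - mu2)   # weight of asset 1
    var = (w * s1) ** 2 + ((1 - w) * s2) ** 2 + 2 * w * (1 - w) * rho * s1 * s2
    return w, var

# Illustrative inputs: 12%/6% expected returns, 20%/10% vols, rho = 0.2.
w, var = frontier_point(0.10, mu=(0.12, 0.06), sigma=(0.20, 0.10), rho=0.2)
print(w, var)
```

With more than two assets the weight is no longer pinned down by the return constraint, and the frontier becomes a genuine optimization problem, which is exactly where the choice between quadratic programming and second-order cone programming matters.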
Abstract:
Population densities of six primate species (Saguinus midas, Pithecia pithecia, Cebus apella, Chiropotes satanas, Alouatta seniculus and Ateles paniscus) were estimated in continuous forest and in isolated reserves (one of 100 ha and four of 10 ha). Saguinus densities in the continuous forest were found to be low, probably due to the lack of the edge habitat and second growth favoured by them; Pithecia, Cebus and Ateles populations are also low, possibly because food sources are more widely distributed and/or less abundant than in other Amazonian regions, although past hunting, particularly of Ateles, may also be a contributing factor; Chiropotes and Alouatta densities were found to be similar to those observed in other areas of Amazonian forest. Ateles and Chiropotes, which occupy ranges on the order of three km², were excluded from the 100-ha reserve at the time of its isolation. Unfortunately, populations were not known prior to the isolation of this reserve, but upon isolation there remained four groups of Saguinus, two Pithecia groups, one Cebus group and five Alouatta groups. One Saguinus group disappeared two months later, and one year post-isolation the Cebus group also left the reserve. Single Alouatta groups survive in the isolated 10-ha reserves. Saguinus, present in the four 10-ha reserves following isolation, have disappeared from two of them. One 10-ha reserve retains a group of Pithecia.
Abstract:
The Portuguese Navy is responsible for managing Portugal's Exclusive Economic Zone, ensuring its security against criminal activities. To assist in this task, the Oversee system is used to monitor the position of every vessel present in the area, allowing rapid intervention by the Portuguese Navy when and where needed. However, the system requires constant periodic transmissions from the vessels to operate correctly: should the transmissions be interrupted, deliberately or accidentally, the system can no longer locate vessels, hindering the Navy's intervention. To address this gap, we propose adding to the Oversee system the capability of predicting a vessel's future positions based on its trajectory up to the moment transmissions ceased. Given the large volumes of data generated by the system (position histories), the field of Artificial Intelligence offers a possible solution to this problem. Considering the fast-response requirements of the problem at hand, the Geometric Semantic Genetic Programming algorithm based on the work of Vanneschi et al. presents itself as a possible solution, having already produced good results on similar problems. This thesis aims to integrate the developed Geometric Semantic Genetic Programming algorithm with the Oversee system, in order to grant it predictive capabilities. Additionally, a performance analysis will be carried out to determine the ideal parameterization of the algorithm. This thesis intends to provide the Portuguese Navy with a tool capable of supporting the control of the Portuguese Exclusive Economic Zone, enabling correct intervention by the Navy in cases where the current system could not determine the correct position of the vessel in question.
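The defining operator of Geometric Semantic Genetic Programming can be sketched compactly: crossover builds an offspring whose semantics (output vector) is a convex combination of the parents', so it is guaranteed to lie between them in semantic space. The sketch below is a simplified illustration (a random constant instead of a random function, and made-up toy predictors), not the thesis's implementation:

```python
import random

def gs_crossover(p1, p2, seed=None):
    """Geometric semantic crossover (simplified): offspring output is a
    convex combination of the parents' outputs, point by point."""
    r = random.Random(seed).random()  # mixing constant in [0, 1]
    return lambda x: r * p1(x) + (1 - r) * p2(x)

# Two hypothetical toy predictors of a vessel's latitude at time t.
p1 = lambda t: 38.70 + 0.010 * t
p2 = lambda t: 38.70 + 0.014 * t
child = gs_crossover(p1, p2, seed=42)
lo, hi = sorted((p1(10), p2(10)))
print(lo <= child(10) <= hi)  # prints True: bounded by the parents
```

This geometric property gives the search a well-behaved (unimodal) fitness landscape on supervised problems, which is why the algorithm suits regression-style tasks such as position prediction.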
Abstract:
During the last decade, Mongolia's region was characterized by a rapid increase in both the severity and frequency of drought events, leading to pasture reduction. Drought monitoring and assessment play an important role in the region's early warning systems as a way to mitigate the negative impacts on the social, economic and environmental sectors. Nowadays it is possible to access information related to the hydrologic cycle through remote sensing, which provides continuous monitoring of variables over very large areas where weather stations are sparse. The present thesis aimed to explore the possibility of using NDVI as a potential drought indicator by studying anomaly patterns and correlations with two other climate variables, LST and precipitation. The study covered the growing season (March to September) of a fifteen-year period, between 2000 and 2014, for Bayankhongor province in southwest Mongolia. The datasets used were MODIS NDVI, LST and TRMM precipitation, whose processing and analysis were supported by the QGIS software and the Python programming language. Monthly anomaly correlations between NDVI-LST and NDVI-precipitation were generated, as well as temporal correlations over the growing season for known drought years (2001, 2002 and 2009). The results show that the three variables follow the seasonal pattern expected for a northern hemisphere region, with the rainy season occurring in the summer months. The values of both NDVI and precipitation are remarkably low while LST values are high, which is explained by the region's climate and ecosystems. The NDVI average generally reached higher values with high precipitation values and low LST values. The year 2001 was the driest of the time series, while 2003 was the wettest, with healthier vegetation. Monthly correlations registered weak results with low significance, with the exception of the NDVI-LST and NDVI-precipitation correlations for June, July and August of 2002.
The temporal correlations for the growing season also revealed weak results. The overall relationship between the variables' anomalies showed weak correlation results with low significance, which suggests that an accurate answer for predicting drought using the relation between NDVI, LST and precipitation cannot be given. Additional research should take place in order to achieve more conclusive results. However, the NDVI anomaly images show that NDVI is a suitable drought index for Bayankhongor province.
Abstract:
Injectable biomaterials with in situ cross-linking reactions have been suggested to minimize the invasiveness associated with most implantation procedures. However, problems related to the rapid liquid-to-gel transition can arise, because it is difficult to predict the reliability of the reaction and its end products, as well as to mitigate cytotoxicity to the surrounding tissues. An alternative minimally invasive approach to deliver solid implants in vivo is based on injectable microparticles, which can be processed in vitro with high fidelity and reliability while showing low cytotoxicity. Their delivery to the defect can be performed by injection through a small-diameter syringe needle. We present a new methodology for the continuous, solvent- and oil-free production of photopolymerizable microparticles containing encapsulated human dermal fibroblasts. A precursor solution of cells in photo-reactive PEG-fibrinogen (PF) polymer was transported through a transparent injector exposed to light irradiation before being atomized in a jet-in-air nozzle. Shear rheometry data provided the cross-linking kinetics of each PF/cell solution, which was then used to determine the amount of irradiation required to partially polymerize the mixture prior to atomization. The partially polymerized drops fell into a gelation bath for further polymerization. The system was capable of producing cell-laden microparticles with high cellular viability, with average diameters between 88.1 µm and 347.1 µm and a dispersity between 1.1 and 2.4, depending on the parameters chosen.
Abstract:
Report of a master's teaching internship in Mathematics for the 3rd Cycle of Basic Education and for Secondary Education