Abstract:
Prostate cancer (CaP) is the most commonly diagnosed non-cutaneous malignancy and the second leading cause of cancer mortality among United States males. Major racial disparities in incidence, survival, and treatment persist. Mortality is three times higher among African Americans (AAs) than among Caucasians. Androgen carcinogenesis has been persistently implicated, but results are inconsistent; hormone manipulation has been the mainstay of treatment for metastatic disease, supporting the androgen-carcinogenesis hypothesis. The survival disadvantage of AAs has been attributed to differences in socioeconomic status (SES), tumor stage, and treatment. We hypothesized that hormonal therapy (HT) prolongs survival in CaP and that racial disparities in survival are influenced by variation in HT and primary therapies as well as SES. To address this overall hypothesis, we first used a random-effects meta-analytic design to examine evidence from randomized trials on the efficacy of androgen deprivation therapy in localized and metastatic disease, and then assessed, using Cox proportional hazards models, the effectiveness of HT in prolonging survival in a large community-based cohort of older males diagnosed with local/regional CaP. We further examined the role of HT and primary therapies in the racial disparities in CaP survival. The results indicated that adjuvant HT is efficacious in improving overall survival compared with standard care alone, whereas in real-world experience HT showed no significant benefit in increasing the overall survival of older males treated in the community for local/regional disease. Further, racial differences in survival persist; they were explained to some extent by differences in the primary therapies (radical prostatectomy, radiation, and watchful waiting) and largely by SES.
Therefore, given the increased use of hormonal therapy and cost-effectiveness considerations today, more RCTs are needed to assess whether survival prolongation translates into improved quality of life, and to answer the research question of whether the decreased use of radical prostatectomy among AAs is driven by clinicians' bias or by AAs' preference for conservative therapy, and to encourage AAs to seek curative therapies, thus narrowing to some degree the persistent mortality disparities between AAs and Caucasians.
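The random-effects meta-analytic design mentioned above can be sketched with the standard DerSimonian-Laird estimator for pooling study-level log hazard ratios. This is a generic illustration with invented numbers, not the study's actual trial data.

```python
import numpy as np

def random_effects_pool(log_hrs, variances):
    """Pool study-level log hazard ratios with a DerSimonian-Laird
    random-effects model. Returns (pooled_log_hr, standard_error, tau2)."""
    y = np.asarray(log_hrs, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                  # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)          # fixed-effect pooled estimate
    q = np.sum(w * (y - y_fixed) ** 2)           # Cochran's Q heterogeneity statistic
    k = len(y)
    # DerSimonian-Laird between-study variance, truncated at zero
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = 1.0 / np.sqrt(np.sum(w_star))
    return pooled, se, tau2
```

The pooled hazard ratio is then `exp(pooled)`, with a 95% CI from `pooled ± 1.96 * se`.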
Abstract:
The study aim was to determine whether using automated side loader (ASL) trucks in higher proportions than other types of trucks for residential waste collection results in lower injury rates (from all causes). The primary hypothesis was that the risk of injury was lower for workers who work with ASL trucks than for workers who work with other types of trucks used in residential waste collection. To test this hypothesis, data were collected from one of the nation's largest companies in the solid waste management industry. Different local operating units (i.e., facilities) in the company used different types of trucks to varying degrees, which created a special opportunity to examine refuse collection injuries and illnesses and the risk-reduction potential of ASL trucks.

The study design was ecological and analyzed end-of-year data provided by the company for calendar year 2007. During 2007, a total of 345 facilities provided residential services; each facility represented one observation.

The dependent variable, injury and illness rate, was defined as a facility's total case incidence rate (TCIR) recorded in accordance with federal OSHA requirements for the year 2007. The TCIR is the rate of total recordable injury and illness cases per 100 full-time workers. The independent variable, percent of ASL trucks, was calculated by dividing the number of ASL trucks by the total number of residential trucks at each facility.

Multiple linear regression models were estimated for the impact of the percent of ASL trucks on TCIR per facility. Adjusted analyses included three covariates: median number of hours worked per week for residential workers; median number of months of work experience for residential workers; and median age of residential workers. All analyses were performed with the statistical software Stata IC (version 11.0).

The analyses included three approaches to classifying exposure (percent of ASL trucks). The first approach included two levels of exposure: (1) 0% and (2) >0 to <100%. The second approach included three levels: (1) 0%, (2) ≥1 to <100%, and (3) 100%. The third approach included six levels to improve detection of a dose-response relationship: (1) 0%, (2) 1 to <25%, (3) 25 to <50%, (4) 50 to <75%, (5) 75 to <100%, and (6) 100%. None of the relationships between injury and illness rate and the ASL exposure levels was statistically significant (p < 0.05), even after adjustment for all three covariates.

In summary, the present study shows some risk-reduction impact of ASL trucks, but it is not statistically significant. The covariates demonstrated a varied yet more modest impact on the injury and illness rate, and again none of these relationships was statistically significant (p < 0.05). As an ecological study, the present study has the limitations inherent in such designs and warrants replication in an individual-level cohort design; stronger conclusions are not warranted.
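The study ran its adjusted models in Stata; the same facility-level regression can be sketched in plain numpy. The data below are synthetic stand-ins (the variable names mirror the abstract, but none of the numbers come from the study).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 345                                      # one observation per facility

# Synthetic facility-level data (illustration only, not the study's data)
pct_asl = rng.uniform(0, 100, n)
pct_asl[rng.random(n) < 0.3] = 0.0           # some facilities run no ASL trucks
hours = rng.normal(45, 5, n)                 # median weekly hours worked
experience = rng.normal(60, 20, n)           # median months of experience
age = rng.normal(40, 8, n)                   # median worker age
tcir = rng.gamma(2.0, 2.0, n)                # total case incidence rate

# First exposure classification: 0% (reference) vs >0 to <100%
exposed = ((pct_asl > 0) & (pct_asl < 100)).astype(float)

# Adjusted model: TCIR ~ exposure + hours + experience + age
X = np.column_stack([np.ones(n), exposed, hours, experience, age])
beta, *_ = np.linalg.lstsq(X, tcir, rcond=None)
print("adjusted exposure coefficient:", beta[1])
```

The finer exposure classifications in the abstract would simply swap the single dummy for a set of dummy columns, one per non-reference level.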
Abstract:
The development of targeted therapy involves many challenges. Our study addresses some of the key issues involved in biomarker identification and clinical trial design. We propose two biomarker selection methods and then apply them in two different clinical trial designs for targeted therapy development. In particular, in Chapter 2 we propose a Bayesian two-step lasso procedure for biomarker selection in the proportional hazards model. In the first step of this strategy, we use the Bayesian group lasso to identify the important marker groups, wherein each group contains the main effect of a single marker and its interactions with treatments. In the second step, we zoom in to select each individual marker and the marker-by-treatment interactions, using the Bayesian adaptive lasso to identify prognostic or predictive markers. In Chapter 3, we propose a Bayesian two-stage adaptive design for targeted therapy development that implements the variable selection method of Chapter 2. In Chapter 4, we propose an alternative frequentist adaptive randomization strategy for situations where a large number of biomarkers must be incorporated in the study design. We also propose a new adaptive randomization rule that takes into account the variation associated with the point estimates of survival times. In all of our designs, we seek to identify the key markers that are either prognostic or predictive with respect to treatment. We use extensive simulations to evaluate the operating characteristics of our methods.
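The thesis works in a Bayesian proportional-hazards setting; as a rough frequentist sketch of the same two-step idea (group-level screening, then adaptive reweighting of the survivors), the following uses a plain ISTA lasso solver on a linear stand-in model. All markers, effect sizes, and thresholds are invented for illustration.

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=3000):
    """Plain ISTA solver for min ||y - Xb||^2 / (2n) + lam * ||b||_1."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n
        z = b - grad / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return b

rng = np.random.default_rng(1)
n, m = 200, 10                                  # subjects, candidate markers
treat = rng.integers(0, 2, n)                   # randomized treatment arm
markers = rng.normal(size=(n, m))
inter = markers * treat[:, None]                # marker-by-treatment interactions
X = np.hstack([markers, inter])
# Truth (synthetic): marker 0 is prognostic, marker 2 is predictive
y = 1.5 * markers[:, 0] + 2.0 * inter[:, 2] + rng.normal(size=n)

# Step 1: group screening -- each marker's "group" is its main-effect
# column j plus its interaction column j + m.
b1 = lasso_ista(X, y, lam=0.1)
kept = [j for j in range(m) if np.hypot(b1[j], b1[j + m]) > 1e-3]

# Step 2: adaptive lasso restricted to the surviving groups, with weights
# from step 1 (the column-rescaling trick turns it into a plain lasso).
cols = [c for j in kept for c in (j, j + m)]
w = 1.0 / (np.abs(b1[cols]) + 1e-3)
b2 = lasso_ista(X[:, cols] / w, y, lam=0.1) / w
print("selected marker groups:", kept)
```

Large entries of `b2` in a main-effect position flag a prognostic marker; large entries in an interaction position flag a predictive one.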
Abstract:
The ROV operations had three objectives: (1) to check whether the "Cherokee" system is suited for advanced benthological work in high-latitude Antarctic shelf areas; (2) to support the disturbance experiment by providing immediate visual information; (3) to continue ecological work started in 1989 at the hilltop situated at the northern margin of the Norsel Bank off the 4-Seasons Inlet (Weddell Sea). The "Cherokee" is equipped with 3 video cameras, 2 of which support the operation. A high-resolution Tritech Typhoon camera is used for scientific observations to be recorded. In addition, the ROV has a manipulator, a still camera, lights and strobe, a compass, 2 lasers, a Posidonia transponder and an obstacle-avoidance sonar. The size of the vehicle is 160 × 90 × 90 cm. In the present configuration without a TMS (tether management system), the deployment has to start with paying out the full cable length, laying it in loops on deck and connecting the glass fibres at the tether's spool winch. After a final technical check, the vehicle is deployed into the water and actively driven perpendicular to the ship's axis, and floats are fixed to the tether. At a cable length of approx. 50 m, the tether is tightened to the depressor by several cable ties and both components are lowered towards the sea floor, the vehicle by the thrusters' propulsion and the depressor by the ship's winch. At 5 m intervals the tether has to be tied to the single-conductor cable. In good weather conditions the instruments supporting the navigation of the ROV, especially the Posidonia system, allow an operation mode that follows the ship's course if the ship's speed is slow. Together with the lasers, which act as a scale in the images, they also allow a reproducible scientific analysis, since the transect can be plotted in a GIS system; consequently, the area observed can be easily calculated.

Operation as a predominantly drifting system, especially in areas with near-bottom currents, is also possible; however, the connection of the tether at the rear of the vehicle is unsuitable for such conditions. The recovery of the system mirrors the deployment. Most important is to reach the sea surface at a safe distance perpendicular to the ship's axis in order not to interfere with the ship's propellers. During this phase the Posidonia transponder system is of high relevance, although it has to be switched off at a water depth of approx. 40 m. The minimum personnel needed is 4 persons to handle the tether on deck, one person to operate the ship's winch, one pilot and one additional technician for the ROV operation itself, one scientist, and one person on the ship's bridge in addition to one on deck for whale watching when the Posidonia system is in use. The time for the deployment of the ROV until it reaches the sea floor depends on the water depth and consequently on the length of the cable to be paid out beforehand and tied to the single-conductor cable. Deployment and recovery at intermediate water depths can each last up to 2 hours. A reasonable time for benthological observations close to the sea floor is 1 to 3 hours but can be extended if scientifically justified.

Preliminary results: after a first test station, the ROV was deployed 3 times for observations related to the disturbance experiment. A first attempt to cross the hilltop at the northern margin of the Norsel Bank close to the 4-Seasons Inlet was successful only for the first hundreds of metres of transect length. The benthic community was dominated in biomass by the demosponge Cinachyra barbata. Due to the strong current of approx. 1 nm/h, the design of the system, and an expected more difficult current regime between grounded icebergs and the top of the hilltop, the operation was stopped before the hilltop was reached. In a second attempt the hilltop was successfully crossed because the current and wind situation was much more suitable. In contrast to earlier expeditions with the "sprint" ROV, it was the first time that both slopes, the smoother one in the northeast and the steeper one in the southwest, were continuously observed during one cast. A coarse classification of the hilltop fauna shows patches dominated by single taxa: cnidarians, hydrozoans, holothurians, sea urchins and stalked sponges. Approximately 20% of the north-eastern slope was devastated by grounding icebergs; here the sediments consisted of large boulders, gravel or blocks of finer sediment resembling an irregularly ploughed field. On the Norsel Bank the Cinachyra concentrations were locally associated with high abundances of sea anemones. Total observation time amounted to 11.5 hours, corresponding to approximately 6-9 km of transect length.
Abstract:
Despite the professed claims of microcredit alleviating poverty, little is known about what kind of credit contract is suitable for extremely poor households, also called the ultra-poor. To fill this knowledge gap, we initiated a field experiment in the river islands of northern Bangladesh, where a substantial portion of dwellers could be categorized as ultra-poor due to cyclic floods. We randomly offered four types of loans to such dwellers: regular small cash loans with one-year maturity, large cash loans with three-year maturity both with and without a one-year grace period, and in-kind livestock loans with three-year maturity and a one-year grace period. We compared uptake rates as well as the determinants of uptake and found that the uptake rate is the lowest for the regular contract, followed by the in-kind contract. Contrary to prior belief, we also found that the microcredit demand by the ultra-poor is not necessarily small, and in particular the ultra-poor are significantly more likely to join a microcredit program than the moderately poor if a grace period with longer maturity is attached to a large amount of credit, irrespective of whether the credit is provided in cash or in kind. This paper provides evidence that a typical microcredit contract with one-year maturity and without a grace period is not attractive to the ultra-poor. Microfinance institutions may need to design better credit contracts to address the poor's needs.
Abstract:
This thesis contributes to the analysis and design of printed reflectarray antennas. The main part of the work focuses on the analysis of dual offset antennas comprising two reflectarray surfaces, one acting as sub-reflector and the other as main reflector. These configurations introduce additional complexity in several respects compared with conventional dual offset reflectors; however, they offer many degrees of freedom that can be used to improve the electrical performance of the antenna. The thesis is organized in four parts: the development of an analysis technique for dual-reflectarray antennas; a preliminary validation of that methodology using equivalent reflector systems as reference antennas; a more rigorous validation of the software tool by manufacturing and testing a dual-reflectarray antenna demonstrator; and the practical design of dual-reflectarray systems for applications that show the potential of these configurations to scan the beam and to generate contoured beams.

In the first part, a general tool has been implemented to analyze high-gain antennas constructed of two flat reflectarray structures. The classic reflectarray analysis, based on the Method of Moments (MoM) under the local periodicity assumption, is used for both the sub- and main reflectarrays, taking into account the incident angle on each reflectarray element. The incident field on the main reflectarray is computed taking into account the field radiated by all the elements on the sub-reflectarray. Two approaches have been developed: one employs a simple approximation to reduce the computer run time, and the other does not, but in many cases offers improved accuracy. The approximation is based on computing the reflected field on each main-reflectarray element only once for all the fields radiated by the sub-reflectarray elements, assuming that the response will be the same because the only difference is a small variation in the angle of incidence. This approximation is very accurate when the main-reflectarray elements show a relatively small sensitivity to the angle of incidence. An extension of the analysis technique has been implemented to study dual-reflectarray antennas comprising a main reflectarray printed on a parabolic, or in general curved, surface. In many applications of dual-reflectarray configurations, the reflectarray elements are in the near field of the feed horn. To consider the near field radiated by the horn, the incident field on each reflectarray element is computed using a spherical mode expansion. In this region the angles of incidence are moderately wide, and they are considered in the analysis of the reflectarray to better calculate the actual incident field on the sub-reflectarray elements. This technique increases the accuracy of the predicted co- and cross-polar patterns and antenna gain with respect to the case of using ideal feed models.

In the second part, as a preliminary validation, the proposed analysis method has been used to design a dual-reflectarray antenna that emulates previous dual-reflector antennas in Ku- and W-bands including a reflectarray as sub-reflector. The results for the dual-reflectarray antenna compare very well with those of the parabolic reflector and reflectarray sub-reflector: radiation patterns, antenna gain and efficiency are practically the same when the main parabolic reflector is substituted by a flat reflectarray. The results show that the gain is only reduced by a few tenths of a dB as a result of the ohmic losses in the reflectarray. The phase adjustment on two surfaces provided by the dual-reflectarray configuration can be used to improve the antenna performance in applications requiring multiple beams, beam scanning or shaped beams.

Third, a very challenging dual-reflectarray antenna demonstrator has been designed, manufactured and tested for a more rigorous validation of the analysis technique presented. In the proposed antenna configuration the feed, the sub-reflectarray and the main reflectarray are in the near field of one another, so that the conventional far-field approximations are not suitable for the analysis of such an antenna. This geometry is used as a benchmark for the proposed analysis tool under very stringent conditions. Some aspects of the proposed analysis technique that improve its accuracy are also discussed. These improvements include a novel method to reduce the inherent cross-polarization introduced mainly by grounded patch arrays. It has been verified that cross-polarization in offset reflectarrays can be significantly reduced by properly adjusting the patch dimensions in the reflectarray so as to produce an overall cancellation of the cross-polarization. The dimensions of the patches are adjusted not only to provide the required phase distribution to shape the beam, but also to exploit the zero crossings of the cross-polarization components.

The last part of the thesis deals with direct applications of the technique described. The technique is directly applicable to the design of contoured-beam antennas for DBS applications, where the cross-polarization requirements are very stringent. The beam shaping is achieved by synthesizing the phase distribution on the main reflectarray while the sub-reflectarray emulates an equivalent hyperbolic sub-reflector. Dual-reflectarray antennas also have the ability to scan the beam over small angles about boresight. Two possible architectures for a Ku-band antenna are described, based on a dual planar reflectarray configuration that provides electronic beam scanning in a limited angular range. In the first architecture, the beam scanning is achieved by introducing a phase control in the elements of the sub-reflectarray while the main reflectarray is passive.
Abstract:
Nowadays, computing platforms consist of a very large number of components that require different voltage levels and power requirements. Even a very small platform, like a handheld computer, may contain more than twenty different loads and voltage regulators. The power delivery designers of these systems are required to provide, in a very short time, the right power architecture that optimizes performance and meets electrical specifications plus cost and size targets. The appropriate selection of the architecture and converters directly defines the performance of a given solution. Therefore, the designer needs to be able to evaluate a significant number of options in order to know with good certainty whether the selected solutions meet the size, energy efficiency and cost targets. The difficulty of selecting the right solution arises from the wide range of power conversion products offered by different manufacturers, ranging from discrete components (to build converters) to complete power conversion modules that employ different manufacturing technologies. Consequently, in most cases it is not possible to analyze all the alternatives (combinations of power architectures and converters) that can be built, and the designer has to select a limited number of converters in order to simplify the analysis. In this thesis, to overcome these difficulties, a new design methodology for power supply systems is proposed. The methodology integrates evolutionary computation techniques to make it possible to analyze a large number of possibilities. This exhaustive analysis helps the designer quickly define a set of feasible solutions and select the best performance trade-off for each application. The proposed approach consists of two key steps, one for the automatic generation of architectures and the other for the optimized selection of components. This thesis details the implementation of these two steps. The usefulness of the methodology is corroborated by contrasting results on real problems and on experiments designed to test the limits of the algorithms.
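To illustrate the kind of evolutionary search the methodology relies on, here is a toy genetic algorithm that assigns a converter from a hypothetical catalogue to each load, minimizing total cost with a penalty for poor worst-case efficiency. The catalogue figures, penalty weight, and GA settings are all invented; a real tool would encode full architectures and electrical constraints.

```python
import random

# Hypothetical converter catalogue: (cost, efficiency) per candidate part
catalogue = [(1.0, 0.85), (1.8, 0.90), (2.5, 0.93), (4.0, 0.96)]
n_loads = 6            # loads that each need one converter assigned

def fitness(genome):
    """Lower is better: total cost plus a penalty for poor efficiency."""
    cost = sum(catalogue[g][0] for g in genome)
    eff = min(catalogue[g][1] for g in genome)       # worst converter dominates
    return cost + 50.0 * max(0.0, 0.90 - eff)        # penalize efficiency below 90%

def evolve(pop_size=40, generations=60, p_mut=0.1):
    rng = random.Random(42)
    pop = [[rng.randrange(len(catalogue)) for _ in range(n_loads)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]               # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_loads)          # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_loads):                 # per-gene mutation
                if rng.random() < p_mut:
                    child[i] = rng.randrange(len(catalogue))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print("best assignment:", best, "fitness:", fitness(best))
```

The point of the sketch is the structure (encode a candidate solution as a genome, score it, and let selection/crossover/mutation explore the combinatorial space), which scales to far more alternatives than a designer could enumerate by hand.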
Abstract:
This Doctoral Thesis, entitled "Contribution to the analysis, design and assessment of compact antenna test ranges at millimeter wavelengths", aims to deepen the knowledge of a particular antenna measurement system: the compact range, operating in the millimeter-wavelength frequency bands. The thesis has been developed at the Radiation Group (GR), an antenna laboratory belonging to the Signals, Systems and Radiocommunications department (SSR) of the Technical University of Madrid (UPM). The Radiation Group has extensive experience in antenna measurements, at present running four facilities in different configurations: a Gregorian compact antenna test range, spherical near field, planar near field and a semi-anechoic arch system. The research work performed for this thesis extends the knowledge of the first measurement configuration to higher frequencies, beyond the microwave region where the Radiation Group already offers customer-level performance. To reach this goal, a set of scientific tasks was carried out sequentially; they are succinctly described in the following paragraphs.

The first step dealt with the review of the state of the art. The study of the scientific literature addressed measurement practices in compact antenna test ranges together with the particularities of millimeter-wavelength technologies. The joint study of both fields of knowledge converged, where such measurement facilities are concerned, on a series of technological challenges which become serious bottlenecks at different stages: analysis, design and assessment.

Secondly, after the overview study, the focus was set on electromagnetic analysis algorithms. These formulations allow certain electromagnetic features of interest to be approached, such as the field distribution phase or the stray-signal behavior of particular structures when they interact with electromagnetic wave sources. Properly operated, a CATR facility features electromagnetic collimation optics which are large in terms of wavelengths. Accordingly, the electromagnetic analysis tasks involve a very large number of mathematical unknowns, which grow with frequency following polynomial laws of different order depending on the algorithm used. In particular, the optics configuration of interest here is the reflection-type serrated-edge collimator. The analysis of these devices requires flexible handling of almost arbitrary scattering geometries, this flexibility being the core of the algorithm's ability to support the subsequent design tasks. This thesis' contribution to this field of knowledge is a formulation that is powerful both in dealing with various analysis geometries and in computational terms. Two algorithms were developed; while based on the same hybridization principle, they reach different orders of physical fidelity at different computational cost. Their CATR design capabilities were inter-compared, reaching both qualitative and quantitative conclusions on their scope.

In third place, interest was shifted from analysis and design tasks towards range assessment. Millimeter wavelengths imply strict mechanical tolerances and fine setup adjustment. In addition, the large number of unknowns already faced in the analysis stage appears as well in the in-chamber field-probing stage. The natural decrease of the dynamic range available from semiconductor millimeter-wave sources additionally requires longer integration times at each probing point. These peculiarities increase exponentially the difficulty of performing assessment processes in CATR facilities beyond microwaves. The bottleneck becomes so tight that it compromises range characterization beyond a certain limit frequency, which typically lies in the lowest segment of the millimeter-wavelength range, whereas the value of range assessment lies, on the contrary, in the highest segment. This thesis contributes to this technological scenario by developing quiet-zone probing techniques which achieve substantial data-reduction ratios. Collaterally, they increase the robustness of the results to noise, which amounts to a virtual increase of the setup's available dynamic range.

In fourth place, the environmental sensitivity of millimeter wavelengths was approached. The drift of electromagnetic experiments due to the dependence of the results on the surrounding environment is well known. At millimeter wavelengths, this sensitivity relegates many practices that are routine at microwave frequencies to the experimental stage. In particular, the evolution of the atmosphere, even within acceptable conditioning bounds, results in drift phenomena which can completely mask the experimental results. The contribution of this thesis in this respect consists of electrically modeling the indoor atmosphere existing in a CATR as a function of the environmental variables which affect the range's performance. A simple model was developed that is able to relate high-level phenomena, such as feed-probe phase drift, to low-level magnitudes that are easy to sample: relative humidity and temperature. With this model, environmental compensation can be performed and chamber conditioning is automatically extended towards higher frequencies.

In summary, the purpose of this thesis is to advance the knowledge of millimeter-wavelength compact antenna test ranges. This knowledge is distributed over the sequential stages of a CATR's conception, from early low-level electromagnetic analysis to the assessment of an operative facility, stages for each of which bottleneck phenomena exist today and seriously compromise antenna measurement practices at millimeter wavelengths.
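The environmental-compensation idea (relating feed-probe phase drift to relative humidity and temperature, then subtracting the modeled drift) can be sketched with a simple linear fit. The coefficients and data here are entirely synthetic; the thesis' actual model is not reproduced.

```python
import numpy as np

# Synthetic calibration data: phase drift (degrees) vs. relative humidity (%)
# and temperature (degC). The "true" coefficients below are invented.
rng = np.random.default_rng(7)
rh = rng.uniform(30, 60, 100)
temp = rng.uniform(18, 26, 100)
phase = 0.8 * rh + 2.5 * temp + 5.0 + rng.normal(0, 0.3, 100)

# Ordinary least-squares fit of drift against the two environmental variables
X = np.column_stack([rh, temp, np.ones_like(rh)])
coef, *_ = np.linalg.lstsq(X, phase, rcond=None)

def compensate(raw_phase, rh_now, temp_now):
    """Remove the modeled environmental drift from a raw phase reading."""
    return raw_phase - (coef[0] * rh_now + coef[1] * temp_now + coef[2])
```

Once fitted from easily sampled low-level magnitudes, the model lets every probing point be corrected to a common environmental reference, which is the mechanism by which chamber conditioning is effectively extended to higher frequencies.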
Abstract:
For years, the Human-Computer Interaction (HCI) community has crafted usability guidelines that clearly define what characteristics a software system should have in order to be easy to use. However, the Software Engineering (SE) community keeps falling short of successfully incorporating these recommendations into software projects. From an SE perspective, the process of incorporating usability features into software is not always straightforward, as a large number of these features have heavy implications for the underlying software architecture. For example, successfully including an "undo" feature in an application requires the design and implementation of many complex, interrelated data structures and functionalities. Our work focuses on providing developers with a set of software design patterns to assist them in designing more usable software. This contributes to the proper inclusion of specific usability features with high impact on the software design. Preliminary validation data show that usage of the guidelines also has positive effects on development time and overall software design quality.
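The "undo" example gives a feel for the architectural impact involved: a common way to structure it is the Command pattern, where every user action is an object that knows how to reverse itself, recorded on a history stack. This minimal sketch is a generic illustration, not the thesis' specific pattern catalogue.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Document:
    text: str = ""

class Command:
    def execute(self) -> None: ...
    def undo(self) -> None: ...

@dataclass
class InsertText(Command):
    doc: Document
    snippet: str
    def execute(self) -> None:
        self.doc.text += self.snippet
    def undo(self) -> None:
        # Reverse the insertion by trimming exactly what was appended
        self.doc.text = self.doc.text[: -len(self.snippet)]

class History:
    """Undo stack: every executed command is recorded so it can be reversed."""
    def __init__(self) -> None:
        self._done: List[Command] = []
    def run(self, cmd: Command) -> None:
        cmd.execute()
        self._done.append(cmd)
    def undo_last(self) -> None:
        if self._done:
            self._done.pop().undo()

doc = Document()
hist = History()
hist.run(InsertText(doc, "hello "))
hist.run(InsertText(doc, "world"))
hist.undo_last()
print(doc.text)  # -> "hello "
```

Even this toy version shows why undo is architecturally invasive: every state-changing operation in the application must be routed through command objects and the history, which is exactly the kind of cross-cutting design decision the proposed patterns aim to capture.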
Abstract:
This paper describes a practical activity, part of a renewable energy course, in which the students have to build their own complete wind generation system, including blades, PM generator, power electronics and control. After connecting the system to the electric grid, the system has been tested under real wind scenarios. The paper describes the electrical part of the work, the surface-mounted permanent-magnet machine design criteria, as well as the power electronics for power control and grid connection. A Kalman filter is used for the voltage phase estimation, and current commands are obtained in order to control active and reactive power. The connection to the grid has been made, and active and reactive power have been measured in the system.
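One common way to realize the Kalman-filter phase estimation mentioned above is to track the quadrature components of the grid voltage at its known nominal frequency; the phase then follows from the two estimated amplitudes. The filter below is a generic sketch with synthetic data, not the paper's actual implementation, and all tuning values are assumptions.

```python
import numpy as np

# Estimate the phase of a 50 Hz grid voltage from noisy samples with a
# linear Kalman filter on the quadrature components [a, b], where
# v(t) = a*cos(w t) + b*sin(w t) and phase = atan2(b, a).
rng = np.random.default_rng(3)
f, fs = 50.0, 5000.0                 # grid frequency, sampling rate (assumed)
w = 2 * np.pi * f
t = np.arange(0, 0.2, 1 / fs)
true_phase = 0.7
z = np.cos(w * t - true_phase) + 0.2 * rng.normal(size=t.size)  # noisy measurement

x = np.zeros(2)                 # state [a, b], modeled as nearly constant
P = np.eye(2) * 10.0            # state covariance (uninformative start)
Q = np.eye(2) * 1e-6            # small process noise: lets the phase drift slowly
R = 0.2 ** 2                    # measurement noise variance

for tk, zk in zip(t, z):
    H = np.array([np.cos(w * tk), np.sin(w * tk)])  # time-varying measurement row
    P = P + Q                                       # predict (identity dynamics)
    S = H @ P @ H + R                               # innovation variance
    K = P @ H / S                                   # Kalman gain
    x = x + K * (zk - H @ x)                        # measurement update
    P = P - np.outer(K, H @ P)

phase = np.arctan2(x[1], x[0])
print("estimated phase (rad):", phase)
```

With the phase in hand, the current commands for active and reactive power control can be generated in the synchronous frame aligned to it.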
Abstract:
Lately the short-wave infrared (SWIR) band has become very important due to the recent appearance on the market of small detectors with large focal plane arrays. Military applications for SWIR cameras include handheld and airborne systems with long-range detection requirements, where volume and weight restrictions must also be considered. In this paper we present three telephoto objectives designed according to three different methods: first, the conventional method, where the starting point is an existing design; second, a design starting from an aplanatic system; and finally the simultaneous multiple surfaces (SMS) method, where the starting point is the input wavefronts that we choose. The designs are compared in terms of optical performance, volume, weight and manufacturability. Because the objectives have been designed for the SWIR waveband, the color correction has important implications for the choice of glass, which will be discussed in detail.
Abstract:
A high productivity rate in Engineering is related to an efficient management of the flow of the large quantities of information and associated decision making activities that are consubstantial to the Engineering processes both in design and production contexts. Dealing with such problems from an integrated point of view and mimicking real scenarios is not given much attention in Engineering degrees. In the context of Engineering Education, there are a number of courses designed for developing specific competencies, as required by the academic curricula, but not that many in which integration competencies are the main target. In this paper, a course devoted to that aim is discussed. The course is taught in a Marine Engineering degree but the philosophy could be used in any Engineering field. All the lessons are given in a computer room in which every student can use each all the treated software applications. The first part of the course is dedicated to Project Management: the students acquire skills in defining, using Ms-PROJECT, the work breakdown structure (WBS), and the organization breakdown structure (OBS) in Engineering projects, through a series of examples of increasing complexity, ending up with the case of vessel construction. The second part of the course is dedicated to the use of a database manager, Ms-ACCESS, for managing production related information. A series of increasing complexity examples is treated ending up with the management of the pipe database of a real vessel. This database consists of a few thousand of pipes, for which a production timing frame is defined, which connects this part of the course with the first one. Finally, the third part of the course is devoted to the work with FORAN, an Engineering Production package of widespread use in the shipbuilding industry. 
With this package, the frames and plates where all the outfitting will be carried out are defined through cooperative work by the students, who work simultaneously on the same 3D model. In the paper, specific details about the learning process are given. Surveys have been administered to the students in order to get feedback on their experience as well as to assess their satisfaction with the learning process. Results from these surveys are discussed in the paper.
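The pipe-database exercise in the second part can be pictured with a small relational sketch. This is an illustrative toy using Python's sqlite3 rather than Ms-ACCESS, and the table and column names are assumptions, not the real vessel's schema:

```python
import sqlite3

# Toy version of a pipe production database (schema names are assumptions).
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE pipes (
        pipe_id      TEXT PRIMARY KEY,
        block        TEXT NOT NULL,      -- construction block of the vessel
        diameter_mm  REAL NOT NULL,
        planned_week INTEGER NOT NULL    -- production timing frame
    )""")
con.executemany(
    "INSERT INTO pipes VALUES (?, ?, ?, ?)",
    [("P-0001", "B12", 50.0, 3),
     ("P-0002", "B12", 80.0, 3),
     ("P-0003", "B14", 50.0, 5)])

# Pipes to fabricate per week: the kind of query that links the database
# part of the course back to the project-scheduling part.
rows = con.execute(
    "SELECT planned_week, COUNT(*) FROM pipes "
    "GROUP BY planned_week ORDER BY planned_week").fetchall()
```

Grouping pipes by their planned production week is what connects the database exercise to the WBS scheduling built in the first part of the course.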
Resumo:
An optimization method for conceptual design in Aeronautics is presented that is based on the use of surrogate models. The various ingredients in the target function are calculated for each individual using surrogates of the associated technical disciplines, which are constructed via high-order singular value decomposition (HOSVD) and one-dimensional interpolation. These surrogates result from a limited number of CFD-calculated snapshots. The surrogates are combined with an optimization method, which can be either a global method, such as a genetic algorithm, or a local, gradient-like method. The resulting method is both flexible and much more computationally efficient than the conventional method based on direct calculation of the target function, especially if a large number of free design parameters and/or tunable modeling parameters are present.
The method is illustrated considering a simplified version of the conceptual design of an aircraft empennage.
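A minimal sketch of the surrogate idea, reduced to a single design parameter and an ordinary SVD (the paper uses HOSVD over several parameters). The "solver" below is a toy stand-in for the CFD snapshots, and all names are illustrative:

```python
import numpy as np

def expensive_solver(p, x):
    # Pretend CFD field over grid x for design parameter p.
    return p * np.sin(np.pi * x) + 0.5 * p**2 * np.cos(2 * np.pi * x)

x = np.linspace(0.0, 1.0, 200)
p_samples = np.linspace(0.5, 2.0, 7)            # limited number of snapshots
A = np.column_stack([expensive_solver(p, x) for p in p_samples])

# SVD of the snapshot matrix: columns of U are spatial modes.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 2                                            # truncation rank
coeffs = s[:r, None] * Vt[:r, :]                 # modal amplitude per snapshot

def surrogate(p):
    # 1-D interpolation of each modal amplitude in the design parameter,
    # then reconstruction from the retained spatial modes.
    c = np.array([np.interp(p, p_samples, coeffs[j]) for j in range(r)])
    return U[:, :r] @ c

# The surrogate is cheap, so the target function can be scanned densely;
# this scan stands in for the genetic-algorithm or gradient search.
def target(p):
    return surrogate(p).max()

p_grid = np.linspace(0.5, 2.0, 400)
best = p_grid[np.argmin([target(p) for p in p_grid])]
```

The expensive solver is called only for the seven snapshots; every evaluation inside the optimization loop uses the surrogate, which is where the computational savings over direct calculation of the target function come from.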
Resumo:
We informally discuss several issues related to the parallel execution of logic programming systems and concurrent logic programming systems, and their generalization to constraint programming. We propose a new view of these systems, based on a particular definition of parallelism. We argue that, under this view, a large number of the actual systems and models can be explained through the application, at different levels of granularity, of only a few basic principles: determinism, non-failure, independence (also referred to as stability), granularity, etc. Also, based on the convergence of concepts that this view brings, we sketch a model for the implementation of several parallel constraint logic programming source languages and models based on a common, generic abstract machine and an intermediate kernel language.
Resumo:
We describe some of the novel aspects and motivations behind the design and implementation of the Ciao multiparadigm programming system. An important aspect of Ciao is that it provides the programmer with a large number of useful features from different programming paradigms and styles, and that the use of each of these features can be turned on and off at will for each program module. Thus, a given module may be using e.g. higher-order functions and constraints, while another module may be using objects, predicates, and concurrency. Furthermore, the language is designed to be extensible in a simple and modular way. Another important aspect of Ciao is its programming environment, which provides a powerful preprocessor (with an associated assertion language) capable of statically finding non-trivial bugs, verifying that programs comply with specifications, and performing many types of program optimizations. Such optimizations produce code that is highly competitive with that of other dynamic languages or, when the highest levels of optimization are used, even with that of static languages, all while retaining the interactive development environment of a dynamic language. The environment also includes a powerful auto-documenter. The paper provides an informal overview of the language and program development environment. It aims at illustrating the design philosophy rather than at being exhaustive, which would be impossible in the format of a paper, pointing instead to the existing literature on the system.