962 results for Semi-infinite linear programming
Abstract:
The spatial distribution of self-employment in India: evidence from semiparametric geoadditive models, Regional Studies. The entrepreneurship literature has rarely considered spatial location as a micro-determinant of occupational choice. It has also ignored self-employment in developing countries. Using Bayesian semiparametric geoadditive techniques, this paper models spatial location as a micro-determinant of self-employment choice in India. The empirical results suggest the presence of spatial occupational neighbourhoods and a clear north–south divide in self-employment when the entire sample is considered; however, spatial variation in the non-agriculture sector disappears to a large extent when individual factors that influence self-employment choice are explicitly controlled. The results further suggest non-linear effects of age, education and wealth on self-employment.
Abstract:
In this letter, a nonlinear semi-analytical model (NSAM) for the simulation of few-mode fiber transmission is proposed. The NSAM considers the mode mixing arising from the Kerr effect and waveguide imperfections. An analytical explanation of the model is presented, as well as simulation results for 112 Gb/s transmission over a two-mode fiber (TMF) using coherently detected polarization-multiplexed quadrature phase-shift keying modulation. The simulations show that by transmitting over only one of the two modes of a TMF, long-haul transmission can be realized without an increase in receiver complexity. For a 6000-km transmission link, a small modal dispersion penalty is observed in the linear domain, while a significant increase of the nonlinear threshold is observed due to the large core of the TMF. © 2006 IEEE.
Abstract:
The paper presents a new network-flow interpretation of Łukasiewicz's logic, based on models with increased effectiveness. The results obtained show that the presented network-flow models can, in principle, work for many-valued logics with more than three variable states, i.e. with a finite set of states in the interval from 0 to 1. The described models make it possible to formulate various logical functions. If the results from a given model, contained in the obtained values of the arc-flow functions, are used as input data for other models, then other sophisticated logical structures can be interpreted successfully in Łukasiewicz's logic. The obtained models allow Łukasiewicz's logic to be studied with the specific, effective methods of network-flow programming; in particular, the peculiarities and results pertaining to the 'traffic capacity of the network arcs' function can be exploited. Based on the introduced network-flow approach, it is possible to interpret other many-valued logics as well – those of E. Post, L. Brauer, Kolmogorov, etc.
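The network-flow models themselves are not reproduced in the abstract, but the Łukasiewicz truth functions they interpret are standard. As a minimal sketch, the connectives over a finite (n+1)-valued state set in [0, 1]:

```python
from fractions import Fraction

def luk_and(x, y):
    """Łukasiewicz strong conjunction: max(0, x + y - 1)."""
    return max(Fraction(0), x + y - 1)

def luk_or(x, y):
    """Łukasiewicz strong disjunction: min(1, x + y)."""
    return min(Fraction(1), x + y)

def luk_implies(x, y):
    """Łukasiewicz implication: min(1, 1 - x + y)."""
    return min(Fraction(1), 1 - x + y)

# A finite (n+1)-valued state set in [0, 1], as in the many-valued
# logics discussed above (n = 3 gives a 4-valued logic).
n = 3
states = [Fraction(k, n) for k in range(n + 1)]
for x in states:
    for y in states:
        # The connectives are closed over the state set.
        assert 0 <= luk_and(x, y) <= 1
        assert 0 <= luk_implies(x, y) <= 1
```

Exact rational arithmetic (`Fraction`) keeps the finitely many states closed under the connectives, which is what makes a finite network-flow encoding of the logic conceivable.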
Abstract:
Nikolay Kutev, Velichka Milusheva - We find explicitly all bi-umbilical foliated semi-symmetric surfaces in the four-dimensional Euclidean space R^4
Abstract:
This paper is on the use and performance of M-path polyphase Infinite Impulse Response (IIR) filters for channelisation, a role in which Finite Impulse Response (FIR) filters are conventionally preferred. The paper specifically focuses on Discrete Fourier Transform (DFT) modulated filter banks, which are known to be an efficient choice for channelisation in communication systems. The low-pass prototype filter of the DFT filter bank has been implemented using an M-path polyphase IIR filter, and we show that the spikes present in the stopband can be avoided by making use of the guard bands between narrowband channels. It will be shown that channelisation performance is not affected when polyphase IIR filters are employed instead of their counterparts derived from FIR prototype filters. A detailed complexity and performance analysis of the proposed use is given in this article.
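The structure under discussion, a maximally decimated M-path DFT analysis filter bank, can be sketched as follows. This uses a FIR prototype for brevity (the paper's contribution replaces the path filters with polyphase IIR sections), and FFT direction and commutator-ordering conventions vary between texts:

```python
import numpy as np

def dft_channelizer(x, M, h):
    """Split x into M decimated channels via an M-path polyphase filter bank.

    x: input signal; h: low-pass prototype filter (length a multiple of M).
    Channel k is centred near normalized frequency k/M.
    """
    H = h.reshape(-1, M).T        # path p holds taps h[p], h[p + M], ...
    L = (len(x) // M) * M
    X = x[:L].reshape(-1, M).T    # path p holds samples x[p], x[p + M], ...
    # Filter each path at the low (decimated) rate, then combine the
    # M path outputs with an inverse DFT to separate the channels.
    Y = np.stack([np.convolve(X[p], H[p]) for p in range(M)])
    return np.fft.ifft(Y, axis=0)  # row k = channel-k output
```

The point of the polyphase form is that both the path filters and the DFT run at the decimated rate, which is what makes replacing each path with a cheap all-pass IIR section attractive.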
Abstract:
Recently, there has been considerable interest in solving viscoelastic problems in 3D, particularly with the improvement in modern computing power. In many applications the emphasis has been on economical algorithms which can cope with the extra complexity that the third dimension brings: storage and computer time are of the essence. The advantage of the finite volume formulation is that a large amount of memory space is not required, and iterative methods rather than direct methods can be used to solve the resulting linear systems efficiently.
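As an illustration of the iterative approach the abstract favours (not the authors' actual solver), a textbook conjugate-gradient sketch: it needs only matrix-vector products, so the coefficient matrix never has to be factorized, which is why storage stays modest compared with direct methods.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A x = b for symmetric positive-definite A, using only A @ v."""
    n = len(b)
    max_iter = max_iter or 10 * n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:  # converged
            break
        p = r + (rs_new / rs) * p  # new A-conjugate direction
        rs = rs_new
    return x
```

Because only `A @ p` is required, `A` can be a sparse operator assembled cell-by-cell from a finite volume discretization rather than a dense matrix.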
Abstract:
The detailed characterization of vast territories is a major challenge and is often limited by the available resources and time. The work in this master's thesis forms part of the ParaChute project, which concerns the development of a Québec rockfall hazard assessment method (Méthode québécoise d'Évaluation du Danger des Chutes de Pierres, MEDCP) along linear infrastructure. To optimize the use of resources and time, a partially automated method facilitating the planning of field work has been developed. It is based mainly on 3D rockfall trajectory modelling to better target potentially problematic natural cliffs. Automation tools were developed to allow the simulations to be run over vast territories. The sectors where the infrastructure is most likely to be reached by potential rockfalls are identified from the portions of the infrastructure most frequently crossed by the simulated trajectories. The method was applied along the railway of the company ArcelorMittal Infrastructures Canada. The sector covered by the study begins about ten kilometres north of Port-Cartier (Québec) and extends over 260 km to the north of the Monts Groulx. The topography obtained from airborne LiDAR surveys is used to model the 3D trajectories with the Rockyfor3D software. In this thesis, an approach facilitating the characterization of rockfalls along a linear route is presented. Preliminary trajectory studies are carried out before the field work. The information drawn from these simulations makes it possible to target potentially problematic sectors and to eliminate those unlikely to generate rockfalls capable of reaching the elements at risk along the linear infrastructure.
Abstract:
Work presented at PAEE/ALE'2016, 8th International Symposium on Project Approaches in Engineering Education (PAEE) and 14th Active Learning in Engineering Education Workshop (ALE)
Abstract:
Reconstructing the long-term evolution of organic sedimentation in the eastern Equatorial Atlantic (ODP Leg 159) provides information about the history of the climate/ocean system, sediment accumulation, and deposition of hydrocarbon-prone rocks. The recovery of a continuous, 1200 m long sequence at ODP Site 959 covering sediments from Albian (?) to the present day (about 120 Ma) makes this position a key location to study these aspects in a tropical oceanic setting. New high resolution carbon and pyrolysis records identify three main periods of enhanced organic carbon accumulation in the eastern tropical Atlantic, i.e. the late Cretaceous, the Eocene-Oligocene, and the Pliocene-Pleistocene. Formation of Upper Cretaceous black shales off West Africa was closely related to the tectonosedimentary evolution of the semi-isolated Deep Ivorian Basin north of the Côte d'Ivoire-Ghana Transform Margin. Their deposition was confined to certain intervals of the last two Cretaceous anoxic events, the early Turonian OAE2 and the Coniacian-Santonian OAE3. Organic geochemical characteristics of laminated Coniacian-Santonian shales reveal peak organic carbon concentrations of up to 17% and kerogen type I/II organic matter, which qualify them as excellent hydrocarbon source rocks, similar to those reported from other marginal and deep sea basins. A middle to late Eocene high productivity period occurred off equatorial West Africa. Porcellanites deposited during that interval show enhanced total organic carbon (TOC) accumulation and a good hydrocarbon potential associated with oil-prone kerogen. Deposition of these TOC-rich beds was likely related to a reversal in the deep-water circulation in the adjacent Sierra Leone Basin. 
Accordingly, outflow of old deep waters of Southern Ocean origin from the Sierra Leone Basin into the northern Gulf of Guinea favored upwelling of nutrient-enriched waters and simultaneously enhanced the preservation potential of sedimentary organic matter along the West African continental margin. A pronounced cyclicity in the carbon record of Oligocene-lower Miocene diatomite-chalk interbeds indicates orbital forcing of paleoceanographic conditions in the eastern Equatorial Atlantic since the Oligocene-Miocene transition. A similar control may date back to the early Oligocene but has to be confirmed by further studies. Latest Miocene-early Pliocene organic carbon deposition was closely linked to the evolution of the African trade winds, continental upwelling in the eastern Equatorial Atlantic, ocean chemistry and eustatic sea level fluctuations. Reduction in carbonate carbon preservation associated with enhanced carbon dissolution is recorded in the uppermost Miocene (5.82-5.2 Ma) section and suggests that the latest Miocene carbon record of Site 959 documents the influence of corrosive deep waters which formed in response to the Messinian Salinity Crisis. Furthermore, sea level-related displacement of higher productive areas towards the West African shelf edge is indicated at 5.65, 5.6, 5.55, 5.2, 4.8 Ma. In view of humid conditions in tropical Africa and a strong West African monsoonal system around the Miocene-Pliocene transition, the onset of pronounced TOC cycles at about 5.6 Ma marks the first establishment of upwelling cycles in the northern Gulf of Guinea. An amplification in organic carbon deposition at 3.3 Ma and 2.45 Ma links organic sedimentation in the tropical eastern Equatorial Atlantic to the main steps of northern hemisphere glaciation and testifies to the late Pliocene transition from humid to arid conditions in central and western African climate. Aridification of central Africa around 2.8 Ma is not clearly recorded at Site 959. 
However, decreased and highly fluctuating carbonate carbon concentrations are observed from 2.85 Ma onwards, which may relate to enhanced terrigenous (eolian) dilution from Africa.
Abstract:
The ceramics industry in Piauí currently comprises 55 plants, 11 of which are in Teresina, the state capital, together producing 55 million roof tiles; about 10% of this production is wasted, and the waste is sometimes dumped on the banks of rivers, roads and highways, causing environmental degradation. The main goal of this work is to verify the potential of producing semi-porous ceramics using roof-tile grog. In the first part of this work, test specimens were produced from the basic formulation of an industrial plant, doped with 5%, 10%, 15% and 20% grog by mass; in the second part, specimens were produced from a formulation in which one raw material was replaced by 50% grog, and another in which it was entirely replaced by grog. Specimens made from the previously mentioned basic formulation were used as the experimental control. The grog and the raw materials were characterized by particle-size analysis, differential thermal analysis, X-ray diffraction, X-ray fluorescence, thermogravimetric analysis and rational analysis. The specimens were sintered in an industrial kiln following the normal cycle adopted by the plant, with a peak temperature of 1135 °C and a fast firing cycle of 25 minutes, using liquefied petroleum gas as fuel. The pieces thus obtained were subjected to physical tests of water absorption, apparent specific mass, apparent porosity, linear shrinkage, flexural rupture strength and dilatometry; to mineralogical analysis by X-ray diffraction; and to microstructural analysis by scanning electron microscopy. All formulations with added grog showed properties superior to those required by the standards for semi-porous tiles, and formulation F2-2.5 showed properties superior to those of the standard formulation, which justifies incorporating roof-tile waste into the body of semi-porous ceramics.
Abstract:
Several studies have suggested that differences in the natural rooting ability of plant cuttings could be attributed to differences in endogenous auxin levels. Hence, during rooting experiments, it is important to be able to routinely monitor the evolution of endogenous levels of plant hormones. This work reports the development of a new method for the quantification of free auxins in auxin-treated Olea europaea (L.) explants, using dispersive liquid–liquid microextraction (DLLME) and microwave-assisted derivatization (MAD) followed by gas chromatography/mass spectrometry (GC/MS) analysis. Linear ranges of 0.5–500 ng mL⁻¹ and 1–500 mg mL⁻¹ were used for the quantification of indole-3-acetic acid (IAA) and indole-3-butyric acid (IBA), respectively. Determined by serial dilutions, the limits of detection (LOD) and quantification (LOQ) were 0.05 ng mL⁻¹ and 0.25 ng mL⁻¹, respectively, for both compounds. When using the calibration curve for determination, the LOQ corresponded to 0.5 ng mL⁻¹ (IAA) and 0.5 mg mL⁻¹ (IBA). The proposed method proved to be substantially faster than other alternatives, and allowed free auxin quantification in real samples of semi-hardwood cuttings and microshoots of two olive cultivars. The concentrations found in the analyzed samples are in the range of 0.131–0.342 mg g⁻¹ (IAA) and 20–264 mg g⁻¹ (IBA).
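The abstract reports LOD/LOQ figures both from serial dilution and from the calibration curve. The calibration-curve route is commonly computed with the 3.3·s/m and 10·s/m rules (slope m, residual standard deviation s); a minimal sketch of that convention, not of the authors' exact procedure:

```python
import numpy as np

def calibration_lod_loq(conc, signal):
    """Estimate LOD and LOQ from a linear calibration curve.

    Uses the common LOD = 3.3*s/m and LOQ = 10*s/m rules, where m is the
    calibration slope and s the standard deviation of the fit residuals.
    """
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    s = np.std(residuals, ddof=2)  # ddof=2: two fitted parameters
    return 3.3 * s / slope, 10.0 * s / slope
```

Example with made-up calibration points (not data from the paper): five standards and their instrument responses give an LOD/LOQ pair in concentration units, with LOQ/LOD fixed at 10/3.3 by construction.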
Abstract:
A relevant problem in polyolefin processing is the presence of volatile and semi-volatile compounds (VOCs and SVOCs), such as linear-chain alkanes, found in final products. These VOCs can be detected by customers through their unpleasant smell and can be an environmental issue; at the same time, they can cause negative side effects during processing. Since no previously standardized analytical techniques for polymeric matrices are available in the literature, we have implemented different VOC extraction methods and gas-chromatographic analyses for the quali-quantitative study of such compounds. Different procedures can be found in the literature, including microwave-assisted extraction (MAE) and thermal desorption (TDS), used for different purposes. TDS coupled with GC-MS is necessary for the identification of the different compounds in the polymer matrix. Although quantitative determination is complex, the results obtained from TDS/GC-MS show that the by-products are mainly linear-chain oligomers with an even number of carbon atoms in the C8-C22 range (for HDPE). In order to quantify these linear alkane by-products, a more accurate GC-FID determination with an internal standard was run on the MAE extracts. Regardless of the type of extruder used, it is difficult to distinguish the effects of the various processes, each of which in any case yields a low-boiling substance content lower than that of the corresponding virgin polymer. The two HDPEs studied can be distinguished on the basis of the quantity of analytes found; therefore, the production process is mainly responsible for the amount of VOCs and SVOCs observed. The extruder technology used by Sacmi SC achieves a significant reduction in VOCs compared to the conventional screw system. This result is significant, as a lower quantity of volatile substances certainly leads to lower migration of such materials, especially when they are used for food packaging.
Abstract:
The study of ancient, undeciphered scripts presents unique challenges that depend both on the nature of the problem and on the peculiarities of each writing system. In this thesis, I present two computational approaches that are tailored to two different tasks and writing systems. The first of these methods is aimed at the decipherment of the Linear A fraction signs, in order to discover their numerical values. This is achieved with a combination of constraint programming, ad-hoc metrics and paleographic considerations. The second main contribution of this thesis regards the creation of an unsupervised deep learning model which uses drawings of signs from an ancient writing system to learn to distinguish different graphemes in the vector space. This system, which is based on techniques used in the field of computer vision, is adapted to the study of ancient writing systems by incorporating information about sequences into the model, mirroring what is often done in natural language processing. In order to develop this model, the Cypriot Greek Syllabary is used as a target, since it is a deciphered writing system. Finally, this unsupervised model is adapted to the undeciphered Cypro-Minoan script and used to answer open questions about it. In particular, by reconstructing multiple allographs on which paleographers do not agree, it supports the idea that Cypro-Minoan is a single script and not a collection of three scripts, as has been proposed in the literature. These results on two different tasks show that computational methods can be applied to undeciphered scripts, despite the relatively small amount of available data, paving the way for further advances in paleography using these methods.
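The flavour of the constraint-programming task for the fraction signs can be sketched as a search for an assignment of candidate values to signs that satisfies sum constraints. The sign names, candidate values, and constraints below are entirely hypothetical illustrations, not the thesis's actual data:

```python
from fractions import Fraction
from itertools import permutations

# Hypothetical setup: made-up sign names and candidate unit fractions.
# In the real task, signs and constraints come from Linear A documents
# and paleographic analysis.
signs = ["J", "E", "F"]
candidates = [Fraction(1, d) for d in (2, 3, 4, 6, 8)]

# Constraints in the spirit of the decipherment: groups of fraction
# signs that are believed to sum to one whole unit.
groups_summing_to_one = [["J", "J"], ["E", "E", "E"], ["J", "F", "F"]]

solutions = []
for values in permutations(candidates, len(signs)):
    assign = dict(zip(signs, values))
    if all(sum(assign[s] for s in g) == 1 for g in groups_summing_to_one):
        solutions.append(assign)
```

With enough independent constraints the solution space collapses to a single assignment, which is the point of combining constraint propagation with the ad-hoc metrics and paleographic evidence mentioned above.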
Abstract:
There are many deformable objects, such as paper, clothes and ropes, in a person's living space. For a robot to automate daily tasks, it is important that it can work with these deformable objects. Manipulation of deformable objects is a challenging task for robots because these objects have an infinite-dimensional configuration space and are expensive to model, making real-time monitoring, planning and control difficult. It forms a particularly important field of robotics, with relevant applications in sectors such as medicine, food handling, manufacturing, and household chores. This report gives a clear review of the approaches that have been used and are currently in use, along with future developments towards achieving this task. My research focuses on the last 10 years, for which I have systematically reviewed many articles in order to gain a clear understanding of developments in this field. The main contribution is to show the whole landscape of this concept and provide a broad view of how it has evolved. I also explain my research methodology, following my analysis from the past to the present along with my thoughts for the future.
Abstract:
Modern High-Performance Computing (HPC) systems are gradually increasing in size and complexity due to the corresponding demand for larger simulations requiring more complicated tasks and higher accuracy. However, as a side effect of Dennard scaling approaching its ultimate power limit, software efficiency also plays an important role in increasing the overall performance of a computation. Tools to measure application performance in these increasingly complex environments provide insights into the intricate ways in which software and hardware interact. Monitoring power consumption in order to save energy is possible through processor interfaces such as the Intel Running Average Power Limit (RAPL). Given the low level of these interfaces, they are often paired with an application-level tool such as the Performance Application Programming Interface (PAPI). Since several problems in many heterogeneous fields can be represented as a complex linear system, an optimized and scalable linear system solver can significantly decrease the time spent computing its solution. One of the most widely used algorithms for solving large simulations is Gaussian Elimination, whose most popular implementation for HPC systems is found in the Scalable Linear Algebra PACKage (ScaLAPACK) library. However, another relevant algorithm, which is gaining popularity in the academic field, is the Inhibition Method. This thesis compares the energy consumption of the Inhibition Method and of Gaussian Elimination from ScaLAPACK, profiling their execution during the solution of linear systems on the HPC architecture offered by CINECA. Moreover, it also collates the energy and power values for different rank, node, and socket configurations. The monitoring tools employed to track the energy consumption of these algorithms are PAPI and RAPL, which are integrated with the parallel execution of the algorithms managed with the Message Passing Interface (MPI).
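For reference, the serial textbook form of the Gaussian Elimination the thesis profiles can be sketched as follows. This is not the ScaLAPACK implementation, which distributes the same elimination over a process grid with blocked updates; it only shows the algorithm being compared:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # Forward elimination: reduce A to upper-triangular form.
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))          # pivot row
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]  # swap rows k and p
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution on the triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```

The O(n³) update loop is where both the time and the energy go, which is why the thesis instruments exactly this kind of kernel with PAPI/RAPL counters.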