926 results for complex polymerization method
Abstract:
The changing business environment demands that chemical industrial processes be designed to meet multi-objective requirements and to enhance innovative design activities. The requirements and key issues of conceptual process synthesis have changed and are no longer those of conventional process design; there is an increased emphasis on innovative research to develop new concepts, novel techniques and processes. A central issue, how to enhance the creativity of the design process, requires further research into methodologies. This thesis presents a conflict-based methodology for conceptual process synthesis. The motivation of the work is to support decision-making in design and synthesis and to enhance the creativity of design activities. It deals with the multi-objective requirements and the combinatorially complex nature of process synthesis. The work is carried out on the basis of a new concept and design paradigm adapted from the Theory of Inventive Problem Solving (TRIZ). TRIZ is claimed to be a 'systematic creativity' framework thanks to its knowledge-based and evolution-directed nature. The conflict concept, when applied to process synthesis, sheds new light on design problems and activities. The conflict model is proposed as a way of describing design problems and handling design information. Design tasks are represented as groups of conflicts, and a conflict table is built as the design tool. A general design paradigm is formulated to handle conflicts in both the early and detailed design stages. The methodology developed reflects the conflict nature of process design and synthesis. The method is implemented and verified through case studies of distillation system design, reactor/separator network design and waste minimization.
Handling the various levels of conflicts evolves possible design alternatives in a systematic procedure, which consists of establishing an efficient and compact solution space for the detailed design stage. The approach also provides the information needed to bridge the gap between the application of qualitative knowledge in the early stage and quantitative techniques in the detailed design stage. Enhancement of creativity is realized through the better understanding of design problems gained from the conflict concept, and through the improvement in engineering design practice brought by the systematic nature of the approach.
Abstract:
The ultimate goal of any research in the mechanism/kinematics/design area may be called predictive design, i.e. the optimisation of mechanism proportions in the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology that facilitates the design analysis and optimisation of complex mechanisms, mechanical components and systems. As part of a systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired requirements of motion. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis formulations. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables; the solutions are obtained by adopting closed-form classical or modern algebraic solution methods, or by numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are: (ia) limits on the number of design specifications and (iia) failure to handle design constraints, especially inequality constraints. The main drawbacks of the approximate synthesis formulations are: (ib) it is difficult to choose a proper initial linkage and (iib) it is hard to find more than one solution. Recent formulations for solving the approximate synthesis problem adopt polynomial continuation, providing several solutions, but they cannot handle inequality constraints.
Based on practical design needs, a mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground-pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis, the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through the literature research it is first shown that the algebraic and numerical solution methods used in the research area of computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables, but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis, the problem of positive-dimensional solution sets is solved by adopting the main principles from the mathematical research area of algebraic geometry for solving parametric algebraic systems of n equations in at least n+1 variables (parametric in the mathematical sense that all parameter values are considered, including the degenerate cases, for which the system is solvable). By adopting the developed solution method to solve the dyadic equations in direct polynomial form for two to three precision points, it has been algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be resolved. The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal defect-free mechanisms. Traditionally, the mechanism optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process. This results in optimal component design rather than optimal system-level design.
Modern mechanism optimisation at the system level demands the integration of kinematic design methods with mechanical system simulation techniques. In this thesis, a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated with mechanical system simulation techniques. The developed kinematic design method is based on combining the two-precision-point formulation with optimisation (using mathematical programming techniques or optimisation methods based on probability and statistics) of substructures, using criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, drawbacks (ia)-(iib) are eliminated. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the design method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the developed kinematic design method is integrated with the mechanical system simulation techniques.
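The precision-point dyadic equations mentioned above can be illustrated with a small sketch. For three prescribed positions (motion generation), the standard-form dyad equation W(e^(i*beta_j) - 1) + Z(e^(i*alpha_j) - 1) = delta_j (j = 1, 2) becomes a 2x2 complex linear system once the crank rotations beta_j are chosen as free parameters. This is a textbook illustration of the classical formulation, not the thesis's own implementation; the function name and argument layout are hypothetical.

```python
import numpy as np

def dyad_three_positions(alphas, deltas, betas):
    """Solve the standard-form dyad equations for three precision positions:
        W*(exp(i*beta_j) - 1) + Z*(exp(i*alpha_j) - 1) = delta_j,  j = 1, 2
    for the link vectors W, Z, after the crank rotations beta_j have been
    chosen as the classical free parameters.

    alphas, betas: coupler/crank rotations (radians) for positions 2 and 3
    deltas: complex displacement vectors of the precision point
    """
    A = np.array([[np.exp(1j * betas[j]) - 1, np.exp(1j * alphas[j]) - 1]
                  for j in range(2)])
    b = np.array(deltas, dtype=complex)
    W, Z = np.linalg.solve(A, b)  # 2x2 complex linear solve
    return W, Z
```

Because the system is linear once the beta_j are fixed, sweeping the free choices generates the ground-pivot map that formulations of this kind produce.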
Abstract:
In cases of ligature strangulation, distinguishing self-inflicted death from homicide is crucial. This requires objective scene investigation, autopsy and anamnesis in order to elucidate the manner of death correctly. The authors report a case of unplanned complex suicide by means of self-strangulation and multiple sharp force injuries. The consecutive use of more than one suicide method, termed unplanned complex suicide, gives this case particular significance. A brief discussion of this uncommon method of suicide is presented, particularly relevant to the attending forensic physician. In addition, a short overview of the entity of complex suicide is given.
Abstract:
This work studied the effect of different internal donors on the properties of polypropylene when using a Ziegler-Natta catalyst prepared by a two-phase method previously developed by Borealis. With this new method, the catalyst can be prepared without an added internal donor or support. Thanks to the two-phase system, the catalyst particles acquire a spherical shape. In the experimental part, various Mg complexes were prepared in which the internal donor forms in situ through the reaction of an alcohol with a carboxylic acid chloride. In the catalyst synthesis, the Mg complex reacts with TiCl4. The properties of the resulting catalysts were tested by polymerizing propylene with them at 70 °C for one hour, and the properties of the polymers were examined with several characterization methods. In addition, the possibility of preparing a phthalate-free catalyst was investigated. It was found that the catalyst preparation method is also applicable with internal donors other than the DOP used as a reference; the two-solution-phase system was also obtained with the two other internal donors studied. Furthermore, in phase equilibrium experiments, a two-solution-phase system was achieved with an internal donor containing no phthalate. This catalyst was found to have properties differing from the other catalysts: for example, it gave lower isotacticity than the reference catalyst and might therefore be suited to soft, low-isotacticity products. Ethylene polymerization was also tested with one of the new catalysts, since its donor content was very low; the catalyst's activity in ethylene polymerization was quite good.
Abstract:
The objective of the thesis was to create a framework that can be used to define a manufacturing strategy taking advantage of the product life cycle method, which enables PQP enhancements. The starting point was to study the simultaneous implementation of cost leadership and differentiation strategies in different stages of the life cycle. It was soon observed that Porter's strategies were too generic for a complex and dynamic environment where customer needs vary by market and by product. Therefore, the strategy formulation process is based on Terry Hill's order-winner and qualifier concepts. The manufacturing strategy formulation is initiated with the definition of order-winning and qualifying criteria. From these criteria, product-specific proposals for action and production-site-specific key manufacturing tasks can be shaped, which must be addressed in order to meet customer and market needs. As future research, it is suggested that the process of capturing order-winners and qualifiers be developed so that it is simple and streamlined at Wallac Oy. In addition, the defined strategy process should be integrated into PerkinElmer's SGS (Strategic Goal Setting) process, one of PerkinElmer's core management processes.
Abstract:
AIM: The study aimed to compare the rate of success and cost of anal fistula plug (AFP) insertion and endorectal advancement flap (ERAF) for anal fistula. METHOD: Patients receiving an AFP or ERAF for a complex single fistula tract, defined as involving more than a third of the longitudinal length of the anal sphincter, were registered in a prospective database. A regression analysis was performed of factors predicting recurrence and contributing to cost. RESULTS: Seventy-one patients (AFP 31, ERAF 40) were analysed. Twelve (39%) recurrences occurred in the AFP group and 17 (43%) in the ERAF group (P = 1.00). The median length of stay was 1.23 and 2.0 days (P < 0.001), respectively, and the mean cost of treatment was €5439 ± €2629 and €7957 ± €5905 (P = 0.021), respectively. On multivariable analysis, postoperative complications, underlying inflammatory bowel disease and fistula recurring after previous treatment were independent predictors of de novo recurrence. A length of hospital stay ≤ 1 day was the most significant independent contributor to lower cost (P = 0.023). CONCLUSION: Anal fistula plug and ERAF were equally effective in treating fistula-in-ano, but AFP gave a mean cost saving of €2518 per procedure compared with ERAF. The higher cost of ERAF is due to a longer median length of stay.
Abstract:
This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of space, time and space-time clusters of environmental point data. The clustering methods developed were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can normally be modelled as stochastic point processes, where each event, e.g. a forest fire ignition point, is characterised by its spatial location and occurrence in time. Additionally, information such as burned area, ignition causes, land use, and topographic, climatic and meteorological features can also be used to characterise the studied phenomenon. The space-time pattern characterisation thereby represents a powerful tool to understand the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures for both global (e.g. the Morisita Index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. scan statistics) analysis. Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are of global character and do not consider the complex spatial constraints, high variability and multivariate nature of the events.
Therefore, we proposed a statistical framework that takes into account the complexities of the geographical space where phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with differently constrained geographical spaces, hence assessing the relative degree of clustering of the real distribution. Moreover, exclusively for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructure, and one for predicting fire ignition susceptibility. In this regard, the main objective of this thesis was to carry out basic statistical/geospatial research with a strong applied component, in order to analyse and describe complex phenomena as well as to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular forest fire occurrences. Thus, this thesis responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in five scientific journals. National and international collaborations were also established and successfully accomplished.
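Of the global measures listed above, the Morisita Index is the simplest to sketch: partition the study area into Q equal quadrats, count the events n_i in each, and compute I = Q * sum(n_i * (n_i - 1)) / (N * (N - 1)); values near 1 indicate complete spatial randomness and larger values indicate clustering. A minimal illustration, not the framework developed in the thesis; the function name and the regular-grid binning are assumptions.

```python
import numpy as np

def morisita_index(points, quadrats_per_side):
    """Morisita Index of dispersion for 2-D point data.

    points: array of shape (N, 2) with x, y coordinates
    quadrats_per_side: the area is split into a q x q regular grid
    """
    pts = np.asarray(points, dtype=float)
    # Count events per quadrat with a regular 2-D histogram
    counts, _, _ = np.histogram2d(pts[:, 0], pts[:, 1],
                                  bins=quadrats_per_side)
    n = counts.ravel()
    N = n.sum()
    Q = n.size
    # I = Q * sum n_i (n_i - 1) / (N (N - 1))
    return Q * np.sum(n * (n - 1)) / (N * (N - 1))
```

In practice the index is computed over a range of quadrat sizes to probe clustering at different scales, which is how multi-scale diagrams of this measure are built.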
Abstract:
BACKGROUND: Abdominoperineal resection (APR) following radiotherapy is associated with a high rate of perineal wound complications. The anterolateral thigh (ALT) flap, combined with the vastus lateralis (VL) muscle, can cover complex perineal and pelvic anteroposterior defects. Here it is used for the first time transabdominally through the pelvis and the perineum (TAPP) in the infero-posterior direction; this technique is described and illustrated in this study. METHODS: Among over 90 patients who underwent perineal reconstruction between May 2004 and June 2011, six patients presented with high-grade tumours invading the perineum, pelvis and sacrum, resulting in a continuous anteroposterior defect. ALT + VL TAPP reconstructions were performed after extended APR and, subsequently, sacrectomy. Patients were examined retrospectively to determine demographics, operative time, complications (general and flap-related), time to complete healing and length of hospital stay. Long-term flap coverage, flap volume stability and functional and aesthetic outcomes were assessed. RESULTS: Mean operating time of the reconstruction was 290 min. No deaths occurred. One patient presented partial flap necrosis. Another patient presented a new wound dehiscence after flap healing, due to secondary skin dissemination of the primary tumour. On volumetric flap analysis of serial post-operative CT scans, no significant flap atrophy was observed. All flaps fully covered the defects. No late complications such as fistulas or perineal hernias occurred. Donor-site recovery was uneventful, with no functional deficits. CONCLUSIONS: Transabdominal use of the ALT + VL flap is an innovative method to reconstruct exceptionally complex perineal and pelvic defects extending up to the lower back. This flap guarantees superior bulk, obliterating all pelvic dead space, with the fascia lata (FL) supporting the pelvic floor.
Abstract:
The most suitable method for estimating size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size. It takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches to estimating the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, where the pdf is estimated using some kind of local evaluation. Exponential, generalized Pareto, normal and log-normal distributions were used to generate simulated samples using parameters estimated from real samples. The nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which has additional advantages: the same size diversity value is obtained when using original sizes or log-transformed data, and size measurements of different dimensionality (lengths, areas, volumes or biomasses) may be compared directly by simply adding ln k, where k is the dimensionality (1, 2, or 3, respectively). Thus kernel estimation, after data standardization by division by the sample geometric mean, emerges as the most reliable and generalizable method of size diversity evaluation.
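The kernel-based estimate favoured above can be sketched in a few lines: standardize the sizes by the sample geometric mean, estimate the pdf with a Gaussian kernel, and integrate -p ln p numerically. A minimal sketch assuming SciPy's gaussian_kde with its default bandwidth; the grid density and padding are arbitrary choices, not the authors' settings.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.integrate import trapezoid

def size_diversity(sizes, grid_points=2000):
    """Shannon size diversity of a continuous size variable:
    -integral of p(x) ln p(x) dx, with p estimated by a Gaussian kernel."""
    x = np.asarray(sizes, dtype=float)
    # Standardize by the sample geometric mean, as the abstract proposes;
    # this makes the result invariant to rescaling the raw sizes
    x = x / np.exp(np.mean(np.log(x)))
    kde = gaussian_kde(x)
    pad = 3.0 * x.std()
    grid = np.linspace(x.min() - pad, x.max() + pad, grid_points)
    p = np.clip(kde(grid), 1e-300, None)  # avoid log(0)
    return -trapezoid(p * np.log(p), grid)
```

The geometric-mean standardization is what makes diversities of lengths, areas and volumes comparable after the ln k shift described above.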
Abstract:
There is an increasing reliance on computers to solve complex engineering problems. This is because computers, in addition to supporting the development and implementation of adequate and clear models, can especially minimize the financial support required. The ability of computers to perform complex calculations at high speed has enabled the creation of highly complex systems to model real-world phenomena. The complexity of fluid dynamics problems makes it difficult or impossible to solve the equations for an object in a flow exactly. Approximate solutions can be obtained by constructing and measuring prototypes placed in a flow, or by numerical simulation. Since the use of prototypes can be prohibitively time-consuming and expensive, many have turned to simulations to provide insight during the engineering process, since the simulation setup and parameters can be altered much more easily than in a real-world experiment. The objective of this research work is to develop numerical models for different suspensions (fiber suspensions, blood flow through microvessels and branching geometries, and magnetic fluids), as well as fluid flow through porous media. The models have merit as scientific tools and also have practical applications in industry. Most of the numerical simulations were done with the commercial software Fluent, with user-defined functions added to apply a multiscale method and a magnetic field. The results from the simulation of fiber suspensions elucidate the physics behind the break-up of a fiber floc, opening the possibility of developing a meaningful numerical model of fiber flow. The simulation of blood movement from an arteriole to a venule via a capillary showed that the VOF-based model can successfully predict the deformation and flow of RBCs in an arteriole; furthermore, the result corresponds to the experimental observation that the RBC deforms during its movement.
The concluding remarks provide a sound methodology and a mathematical and numerical framework for the simulation of blood flow in branching geometries. Analysis of the ferrofluid simulations indicates that the magnetic Soret effect can be even stronger than the conventional one, and that its strength depends on the strength of the magnetic field, as confirmed experimentally by Völker and Odenbach. It was also shown that when a magnetic field is perpendicular to the temperature gradient, there is an additional increase in heat transfer compared with cases where the magnetic field is parallel to the temperature gradient. In addition, statistical evaluation (the Taguchi technique) of the magnetic fluids showed that the temperature and the initial concentration of the magnetic phase make the maximum and minimum contributions to thermodiffusion, respectively. In the simulation of flow through porous media, the dimensionless pressure drop was studied at different Reynolds numbers, based on pore permeability and interstitial fluid velocity. The results agreed well with the correlation of Macdonald et al. (1979) over the range of flow Reynolds numbers studied. Furthermore, the calculated dispersion coefficients in the cylinder geometry were found to be in agreement with those of Seymour and Callaghan.
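The Macdonald et al. (1979) correlation cited above is an Ergun-type relation; a commonly quoted form uses A = 180 for the viscous term and B of about 1.8 for the inertial term (smooth particles). A sketch of the pressure gradient it predicts, with hypothetical variable names (u: superficial velocity, d_p: particle diameter, eps: porosity); the constants are the commonly cited values, not taken from the thesis.

```python
def macdonald_pressure_gradient(u, d_p, eps, rho, mu, A=180.0, B=1.8):
    """Ergun-type pressure gradient (Pa/m) through a packed bed,
    with the constants proposed by Macdonald et al. (1979):
    dp/dx = A*mu*(1-eps)^2/(eps^3*d_p^2)*u + B*rho*(1-eps)/(eps^3*d_p)*u^2
    """
    viscous = A * mu * (1.0 - eps) ** 2 / (eps ** 3 * d_p ** 2) * u
    inertial = B * rho * (1.0 - eps) / (eps ** 3 * d_p) * u ** 2
    return viscous + inertial
```

At low Reynolds numbers the linear (Darcy-like) term dominates, which is why dimensionless pressure-drop comparisons of the kind described above collapse onto the correlation in that regime.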
Abstract:
Needle trap devices (NTDs) are a relatively new and promising tool for headspace (HS) analysis. In this study, a dynamic HS sampling procedure is evaluated for the determination of volatile organic compounds (VOCs) in whole blood samples. A full factorial design was used to evaluate the influence of the number of cycles and the incubation time, and it is demonstrated that the controlling factor in the process is the number of cycles. A mathematical model can be used to determine the number of cycles required to adsorb a prefixed amount of the VOCs present in the HS phase, provided quantitative adsorption is reached in each cycle. The matrix effect is of great importance when complex biological samples, such as blood, are analyzed. Evaluation of the salting-out effect showed a significant improvement in the volatilization of VOCs to the HS in this type of matrix. Moreover, a 1:4 (blood:water) dilution is required to obtain quantitative recoveries of the target analytes when external calibration is used. The method developed gives detection limits in the 0.020–0.080 μg L-1 range (0.1–0.4 μg L-1 for undiluted blood samples) with appropriate repeatability values (RSD < 15% at the high level and < 23% at the LOQ level). The figures of merit of the method can be improved by using a smaller phase ratio (i.e., an increase in the blood volume and a decrease in the HS volume), which leads to lower detection limits, better repeatability and greater sensitivity. Twenty-eight blood samples were evaluated with the proposed method and the results agree with those reported in other studies. Benzene was the only target compound that showed significant differences between the blood levels detected in non-smoking and smoking volunteers.
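The abstract does not state the cycle model explicitly. A common assumption for dynamic headspace sampling with quantitative per-cycle adsorption is that each cycle removes a constant fraction beta of the analyte remaining in the headspace, so the cumulative recovery after n cycles is 1 - (1 - beta)^n. Under that assumed model, the number of cycles needed for a prefixed recovery can be solved directly:

```python
import math

def cycles_needed(target_recovery, per_cycle_fraction):
    """Smallest n such that 1 - (1 - beta)^n >= target_recovery,
    assuming each cycle extracts a constant fraction beta of the
    analyte remaining in the headspace (an assumed model)."""
    if not (0.0 < target_recovery < 1.0 and 0.0 < per_cycle_fraction < 1.0):
        raise ValueError("fractions must lie strictly between 0 and 1")
    return math.ceil(math.log(1.0 - target_recovery)
                     / math.log(1.0 - per_cycle_fraction))
```

For example, if each cycle captured half of the remaining analyte, reaching 90% recovery would take four cycles under this model.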
Abstract:
Complexation between acyclovir (ACV), an antiviral drug used for the treatment of herpes simplex virus infection, and beta-cyclodextrin (beta-CD) was studied in solution and in the solid state. Complexation in solution was evaluated using solubility studies and nuclear magnetic resonance spectroscopy (¹H-NMR). In the solid state, X-ray diffraction, differential scanning calorimetry (DSC), thermal gravimetric analysis (TGA) and dissolution studies were used. Solubility studies suggested the existence of a 1:1 complex between ACV and beta-CD, and ¹H-NMR spectroscopy confirmed a 1:1 stoichiometry. Powder X-ray diffraction indicated that ACV exists in a semicrystalline state in the complexed form with beta-CD. DSC studies showed the existence of a complex of ACV with beta-CD, and the TGA studies confirmed the DSC results. The solubility of ACV in the solid complexes was studied by the dissolution method, and the complexed drug was found to be much more soluble than the uncomplexed drug.
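Phase-solubility studies of this kind are commonly analysed with the Higuchi-Connors treatment: for a linear (A_L-type) diagram, the 1:1 stability constant follows from the slope of the diagram and the intrinsic solubility S0 of the drug. The formula below is the standard one, but it is an assumption that the authors used it; the numbers in the test are illustrative, not the paper's data.

```python
def stability_constant_1_1(slope, s0):
    """Higuchi-Connors 1:1 stability constant (L/mol) from a linear
    A_L-type phase-solubility diagram:
        K_1:1 = slope / (S0 * (1 - slope))
    slope: slope of drug solubility vs. cyclodextrin concentration
    s0: intrinsic solubility of the drug (mol/L)
    """
    if not 0.0 < slope < 1.0:
        raise ValueError("A_L-type 1:1 analysis requires 0 < slope < 1")
    return slope / (s0 * (1.0 - slope))
```

A slope above 1 would instead suggest higher-order complexation, which is why the guard rejects it.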
Abstract:
The spectrophotometric determination of Cd(II) using a flow injection system equipped with a solid-phase reactor for cadmium preconcentration and on-line reagent preparation is described. It is based on the formation of a dithizone-Cd complex in basic medium. The calibration curve is linear between 6 and 300 µg L-1 Cd(II), with a detection limit of 5.4 µg L-1, an RSD of 3.7% (10 replicates in duplicate) and a sampling frequency of 11.4 h-1. The proposed method was satisfactorily applied to the determination of Cd(II) in surface, well and drinking waters.
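A linear calibration of this kind, with a 3-sigma detection-limit criterion (the abstract does not state which LOD convention was used, so this is an assumption), can be sketched as follows; the numbers in the test are synthetic, not the paper's data.

```python
import numpy as np

def linear_calibration(conc, signal, blank_sd):
    """Fit a least-squares calibration line signal = slope*conc + intercept
    and report the 3*sigma/slope detection limit (a common convention)."""
    slope, intercept = np.polyfit(conc, signal, 1)
    lod = 3.0 * blank_sd / slope
    return slope, intercept, lod
```

The same slope also converts any measured signal back to concentration within the linear range.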
Abstract:
A flow injection method for the quantitative analysis of ketoconazole in tablets, based on the reaction with iron(III) ions, is presented. Ketoconazole forms a red complex with iron ions in an acid medium, with maximum absorbance at 495 nm. The detection limit was estimated to be 1×10-4 mol L-1, the quantitation limit is about 3×10-4 mol L-1, and approximately 30 determinations can be performed in an hour. The results were compared with those obtained with a reference HPLC method. Statistical comparisons were made using Student's t procedure and the F test. Complete agreement was found at the 95% confidence level between the proposed flow injection and HPLC procedures. The two methods have similar precision: the mean relative standard deviation was ca. 1.2% for HPLC and ca. 1.6% for FIA.
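The Student's t and F comparisons used above follow a standard scheme: an F test on the variances decides whether the pooled (equal-variance) t test is appropriate, then the t test compares the means. A generic sketch with SciPy; the data in the test are illustrative, not the paper's results.

```python
import numpy as np
from scipy import stats

def compare_methods(a, b, alpha=0.05):
    """Compare two sets of replicate results (e.g. FIA vs. HPLC)
    with a two-sided F test on variances and a two-sample t test."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    f = np.var(a, ddof=1) / np.var(b, ddof=1)
    dfa, dfb = len(a) - 1, len(b) - 1
    # Two-sided p-value for the variance-ratio F test
    p_f = 2.0 * min(stats.f.cdf(f, dfa, dfb), stats.f.sf(f, dfa, dfb))
    # Pooled t test if variances agree, Welch's t test otherwise
    t, p_t = stats.ttest_ind(a, b, equal_var=(p_f > alpha))
    return {"F": f, "p_F": p_f, "t": t, "p_t": p_t,
            "means_agree": p_t > alpha}
```

"Complete agreement at the 95% confidence level" corresponds to both p-values exceeding 0.05 in this scheme.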
Abstract:
Cefdinir has a broad spectrum of activity and high prescription rates; hence its counterfeiting seems imminent. We have proposed a simple, fast, selective and non-extractive spectrophotometric method for the content assay of cefdinir in formulations. The method is based on the complexation of cefdinir and Fe under reducing conditions in a buffered medium (pH 11) to form a magenta-colored donor-acceptor complex (λmax = 550 nm; apparent molar absorptivity = 3720 L mol-1 cm-1). No other cephalosporins, penicillins or common excipients interfere under the test conditions. Beer's law is obeyed in the concentration range 8-160 µg mL-1.
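With the reported apparent molar absorptivity (3720 L mol-1 cm-1 at 550 nm), Beer's law converts a measured absorbance directly into a cefdinir concentration. The sketch assumes a 1 cm path length and a molar mass of about 395.4 g/mol for cefdinir; both values are assumptions, not stated in the abstract.

```python
def cefdinir_conc_ug_per_ml(absorbance, path_cm=1.0,
                            molar_abs=3720.0, molar_mass=395.4):
    """Beer's law, A = epsilon * b * c, solved for concentration.
    Returns the concentration in ug/mL (numerically equal to mg/L)."""
    c_mol_per_l = absorbance / (molar_abs * path_cm)
    return c_mol_per_l * molar_mass * 1000.0  # mol/L -> mg/L = ug/mL
```

An absorbance of 1.0 corresponds to roughly 106 ug/mL under these assumptions, comfortably inside the reported 8-160 ug/mL linear range.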