1000 results for SPECTROPHOTOMETRIC METHODS


Relevance: 20.00%

Publisher:

Abstract:

This thesis examines and explains the procedure used to redesign the attachment of permanent magnets to the surface of the rotor of a synchronous generator. The methodology followed, from the existing assembly to the final proposed innovation, was based on the systematic design approach. This meant that a series of steps first had to be predefined as a frame of reference, later used to compare and select proposals and finally to obtain the innovation that was sought. Firstly, a series of patents was used as the background for the upcoming ideas. To this end, several different patented assemblies were found and categorized according to the main element on which this thesis focuses, namely the attachment element or method. After establishing the technological frame of reference, a brainstorming session was held to obtain as many ideas as possible. These ideas were then classified regardless of their degree of complexity or usability, since at this stage the quantity of ideas was what mattered. Subsequently, they were compared and evaluated from different points of view. The comparison and evaluation were based on a requirement list, which established the main needs that the design had to fulfill. Selection could then be carried out by grading each idea against these requirements, so that the idea or ideas best fulfilling them could be identified. Once all of the ideas had been compared and evaluated, the best or most suitable ones were singled out. Finally, the selected idea or ideas were analyzed in depth and a number of improvements were made. Consequently, a final idea was refined and made more suitable in terms of its performance, manufacture, and life cycle assessment. In the end, the design process thus provided a solution to the problem identified at the beginning.
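The requirement-list grading step described above can be sketched as a small weighted scoring matrix. All requirement names, weights and grades below are hypothetical illustrations, not values from the thesis:

```python
# Hypothetical sketch of the selection step: each candidate attachment idea
# is graded (1-5) against weighted requirements, and the idea with the
# highest weighted score is selected. Names, weights and grades are invented.

REQUIREMENTS = {              # requirement -> relative weight (sums to 1)
    "holding_force": 0.4,
    "manufacturability": 0.3,
    "life_cycle": 0.3,
}

IDEAS = {                     # idea -> grade per requirement (illustrative)
    "glue_bonding":     {"holding_force": 3, "manufacturability": 5, "life_cycle": 2},
    "dovetail_groove":  {"holding_force": 5, "manufacturability": 3, "life_cycle": 4},
    "retaining_sleeve": {"holding_force": 4, "manufacturability": 4, "life_cycle": 4},
}

def weighted_score(grades: dict) -> float:
    """Combine per-requirement grades into one weighted score."""
    return sum(REQUIREMENTS[req] * grade for req, grade in grades.items())

best_idea = max(IDEAS, key=lambda name: weighted_score(IDEAS[name]))
```

The same table can be extended with further requirements or graded by several evaluators and averaged, without changing the selection logic.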

Relevance: 20.00%

Publisher:

Abstract:

The general striving to reduce the number of municipal landfills and to increase the reuse and recycling of waste-derived materials across the EU feeds the debate on the feasibility and rationality of waste management systems. A substantial decrease in the volume and mass of landfill-disposed waste flows can be achieved by directing suitable waste fractions to energy recovery. Global fossil energy supplies are becoming ever more valuable and expensive, and efforts are being made to save fossil fuels. Waste-derived fuels offer one potential partial solution to two different problems. First, waste that cannot feasibly be re-used or recycled is utilized in the energy conversion process in accordance with the EU's Waste Hierarchy. Second, fossil fuels can be saved for purposes other than energy production, mainly as transport fuels. This thesis presents the principles of assessing the most sustainable system solution for an integrated municipal waste management and energy system. The assessment process includes: · formation of a SISMan (Simple Integrated System Management) model of an integrated system covering mass, energy and financial flows, and · formation of a MEFLO (Mass, Energy, Financial, Legislational, Other decision-support data) decision matrix according to the selected decision criteria, including essential and optional criteria. The methods are described and theoretical examples of their utilization are presented in the thesis. The assessment process involves the selection of different system alternatives (process alternatives for the treatment of different waste fractions) and comparison between the alternatives. The first of the two novel aspects of the presented methods is the perspective selected for the formation of the SISMan model. Normally, waste management and energy systems are operated separately according to the targets and principles set for each system.
In the thesis, the waste management and energy supply systems are considered as one larger integrated system with the primary target of serving the customers, i.e. citizens, as efficiently as possible in the spirit of sustainable development, including the following requirements: · reasonable overall costs, including waste management costs and energy costs; · minimum environmental burdens caused by the integrated waste management and energy system, taking the requirement above into account; and · social acceptance of the selected waste treatment and energy production methods. The integrated waste management and energy system is described by forming a SISMan model comprising the three flows of the system: energy, mass and financial flows. By defining these three types of flows for an integrated system, the factor results needed in the decision-making process for selecting waste treatment processes for different waste fractions can be calculated. The model and its results form a transparent description of the integrated system under discussion. The MEFLO decision matrix is formed from the results of the SISMan model, combined with additional data including e.g. environmental restrictions and regional aspects. System alternatives which do not meet the requirements set by legislation can be deleted from the comparisons before any closer numerical consideration. The second novel aspect of this thesis is the three-level ranking method for combining the factor results of the MEFLO decision matrix. As a result of the MEFLO decision matrix, a transparent ranking of the different system alternatives, including the selection of treatment processes for different waste fractions, is achieved. SISMan and MEFLO are methods meant to be utilized in municipal decision-making processes concerning waste management and energy supply as simple, transparent and easy-to-understand tools.
The methods can be utilized in the assessment of existing systems, and particularly in the planning processes of future regional integrated systems. The principles of SISMan and MEFLO can also be utilized in other environments where synergies from integrating two (or more) systems can be obtained. The SISMan flow model and the MEFLO decision matrix can be formed with or without any applicable commercial or free-of-charge tool/software. SISMan and MEFLO are not bound to any libraries or databases containing process information, such as the emission data libraries utilized in life cycle assessments.
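The two-stage idea behind the decision matrix, screening out alternatives that fail legislation before any numerical ranking on weighted factor results, can be sketched as follows. This is an illustrative reconstruction, not the thesis's actual MEFLO implementation, and all names and figures are invented:

```python
# Illustrative two-stage decision-matrix sketch: (1) drop alternatives that
# fail the legislative requirement; (2) rank the rest on weighted, normalised
# factor results. All alternatives, weights and figures are invented.

alternatives = {
    # name: (meets legislation, cost EUR/t, emissions kgCO2/t, acceptance 0-1)
    "landfill":        (False, 40.0, 900.0, 0.2),
    "incineration":    (True,  70.0, 400.0, 0.6),
    "co-incineration": (True,  55.0, 450.0, 0.7),
}
WEIGHTS = {"cost": 0.5, "emissions": 0.3, "acceptance": 0.2}

def score(cost, emissions, acceptance, best_cost, best_em):
    """Higher is better; cost and emissions are normalised against the best."""
    return (WEIGHTS["cost"] * best_cost / cost
            + WEIGHTS["emissions"] * best_em / emissions
            + WEIGHTS["acceptance"] * acceptance)

legal = {n: v for n, v in alternatives.items() if v[0]}   # legislative screen
best_cost = min(v[1] for v in legal.values())
best_em = min(v[2] for v in legal.values())
ranking = sorted(legal,
                 key=lambda n: score(*legal[n][1:], best_cost, best_em),
                 reverse=True)
```

The point of the screen is that an illegal alternative never enters the numerical comparison, regardless of how well it would score.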

Relevance: 20.00%

Publisher:

Abstract:

Plants constitute an excellent ecosystem for microorganisms. The environmental conditions they offer differ considerably between the highly variable aerial plant parts and the more stable root system. Microbes interact with plant tissues and cells with varying degrees of dependence. From the microbial ecology point of view, however, the most interesting are the specific interactions developed by plant-beneficial (either non-symbiotic or symbiotic) and pathogenic microorganisms. Plants, like humans and other animals, also become sick, but they have evolved a sophisticated defense response against microbes based on a combination of constitutive and inducible responses, which can be localized or spread throughout plant organs and tissues. The response is mediated by several messenger molecules that activate pathogen-responsive genes coding for enzymes or antimicrobial compounds, and it produces compounds that are less sophisticated and specific than the immunoglobulins of animals. However, the response specifically detects a type of pathogen protein intracellularly, based on a gene-for-gene recognition system, triggering a biochemical attack and programmed cell death. Several implications for the management of plant diseases derive from knowledge of the basis of the specificity of plant-bacteria interactions. New biotechnological products are currently being developed based on stimulation of the plant defense response, and on the use of plant-beneficial bacteria for the biological control of plant diseases (biopesticides) and for plant growth promotion (biofertilizers).

Relevance: 20.00%

Publisher:

Abstract:

Etiological diagnosis of community-acquired pneumonia in adult patients using rapid microbiological methods. Background. Pneumonia is a serious illness affecting about 60,000 adults in Finland annually. Although its treatment has improved, the disease still carries a significant mortality of 6-15%. Identifying the microbes that cause lower respiratory tract infections also remains challenging. Aims. The aim of this work was to study the etiology of pneumonia in adult patients treated at Turku University Hospital and to evaluate the usefulness of new rapid microbiological methods in detecting the causative agent. Materials. The material of Studies I and III comprised 384 pneumonia patients treated on the infectious diseases ward of Turku University Hospital. Study I investigated the causative microbes of pneumonia using, in addition to conventional methods, rapid methods based on antigen detection and PCR. Study II covered a subgroup of 231 patients whose throat swab samples were examined for the presence of rhinoviruses and enteroviruses. In Study III, the patients' plasma C-reactive protein (CRP) concentration was measured during the first five days of hospitalization, and extensive statistical analyses were used to assess the usefulness of CRP in evaluating disease severity and predicting the development of complications. In Study IV, the expression of neutrophil surface receptors was determined from samples taken on admission from 68 pneumonia patients. Study V analysed the laboratory results of bronchoalveolar lavage (BAL) samples taken from pneumonia patients on internal medicine wards in 1996-2000.
Results. A causative agent was found in 209 patients, and 230 causative microbes were identified in total. Of these, 135 (58.7%) were detected by antigen detection or PCR methods, and the majority, 95 (70.4%), solely by these rapid methods. A respiratory virus was detected by antigen detection in 11.1% of the pneumonia patients, most often in patients with severe pneumonia (20.3%). In the subgroup of 231 pneumonia patients, a picornavirus was detected by PCR in 19 (8.2%) patients. In this patient group, a respiratory virus was found in 47 (20%) patients in all, of whom 17 (36%) had a concurrent bacterial infection. CRP levels on admission were significantly higher in patients with severe pneumonia (PSI classes III-V) than in patients with mild pneumonia (PSI classes I-II) (p < 0.001). A CRP level above 100 mg/L four days after admission predicted a complication of pneumonia or a poor treatment response. Neutrophil complement receptor expression was significantly higher in patients with pneumococcal pneumonia than in patients with influenza pneumonia. Only one of 71 BAL samples (1.3%) showed diagnostic bacterial growth in quantitative culture. Even with the new methods, a causative agent was found in only 9.8% of the BAL samples. Conclusions. The new antigen detection and PCR methods allow the etiology of pneumonia to be established rapidly. In addition, using these methods the causative microbe was found in a considerably larger proportion of patients than with conventional methods alone. The usefulness of the rapid methods varied with disease severity. Respiratory viruses were found remarkably often in pneumonia patients, and the clinical picture of these patients was often severe. A high CRP level on admission can be used as an additional tool in assessing the severity of pneumonia, and CRP is particularly useful in assessing treatment response and the risk of developing complications.
Studying neutrophil complement receptor expression appears to be a promising rapid method for distinguishing bacterial from viral diseases. In patients receiving antimicrobial treatment, the findings of BAL examinations were scanty and only rarely affected treatment.

Relevance: 20.00%

Publisher:

Abstract:

Given how students' attention is attracted by the color changes of vegetable crude extracts as the pH of the medium varies, the use of these colors to demonstrate the principles of spectrophotometric acid-base titrations, with the crude extracts serving as indicators, is proposed. The experimental setup consisted of a simple spectrophotometer, a homemade flow cell and a pump to propel the fluids through the system. Students should be encouraged to choose the best wavelength for monitoring the color changes during the titration. Since the pH of the equivalence point depends on the system titrated, the wavelength must be chosen appropriately to follow these changes, demonstrating the importance of the correct choice of indicator. When compared with potentiometric results, errors as low as 2% were found using Rhododendron simsii (azalea) or Tibouchina granulosa (Glory tree, quaresmeira) as sources of the crude extracts.
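The wavelength-selection exercise described above amounts to finding the wavelength where the acid and base forms of the extract's pigment differ most in absorbance. A minimal sketch, with synthetic illustrative spectra rather than measured data:

```python
# Choose the monitoring wavelength as the one with the largest absorbance
# contrast between the acidic and basic forms of the indicator extract.
# The spectra below are invented, illustrative numbers only.

wavelengths = [450, 500, 530, 560, 600]     # nm
A_acid = [0.10, 0.35, 0.80, 0.40, 0.05]     # absorbance of the acidic form
A_base = [0.30, 0.20, 0.15, 0.55, 0.45]     # absorbance of the basic form

# absorbance contrast per wavelength
contrast = {wl: abs(a - b) for wl, a, b in zip(wavelengths, A_acid, A_base)}
best_wl = max(contrast, key=contrast.get)   # wavelength of maximum contrast
```

Monitoring at the maximum-contrast wavelength gives the sharpest absorbance jump at the equivalence point, which is exactly why the choice of indicator (and wavelength) matters.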

Relevance: 20.00%

Publisher:

Abstract:

Mixed methods research involves the combined use of quantitative and qualitative methods in the same research study, and it is becoming increasingly important in several scientific areas. The aim of this paper is to review and compare, through a mixed methods multiple-case study, the application of this methodology in three reputable behavioural science journals: the Journal of Organizational Behavior, Addictive Behaviors and Psicothema. A quantitative analysis was carried out to review all the papers published in these journals during the period 2003-2008 and classify them into two blocks, theoretical and empirical, with the latter further subdivided into three subtypes (quantitative, qualitative and mixed). A qualitative analysis determined the main characteristics of the mixed methods studies identified, in order to describe in more detail the ways in which the two methods are combined in terms of their purpose, priority, implementation and research design. From the journals selected, a total of 1,958 articles were analysed, the majority of which corresponded to empirical studies, with only a small number referring to research that used mixed methods. Nonetheless, mixed methods research appears in all the behavioural science journals studied within the period selected, showing a range of designs, among which the sequential equal-weight mixed methods research design stands out.

Relevance: 20.00%

Publisher:

Abstract:

Mixed methods research is becoming increasingly important in several scientific areas. The analysis of prevalence rates is a new line of research that has emerged in mixed methods research, and this methodological approach has so far been applied carefully in only a handful of journals. The purpose of this article was to analyse the prevalence of mixed methods research in interdisciplinary educational journals and to examine the main characteristics of the mixed methods articles identified. The study itself used a mixed methods approach: specifically, a partially mixed sequential equal status multiple-case study design with a development mixed methods purpose. Three educational journals in different disciplines were reviewed from 2005 to 2010 (Academy of Management Learning and Education, Educational Psychology Review, Journal of the Learning Sciences). The findings show differences among the journals in the prevalence rates and characteristics of the mixed methods studies.
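A prevalence rate in this line of research is simply the share of a journal's articles that are mixed methods studies. A minimal sketch, with invented counts for illustration:

```python
# Prevalence of mixed methods studies: percentage of empirical articles in a
# journal that use mixed methods. Counts below are invented illustrations,
# not the figures reported in the article.

def prevalence(mixed_count: int, empirical_total: int) -> float:
    """Prevalence of mixed methods studies, as a percentage."""
    return 100.0 * mixed_count / empirical_total

journal_counts = {  # journal -> (mixed methods articles, empirical articles)
    "Journal A": (12, 300),
    "Journal B": (4, 250),
}
rates = {j: prevalence(m, n) for j, (m, n) in journal_counts.items()}
```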

Relevance: 20.00%

Publisher:

Abstract:

Technology scaling has proceeded into dimensions in which the reliability of manufactured devices is becoming endangered. The decrease in reliability is a consequence of, among other things, physical limitations, the relative increase of variations, and decreasing noise margins. A promising solution for bringing the reliability of circuits back to a desired level is the use of design methods which introduce tolerance against possible faults in an integrated circuit. This thesis studies and presents fault tolerance methods for the network-on-chip (NoC), a design paradigm targeted at very large systems-on-chip. In a NoC, resources such as processors and memories are connected to a communication network, comparable to the Internet. Fault tolerance in such a system can be achieved at many abstraction levels. The thesis studies the origin of faults in modern technologies and explains their classification into transient, intermittent and permanent faults. A survey of fault tolerance methods is presented to demonstrate the diversity of available methods. Networks-on-chip are approached by exploring their main design choices: the selection of a topology, routing protocol, and flow control method. Fault tolerance methods for NoCs are studied at different layers of the OSI reference model. The data link layer provides a reliable communication link over a physical channel. Error control coding is an efficient fault tolerance method at this abstraction level, especially against transient faults. Error control coding methods suitable for on-chip communication are studied and their implementations presented. Error control coding loses its effectiveness in the presence of intermittent and permanent faults, so other solutions against these are presented. The introduction of spare wires and split transmissions is shown to provide good tolerance against intermittent and permanent errors, and their combination with error control coding is illustrated.
At the network layer, positioned above the data link layer, fault tolerance can be achieved through the design of fault tolerant network topologies and routing algorithms. Both of these approaches are presented in the thesis, together with realizations in both categories. The thesis concludes that an optimal fault tolerance solution contains carefully co-designed elements from different abstraction levels.
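As a generic illustration of error control coding of the kind used on data-link-layer channels, here is a textbook Hamming(7,4) encoder/decoder that corrects any single flipped bit. This is a standard classroom code, not the specific coding schemes studied in the thesis:

```python
# Textbook Hamming(7,4) single-error-correcting code: 4 data bits are
# protected by 3 parity bits; any one flipped bit in the 7-bit codeword
# can be located by the syndrome and corrected.

def encode(d):
    """d: 4 data bits -> 7-bit codeword [p1, p2, d1, p3, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    """Correct up to one flipped bit in c and return the 4 data bits."""
    p1, p2, d1, p3, d2, d3, d4 = c
    s1 = p1 ^ d1 ^ d2 ^ d4          # parity check over positions 1,3,5,7
    s2 = p2 ^ d1 ^ d3 ^ d4          # parity check over positions 2,3,6,7
    s3 = p3 ^ d2 ^ d3 ^ d4          # parity check over positions 4,5,6,7
    error_pos = s1 + 2 * s2 + 4 * s3  # 0 means no single-bit error detected
    c = list(c)
    if error_pos:
        c[error_pos - 1] ^= 1       # flip the erroneous bit back
    return [c[2], c[4], c[5], c[6]]
```

The syndrome directly names the faulty wire position, which is what makes such codes attractive for cheap on-chip correction of transient upsets.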

Relevance: 20.00%

Publisher:

Abstract:

Recent years have seen great advances in instrumentation technology. The amount of available data has been increasing owing to the simplicity, speed and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without proper analysis. This has been one of the reasons for the growing success of multivariate handling of such data. Industrial data are commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This places certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches to the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures, or partial least squares), but other methods should also be considered. The more advanced methods include multi-block modeling and nonlinear modeling. This thesis shows that the results of data analysis vary according to the modeling approach used, making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should differ from the case where the purpose of modeling is mostly to obtain information about the variables and the process. For industrial applicability it is essential that the methods are robust and sufficiently simple to apply. In this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. This thesis compares data analysis methods with data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries, and the results are compared to those from PLS and priority PLS.
The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry. The response has a nonlinear relation to the descriptor matrix, and the results are compared among linear modeling, polynomial PLS and nonlinear modeling using nonlinear score vectors.
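As a generic illustration of the projection methods mentioned, here is a minimal PCA sketch via the SVD in NumPy; the data matrix is a toy example, not industrial data:

```python
# Minimal PCA via SVD: mean-center the data matrix, decompose it, and read
# off scores, loadings and the variance explained per component. The data
# matrix X here is a small invented example.

import numpy as np

X = np.array([[2.0, 4.1, 1.0],
              [1.0, 2.0, 0.9],
              [3.0, 6.1, 1.2],
              [2.5, 5.0, 1.1]])

Xc = X - X.mean(axis=0)                  # mean-center each column
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                           # sample scores T = U * S
loadings = Vt                            # principal directions (one per row)
explained = s**2 / np.sum(s**2)          # fraction of variance per component
```

PLS follows the same scores-and-loadings logic but chooses directions to explain a response variable rather than the data variance alone.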

Relevance: 20.00%

Publisher:

Abstract:

The thermal stability of vegetable oils is an important factor that affects their quality. In this study, we investigated the thermal stability of oil and lecithin extracted from soybeans by two distinct processes: mechanical extraction (pressing) and physical extraction (solvent). Thermal analysis was used to obtain information about different methodologies of extraction. The physically extracted products proved more stable than those extracted mechanically. Raman and UV-Vis techniques were applied to underpin the discussion of process differences.

Relevance: 20.00%

Publisher:

Abstract:

Metaheuristic methods have become increasingly popular approaches to solving global optimization problems. From a practical viewpoint, it is often desirable to perform multimodal optimization, which enables the search for more than one optimal solution to the task at hand. Population-based metaheuristic methods offer a natural basis for multimodal optimization, and the topic has received increasing interest especially in the evolutionary computation community. Several niching approaches have been suggested to allow multimodal optimization using evolutionary algorithms. Most global optimization approaches, including metaheuristics, contain global and local search phases. The requirement to locate several optima sets additional requirements for the design of algorithms that are to be effective in both respects in the context of multimodal optimization. In this thesis, several different multimodal optimization algorithms are studied with regard to how their implementation of the global and local search phases affects their performance on different problems. The study concentrates especially on variations of the Differential Evolution algorithm and their capabilities in multimodal optimization. To separate the global and local search phases, three multimodal optimization algorithms are proposed, two of which hybridize Differential Evolution with a local search method. As the theoretical background behind the operation of metaheuristics is generally not thoroughly understood, the research relies heavily on experimental studies to establish the properties of different approaches. To obtain reliable experimental information, the experimental environment must be carefully chosen to contain appropriate and adequately varying problems. The available selection of multimodal test problems is, however, rather limited, and no general framework exists.
As a part of this thesis, such a framework for generating tunable test functions for experimentally evaluating different multimodal optimization methods is provided and used for testing the algorithms. The results demonstrate that an efficient local phase is essential for creating efficient multimodal optimization algorithms. Adding a suitable global phase has the potential to boost performance significantly, but a weak local phase may nullify the advantages gained from the global phase.
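The global-phase mechanics of Differential Evolution can be sketched in a few lines. This is a bare-bones, simplified DE on a standard multimodal test function (the 1-D Rastrigin function), with common textbook parameter values, not the thesis's algorithms or settings:

```python
# Simplified Differential Evolution (DE/rand/1) on the 1-D Rastrigin
# function, whose global minimum is at x = 0 surrounded by local minima.
# Parameters NP, F, CR are common textbook defaults, not thesis values.

import math
import random

def f(x):
    """1-D Rastrigin: global minimum 0 at x = 0, local minima near integers."""
    return x * x + 10 * (1 - math.cos(2 * math.pi * x))

random.seed(1)
NP, F, CR, LO, HI = 20, 0.5, 0.9, -5.0, 5.0
pop = [random.uniform(LO, HI) for _ in range(NP)]

for _ in range(200):                                  # generations
    for i in range(NP):
        a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        mutant = a + F * (b - c)                      # differential mutation
        trial = mutant if random.random() < CR else pop[i]  # crossover (1-D, simplified)
        trial = min(max(trial, LO), HI)               # clamp to the bounds
        if f(trial) <= f(pop[i]):                     # greedy selection
            pop[i] = trial

best = min(pop, key=f)
```

Note how this plain DE converges toward a single basin; the niching and hybrid local-search variants discussed in the thesis exist precisely to keep several basins populated at once.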

Relevance: 20.00%

Publisher:

Abstract:

Strategic development of distribution networks plays a key role in asset management in electricity distribution companies. Owing to the capital-intensive nature of the field and the long time span of companies' operations, the significance of a strategy is emphasised. A well-devised strategy combines awareness of the challenges posed by the operating environment with the future targets of the distribution company. Economic regulation, ageing infrastructure, scarcity of resources and tightening supply requirements, together with the challenges created by climate change, put pressure on the strategy work. On the other hand, technology development related to network automation and underground cabling helps to answer these challenges. This dissertation aims at developing process knowledge and establishing a methodological framework by which key issues related to network development can be addressed. Moreover, the work develops tools by which the effects of changes in the operating environment on the distribution business can be analysed in the strategy work. To this end, the work discusses certain characteristics of the distribution business and describes the strategy process at the level of principles. Further, the work defines the subtasks in the strategy process and presents the key elements in the strategy work and long-term network planning. The work delineates the factors having either a direct or indirect effect on strategic planning and development needs in the networks; in particular, outage costs constitute an important part of the economic regulation of the distribution business, reliability thus being a key driver in network planning. The dissertation describes the methodology and tools applied to cost and reliability analyses in the strategy work.
The work focuses on determining the techno-economic feasibility of different network development technologies; these feasibility surveys are linked to the economic regulation model of the distribution business, in particular from the viewpoint of reliability of electricity supply and allowed return. The work introduces the asset management system developed for research purposes and to support the strategy work, the calculation elements of the system and the initial data used in the network analysis. The key elements of this asset management system are utilised in the dissertation. Finally, the study addresses the stages of strategic decision-making and the compilation of investment strategies. Further, the work illustrates the implementation of strategic planning in an actual distribution company environment.
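The kind of techno-economic feasibility check described above often reduces to a present-value comparison: do the discounted outage-cost savings of a technology (say, underground cabling) exceed its extra investment over the planning period? A hedged sketch follows; all figures are hypothetical placeholders, not regulatory or dissertation values:

```python
# Present-value feasibility sketch: compare the discounted stream of avoided
# outage costs against the extra investment cost of a network technology.
# All numbers are invented placeholders for illustration.

def npv(annual_saving: float, years: int, rate: float) -> float:
    """Present value of a constant annual saving at discount rate `rate`."""
    return sum(annual_saving / (1 + rate) ** t for t in range(1, years + 1))

investment = 120_000.0     # EUR/km extra cost of cabling (hypothetical)
outage_saving = 9_000.0    # EUR/km/year avoided outage costs (hypothetical)
horizon, rate = 40, 0.05   # planning period (years) and interest rate

feasible = npv(outage_saving, horizon, rate) > investment
```

In a regulated setting the same comparison would also fold in the allowed return on the added network asset, which is why the feasibility surveys are tied to the regulation model.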

Relevance: 20.00%

Publisher:

Abstract:

Stratospheric ozone can be measured accurately using a limb scatter remote sensing technique in the UV-visible spectral region of solar light. The advantages of this technique include good vertical resolution and good daytime coverage of the measurements. In addition to ozone, UV-visible limb scatter measurements contain information about NO2, NO3, OClO, BrO and aerosols. Several satellite instruments currently scan the atmosphere continuously, measuring the UV-visible region of the spectrum, e.g., the Optical Spectrograph and Infrared Imager System (OSIRIS) launched on the Odin satellite in February 2001, and the Scanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY (SCIAMACHY) launched on Envisat in March 2002. Envisat also carries the Global Ozone Monitoring by Occultation of Stars (GOMOS) instrument, which also measures limb-scattered sunlight under bright limb occultation conditions; these conditions occur during daytime occultation measurements. The global coverage of the satellite measurements is far better than that of any other ozone measurement technique, but the measurements are still sparse in the spatial domain. Measurements over a given area are also repeated relatively rarely, while the composition of the Earth's atmosphere changes dynamically. Assimilation methods are therefore needed to combine the information of the measurements with an atmospheric model. In recent years, the focus of assimilation algorithm research has turned towards filtering methods. The traditional Extended Kalman filter (EKF) method takes into account not only the uncertainty of the measurements but also the uncertainty of the evolution model of the system. However, the computational cost of a full-blown EKF increases rapidly as the number of model parameters increases, so the EKF method cannot be applied directly to the stratospheric ozone assimilation problem.
The work in this thesis is devoted to the development of inversion methods for satellite instruments and of assimilation methods used with atmospheric models.
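The predict/update cycle that the EKF extends is easiest to see in the scalar case. The toy below tracks a constant state from noisy measurements and only illustrates how both model uncertainty and measurement uncertainty enter the estimate; it is not the thesis's assimilation scheme:

```python
# Scalar Kalman filter for a random-walk state: the model uncertainty Q
# inflates the covariance in the predict step, and the Kalman gain K blends
# prediction and measurement in the update step. Toy illustration only.

def kalman_step(x, P, z, Q, R):
    """One predict/update cycle for a scalar random-walk model."""
    x_pred, P_pred = x, P + Q            # predict: state kept, covariance grows
    K = P_pred / (P_pred + R)            # Kalman gain from the two variances
    x_new = x_pred + K * (z - x_pred)    # update toward the measurement z
    P_new = (1 - K) * P_pred             # updated (reduced) covariance
    return x_new, P_new

x, P = 0.0, 1.0            # initial estimate and its variance
Q, R = 1e-4, 0.25          # model-noise and measurement-noise variances
for z in [0.9, 1.1, 1.05, 0.98, 1.02]:   # noisy observations of a true value ~1.0
    x, P = kalman_step(x, P, z, Q, R)
```

In the EKF the scalars x and P become a state vector and a covariance matrix, and it is the size of that matrix that makes a full EKF too costly for a high-dimensional ozone field.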

Relevance: 20.00%

Publisher:

Abstract:

A UV spectrophotometric method was developed and validated, and a chromatographic method was adapted from the American Pharmacopeia, for the analysis of fluoxetine hydrochloride capsules. Ethanol was used as the solvent for the spectrophotometric method, with detection and determination at 276 nm. The chromatographic separation was carried out on a reversed-phase LC-8 column, with triethylamine buffer, stabilizer-free tetrahydrofuran and methanol (5:3.5:1.5), pH 6.0, as the mobile phase and detection at 227 nm. Both methods proved accurate, precise, robust and linear over the concentration ranges 100.00-300.00 µg/mL (spectrophotometric) and 40.00-80.00 µg/mL (chromatographic) of fluoxetine hydrochloride. The accuracy of the methods was evaluated by a recovery test and showed results between 98.89 and 101.10%.
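Two of the validation computations mentioned above, fitting a calibration line over the working range and computing percent recovery from a spiked sample, can be sketched as follows. The absorbance values are invented illustration data, not the study's measurements:

```python
# Linearity and recovery sketch: fit absorbance vs. concentration by ordinary
# least squares, then back-calculate a spiked sample and compute % recovery.
# All concentration/absorbance values below are invented for illustration.

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

conc = [100.0, 150.0, 200.0, 250.0, 300.0]   # ug/mL, working range
absb = [0.201, 0.304, 0.399, 0.502, 0.598]   # measured absorbances (invented)
slope, intercept = linfit(conc, absb)

def recovery(found: float, added: float) -> float:
    """Percent recovery of a known added amount."""
    return 100.0 * found / added

found_conc = (0.450 - intercept) / slope     # back-calculate a spiked sample
```

A recovery close to 100% over the range, as reported above (98.89-101.10%), is what demonstrates accuracy in this kind of validation.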

Relevance: 20.00%

Publisher:

Abstract:

The microbiological bioassay, UV spectrophotometry and HPLC methods for assaying gatifloxacin in tablets were compared. Validation parameters such as linearity, precision, accuracy, limit of detection and limit of quantitation were determined. Beer's law was obeyed in the ranges 4.0-14.0 μg/mL for the HPLC and UV spectrophotometric methods, and 4.0-16.0 μg/mL for the bioassay. All methods were reliable within acceptable limits for antibiotic pharmaceutical preparations, being accurate, precise and reproducible. The bioassay and HPLC are more specific than UV spectrophotometric analysis. The method to apply in routine analysis should be chosen considering cost, simplicity, equipment, solvents, speed, and application to large or small workloads.