940 results for generative and performative modeling
Abstract:
The equilibria, the spectra and the identities of the Cr(VI) species present in aqueous solution have long been an active subject of discussion in the literature. In this paper, three different chemometric methodologies are applied to sets of UV/Visible spectra of aqueous Cr(VI) solutions, in order to solve a chemical system for which there is no available information concerning either the composition of the samples or the spectra of the pure species. Imbrie Q-mode factor analysis, followed by varimax rotation and Imbrie oblique projection, was used to estimate the composition of Cr(VI) equilibrium solutions and, by combining these results with the k-matrix method, to obtain the pure spectra of the species. Evolving factor analysis and self-modeling curve resolution were used to confirm the number of species and the resolution of the system, respectively. Sets of 3.3×10⁻⁴ and 3.3×10⁻⁵ mol L⁻¹ Cr(VI) solutions were analyzed in the pH range from 1 to 12. Two factors were identified, related to the chromate ion (CrO₄²⁻) and the bichromate ion (HCrO₄⁻). The pK of the equilibrium was estimated as 5.8.
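The factor-analytic core of such a study, determining how many independent species contribute to a set of pH-dependent spectra, can be illustrated with a minimal synthetic sketch. The Gaussian band shapes and the two-species Henderson-Hasselbalch speciation below are assumptions for illustration, not the paper's data; only the pK value (5.8) and the two-factor outcome come from the abstract.

```python
import numpy as np

# Synthetic sketch: two hypothetical pure spectra (Gaussian bands) mixed
# according to a single acid-base equilibrium with pK = 5.8.
wl = np.linspace(250, 500, 120)                       # wavelengths, nm
pure = np.vstack([np.exp(-((wl - 350) / 30) ** 2),    # stand-in for HCrO4- band
                  np.exp(-((wl - 372) / 25) ** 2)])   # stand-in for CrO4(2-) band
pK = 5.8
pH = np.linspace(1, 12, 23)
frac_base = 1.0 / (1.0 + 10 ** (pK - pH))             # fraction of basic species
C = np.column_stack([1.0 - frac_base, frac_base])     # concentration profiles

D = C @ pure                                          # data matrix: one spectrum per pH

# The number of significant singular values of D equals the number of
# independently absorbing species, the quantity factor analysis estimates.
s = np.linalg.svd(D, compute_uv=False)
n_species = int(np.sum(s > 1e-8 * s[0]))
print(n_species)  # 2
```

In real data the cutoff between "significant" and "noise" singular values is the hard part; methods like evolving factor analysis address exactly that.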
Abstract:
Recent years have seen great advances in instrumentation technology. The amount of available data has been increasing due to the simplicity, speed and accuracy of current spectroscopic instruments. Most of these data are, however, meaningless without proper analysis. This has been one of the reasons for the ever-growing success of multivariate handling of such data. Industrial data are commonly not designed data; in other words, there is no exact experimental design, but rather the data have been collected as a routine procedure during an industrial process. This places certain demands on the multivariate modeling, as the selection of samples and variables can have an enormous effect. Common approaches to the modeling of industrial data are PCA (principal component analysis) and PLS (projection to latent structures, or partial least squares), but there are also other methods that should be considered. The more advanced methods include multi-block modeling and nonlinear modeling. In this thesis it is shown that the results of data analysis vary according to the modeling approach used, thus making the selection of the modeling approach dependent on the purpose of the model. If the model is intended to provide accurate predictions, the approach should be different than when the purpose of the modeling is mainly to obtain information about the variables and the process. For industrial applicability it is essential that the methods are robust and sufficiently simple to apply. In this way the methods and the results can be compared and an approach selected that is suitable for the intended purpose. In this thesis, data analysis methods are compared using data from different fields of industry. In the first two papers, the multi-block method is considered for data originating from the oil and fertilizer industries. The results are compared to those from PLS and priority PLS.
The third paper considers the applicability of multivariate models to process control for a reactive crystallization process. In the fourth paper, nonlinear modeling is examined with a data set from the oil industry. The response has a nonlinear relation to the descriptor matrix, and the results are compared between linear modeling, polynomial PLS and nonlinear modeling using nonlinear score vectors.
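The distinction between PCA and PLS that motivates the choice of modeling approach can be shown with a minimal sketch (synthetic data of my own construction, not the thesis's industrial data): PCA picks the direction of largest X-variance regardless of the response, while the first PLS weight vector is driven by covariance with y.

```python
import numpy as np

# Undesigned data: one high-variance variable irrelevant to y,
# one low-variance variable that actually drives y.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(scale=5.0, size=n)   # large variance, unrelated to y
x2 = rng.normal(scale=1.0, size=n)   # small variance, drives y
X = np.column_stack([x1, x2])
y = 3.0 * x2 + rng.normal(scale=0.1, size=n)

Xc = X - X.mean(axis=0)              # mean-center, as both methods require
yc = y - y.mean()

# PCA loading: top right-singular vector of the centered X matrix.
pca_dir = np.linalg.svd(Xc, full_matrices=False)[2][0]

# First PLS weight (the opening step of the NIPALS algorithm):
# normalized covariance of each X column with y.
w = Xc.T @ yc
pls_dir = w / np.linalg.norm(w)

print(np.abs(pca_dir))  # dominated by x1 (variance)
print(np.abs(pls_dir))  # dominated by x2 (relevance to y)
```

This is why, as the thesis argues, a model meant for prediction and a model meant for process understanding can call for different approaches.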
Abstract:
This work reviews the status and technical feasibility of using ethanol as a fuel for Solid Oxide Fuel Cells (SOFC), covering both external reforming and cells with direct utilization of ethanol. Based on this survey, both experimental results and mathematical modeling indicate the technical feasibility of power generation by ethanol-fed SOFCs, with cell units producing 450 mW/cm², sufficient for scale-up to large stationary plants. Quantitative assessments in the literature show this field to be promising for researchers and private-sector investment, as well as being a strategic technology for government policy in the short and long term.
Abstract:
Selected papers from the workshop on "Development of models and forest soil surveys for monitoring of soil carbon", Koli, Finland, April 5-9, 2006.
Abstract:
Agile software development has grown in popularity since the agile manifesto was declared in 2001. However, there is a strong belief that agile methods are not suitable for embedded, critical or real-time software development, even though multiple studies and cases show otherwise. This thesis presents a custom agile process that can be used in embedded software development. The presumed unfitness of agile methods for embedded software development has been based mainly on the perception that these methods provide no real control, no strict discipline and less rigorous engineering practices. One starting point is therefore to provide a light process with a disciplined approach to embedded software development. Agile software development has gained popularity because there are still big issues in software development as a whole. Projects fail due to schedule slips, budget overruns or failure to meet business needs. This does not change when talking about embedded software development. These issues are still valid, with multiple new ones arising from the quite complex and hard domain in which embedded software developers work. These issues are another starting point for this thesis. The thesis is based heavily on Feature Driven Development (FDD), a software development methodology that can be seen as a runner-up to the most popular agile methodologies. FDD as such is quite process oriented and lacks a few practices commonly considered extremely important in agile development methodologies. In order for FDD to gain acceptance in the software development community, it needs to be modified and enhanced. This thesis presents an improved custom agile process that can be used in embedded software development projects with sizes varying from 10 to 500 persons. This process is based on Feature Driven Development and, in suitable parts, on Extreme Programming, Scrum and Agile Modeling.
Finally, this thesis presents how the new process responds to the common issues in embedded software development. The process of creating the new process is evaluated in a retrospective, and guidelines for such process-creation work are introduced. These emphasize agility also in process development, through early and frequent deliveries and the teamwork needed to create a suitable process.
Abstract:
Earlier research has shown that strong experiences related to music (SEM) can occur in very different contexts and take on many different forms. Such experiences seem significant and have, among other things, reportedly had an effect on the individual's continuing relationship with music, which makes them interesting from a pedagogical point of view. Formal teaching situations, though, are under-represented in studies where people have been asked to describe strong experiences they have had in connection with music. The purpose of my thesis is to investigate what SEM may mean to pupils and teachers in lower secondary school (grades 7-9), and to inquire more deeply into the potential "space" for such experiences within school music education. On a comprehensive level, my ambition is to deepen the understanding of SEM as a possible element in pedagogical situations. Three empirical perspectives are employed: pupil, teacher and curriculum perspectives. The pupil perspective involved an analysis of written accounts by 166 fifteen-year-olds describing their own strong experiences. The teacher perspective involved studying 28 music teachers' conceptions of the purpose of teaching music in school, as well as their understanding of strong music experiences in the school context. Further, the teachers' descriptions of SEM they had had themselves were analysed. The curriculum perspective is reflected through a study of how music experience was represented in 24 local and 2 national curriculum texts for music. Grounded in a phenomenological-hermeneutical perspective, the material has been analysed both qualitatively and quantitatively. The results indicate that music education in school has the potential to become an arena for SEM, and that this can happen in relation to a multitude of activities and genres and take on many different expressions.
Only one pupil referred to a musical encounter in the classroom environment; all other experiences that occurred within the frame of school activity had taken place in other arenas (the school hall, public concert halls, and so on). Indeed, more than 98% of the descriptions concerned musical encounters in leisure-time contexts. The significance of SEM is further clarified by narrative constructions. SEM as a conception does not occur at the curriculum level; however, the analysis revealed a number of interesting "openings", which are illustrated. Even though all teachers displayed a fundamentally positive attitude towards the idea of regarding SEM as a feature of formal musical learning, it became clear that many teachers had never approached this theme from a pedagogical point of view before. Still, they proved to have an evident "familiarity" with the phenomenon, based on their own experiences of receptive and performative musical encounters. The possible space for strong musical experiences within school music education is specified through a detailed illustration of six specific themes derived from the teachers' reasoning. Furthermore, it is described through a mapping of the potential experiencing zone, constructed from the teachers' descriptions of educational aims.
Abstract:
Preference relations, and their modeling, have played a crucial role in both the social sciences and applied mathematics. A special category is that of cardinal preference relations, which are relations that also take into account the degree of preference. Preference relations play a pivotal role in most multi-criteria decision-making methods and in operational research. This thesis aims to present some recent advances in their methodology. There are a number of open issues in this field, and the contributions presented in this thesis can be grouped accordingly. The first issue regards the estimation of a weight vector given a preference relation. A new and efficient algorithm for estimating the priority vector of a reciprocal relation, i.e. a special type of preference relation, is presented. The same section contains a proof that twenty methods already proposed in the literature lead to unsatisfactory results, as they employ a conflicting constraint in their optimization model. The second area of interest concerns consistency evaluation, and it is possibly the kernel of the thesis. The thesis contains proofs that some indices are equivalent and that, therefore, some seemingly different formulae end up leading to the very same result. Moreover, some numerical simulations are presented. The section ends with some considerations on a new method for fairly evaluating consistency. The third matter regards incomplete relations and how to estimate missing comparisons. This section reports a numerical study of the methods already proposed in the literature and analyzes their behavior in different situations. The fourth and last topic proposes a way to deal with group decision making by connecting preference relations with social network analysis.
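The priority-vector estimation problem described above can be illustrated with one standard estimator from the pairwise-comparison literature, the row geometric-mean method (this is a generic sketch with an assumed example matrix, not the thesis's new algorithm).

```python
import numpy as np

# A reciprocal pairwise comparison matrix: a_ji = 1 / a_ij.
# This example is perfectly consistent, i.e. a_ij = w_i / w_j.
A = np.array([[1.0,  2.0, 4.0],
              [0.5,  1.0, 2.0],
              [0.25, 0.5, 1.0]])

# Row geometric means, normalized to sum to one, give the priority vector.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w = gm / gm.sum()
print(w)  # -> [4/7, 2/7, 1/7]
```

For a consistent matrix every reasonable estimation method recovers this same vector; the open issues the thesis addresses arise precisely when the matrix is inconsistent or incomplete.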
Abstract:
Book review
Abstract:
This dissertation examines knowledge and industrial knowledge creation processes. It looks at the way knowledge is created in industrial processes based on data, which is transformed into information and finally into knowledge. In the context of this dissertation, the main tools for industrial knowledge creation are different statistical methods. This dissertation strives to define industrial statistics. This is done using an expert opinion survey, which was sent to a number of industrial statisticians. The survey was conducted to create a definition for this field of applied statistics and to demonstrate the wide applicability of statistical methods to industrial problems. In this part of the dissertation, traditional methods of industrial statistics are introduced. As statistical methods are the main tool for knowledge creation, the basics of statistical decision making and statistical modeling are also included. The widely known Data-Information-Knowledge-Wisdom (DIKW) hierarchy serves as a theoretical background for this dissertation. The way that data is transformed into information, information into knowledge and knowledge finally into wisdom is used as a theoretical frame of reference. Some scholars have, however, criticized the DIKW model. Based on these different perceptions of the knowledge creation process, a new knowledge creation process based on statistical methods is proposed. In the context of this dissertation, data is the source of knowledge in industrial processes. Because of this, the mathematical categorization of data into continuous and discrete types is explained. Different methods for gathering data from processes are clarified as well. Two methods of data gathering are used in this dissertation: surveys and measurements. The enclosed publications provide examples of the wide applicability of statistical methods in industry. In these publications, data is gathered using surveys and measurements.
The enclosed publications have been chosen so that in each one, different statistical methods are employed in the analysis of the data. There are some similarities between the analysis methods used in the publications, but mainly different methods are used. Based on this dissertation, the use of statistical methods for industrial knowledge creation is strongly recommended. With statistical methods it is possible to handle large datasets, and different types of statistical analysis results can easily be transformed into knowledge.
Abstract:
In this Master's thesis, agent-based modeling is used to analyze maintenance-strategy-related phenomena. The main research question answered was: what does the agent-based model made for this study tell us about how different maintenance strategy decisions affect the profitability of equipment owners and maintenance service providers? Thus, the main outcome of this study is an analysis of how profitability can be increased in the industrial maintenance context. To answer that question, a literature review of maintenance strategy, agent-based modeling, and maintenance modeling and optimization was first conducted. This review provided the basis for the agent-based model, the construction of which followed a standard simulation modeling procedure. The simulation results from the agent-based model answered the research question. Specifically, the results of the modeling and this study are: (1) optimizing the point at which a machine is maintained increases profitability for the owner of the machine, and under certain conditions also for the maintainer; (2) time-based pricing of maintenance services leads to a zero-sum game between the parties; (3) value-based pricing of maintenance services leads to a win-win game between the parties, if the owners of the machines share a substantial amount of their value with the maintainers; and (4) error in machine condition measurement is a critical parameter in optimizing maintenance strategy, and there is real systemic value in having more accurate machine condition measurement systems.
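Finding (4), that measurement error matters to maintenance outcomes, can be illustrated with a toy condition-based maintenance simulation. Everything here (degradation rates, thresholds, noise levels) is an assumption of mine for illustration; it is not the thesis's agent-based model.

```python
import random

# A machine degrades each period; the owner performs preventive maintenance
# when the *measured* condition crosses a threshold. Noisy measurement can
# delay maintenance past the true failure point.
def run(measure_noise, threshold=8.0, fail_at=10.0, periods=10_000, seed=1):
    rng = random.Random(seed)
    wear, failures, services = 0.0, 0, 0
    for _ in range(periods):
        wear += rng.uniform(0.5, 1.5)                # stochastic degradation
        measured = wear + rng.gauss(0.0, measure_noise)
        if wear >= fail_at:                          # unplanned failure
            failures += 1
            wear = 0.0
        elif measured >= threshold:                  # preventive maintenance
            services += 1
            wear = 0.0
    return failures, services

f_accurate, _ = run(measure_noise=0.1)
f_noisy, _ = run(measure_noise=3.0)
print(f_accurate, f_noisy)   # noisier measurement produces more failures
```

Since failures are typically far costlier than planned service, the gap between the two runs is a rough proxy for the "systemic value" of more accurate condition measurement.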
Abstract:
Wastes and side streams in the mining industry, and various anthropogenic wastes, often contain valuable metals in such concentrations that their recovery may be economically viable. These raw materials are collectively called secondary raw materials. The recovery of metals from these materials is also environmentally favorable, since many of the metals, for example heavy metals, are hazardous to the environment. This has been noticed by legislative bodies, and strict regulations for handling both mining and anthropogenic wastes have been developed, mainly in the last decade. In the mining and metallurgy industry, important secondary raw materials include, for example, steelmaking dusts (recoverable metals e.g. Zn and Mo), zinc plant residues (Ag, Au, Ga, Ge, In) and waste slurry from Bayer-process alumina production (Ga, REE, Ti, V). Among anthropogenic wastes, waste electrical and electronic equipment (WEEE), including LCD screens and fluorescent lamps, is clearly the most important from a metals recovery point of view. Metals commonly recovered from WEEE include, for example, Ag, Au, Cu, Pd and Pt. In LCD screens indium, and in fluorescent lamps REEs, are possible target metals. Hydrometallurgical processing routes are highly suitable for the treatment of complex and/or low-grade raw materials, as secondary raw materials often are. These solid or liquid raw materials often contain large amounts of base metals, for example. Thus, in order to recover valuable metals present in small concentrations, highly selective separation methods, such as hydrometallurgical routes, are needed. In addition, hydrometallurgical processes are seen as more environmentally friendly, and they have lower energy consumption, than pyrometallurgical processes. In this thesis, solvent extraction and ion exchange are the most important hydrometallurgical separation methods studied.
Solvent extraction is a mainstream unit operation in the metallurgical industry for all kinds of metals, but for ion exchange, practical applications are not as widespread. However, ion exchange is known to be particularly suitable for dilute feed solutions and complex separation tasks, which makes it a viable option, especially for processing secondary raw materials. The recovery of valuable metals was studied with five different raw materials, which included liquid and solid side streams from metallurgical industries and WEEE. Recovery of high-purity (99.7%) In from LCD screens was achieved by leaching with H2SO4, extracting In and Sn into D2EHPA, and selectively stripping In into HCl. In was also concentrated in the solvent extraction stage from 44 mg/L to 6.5 g/L. Ge was recovered as a side product from two different base-metal process liquors with an N-methylglucamine-functional chelating ion exchange resin (IRA-743). Based on equilibrium and dynamic modeling, a mechanism for this moderately complex adsorption process was suggested. Eu and Y were leached with high yields (91 and 83%) by 2 M H2SO4 from a fluorescent lamp precipitate from a waste treatment plant. The waste also contained significant amounts of other REEs such as Gd and Tb, but these were not leached by common mineral acids under ambient conditions. Zn was selectively leached over Fe from steelmaking dusts with a controlled acidic leaching method, in which the pH was not allowed to go below 3 but was held as close as possible to it. Mo was also present in the other studied dust, and was leached with pure water more effectively than with the acidic methods. Good yield and selectivity in the solvent extraction of Zn were achieved with D2EHPA. However, Fe needs to be eliminated in advance, either by the controlled leaching method or, for example, by precipitation. A 100% pure Mo/Cr product was obtained with a quaternary ammonium salt (Aliquat 336) directly from the water leachate, without pH adjustment (pH 13.7).
A Mo/Cr mixture was also obtained from H2SO4 leachates with the hydroxyoxime LIX 84-I and trioctylamine (TOA), but the purities were 70% at most. With Aliquat 336, however, an over 99% pure mixture was again obtained. High selectivity for Mo over Cr was not achieved with any of the studied reagents. An Ag-NaCl solution was purified of divalent impurity metals with the aminomethylphosphonic-acid-functional Lewatit TP-260 ion exchange resin. A novel preconditioning method, named controlled partial neutralization, using conjugate bases of weak organic acids, was applied to control the pH in the column to avoid capacity losses and precipitation. Counter-current SMB was shown to be a better process configuration than either batch column operation or the cross-current operation conventionally used in the metallurgical industry. The raw materials used in this thesis were also evaluated from an economic point of view, and the precipitate from the waste fluorescent lamp treatment process was clearly shown to be the most promising.
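The stage-wise arithmetic behind solvent extraction flowsheets like these can be sketched with the standard cross-current formula: with distribution ratio D and organic-to-aqueous phase ratio r, the fraction extracted after n fresh-solvent contacts is 1 - (1 / (1 + D·r))^n. The D and r values below are assumed round numbers for illustration, not data from the thesis.

```python
# Generic cross-current solvent extraction estimate (textbook relation).
# D: distribution ratio (organic/aqueous concentration at equilibrium)
# r: organic-to-aqueous phase volume ratio
# n: number of successive contacts with fresh solvent
def extracted(D, r, n):
    return 1.0 - (1.0 / (1.0 + D * r)) ** n

# Example: a well-extracting metal, D = 10, equal phase volumes.
print(extracted(D=10.0, r=1.0, n=1))   # one contact: ~0.909 extracted
print(extracted(D=10.0, r=1.0, n=3))   # three contacts: ~0.999 extracted
```

The same relation explains why multiple counter-current stages, as in the SMB configuration mentioned above, can reach high recoveries even when a single contact cannot.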
Abstract:
Adaptive control systems are one of the most significant research directions in modern control theory. It is well known that every mechanical appliance's behavior depends noticeably on environmental changes, changes in operating-mode parameters and changes in the technical characteristics of internal functional devices. An adaptive controller involved in the control process reduces the influence of such changes. In spite of this, such control methods are seldom applied, owing to the specifics of controller design. The work presented in this paper shows the design process of an adaptive controller built by Lyapunov's function method for a hydraulic drive. The necessary calculations and the modeling were conducted with MATLAB® software, including Simulink® and the Symbolic Math Toolbox™. In this work, Jacobian linearization of the object's mathematical model was applied, and suitable reference models were derived based on Newton's characteristic polynomial. An intelligent, nonlinearity-adaptive algorithm for solving Lyapunov's equation was developed. The developed algorithm works properly, although the considered plant does not meet the requirements for operating with it. The results confirm that the application of adaptive systems significantly broadens the possibilities for using such devices and can be used to correct a system's dynamic behavior.
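The algebraic core of Lyapunov-based design, solving the continuous Lyapunov equation AᵀP + PA = -Q for a positive definite P, can be sketched in a few lines. This is a generic Python/NumPy illustration using the Kronecker-product identity (the paper itself used MATLAB, and the plant matrix below is an assumed stable example, not the hydraulic drive model).

```python
import numpy as np

# Assumed stable example plant matrix (eigenvalues -1 and -2), not the paper's model.
A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])
Q = np.eye(2)                         # any positive definite choice works

# vec(A^T P + P A) = (I (x) A^T + A^T (x) I) vec(P), with column-major vec.
n = A.shape[0]
M = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
P = np.linalg.solve(M, -Q.flatten(order="F")).reshape(n, n, order="F")

residual = A.T @ P + P @ A + Q        # should be numerically zero
eigs = np.linalg.eigvalsh((P + P.T) / 2.0)
print(P)
print(eigs)                           # all positive for a stable A: V(x) = x^T P x is a Lyapunov function
```

A positive definite P certifies stability of the linearized plant and supplies the Lyapunov function used to derive the adaptation law.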
Abstract:
In the theoretical part of this master's thesis, business process management and process modeling, and the benefits and challenges related to them, are discussed. The empirical part consists of two parts. The first is a case study about process management and modeling, and the second presents the outcomes of a modeling project that was executed for the employer of this thesis. In the project, the target mill unit's business processes were identified, a process map was drawn up and a process architecture for further use was established. In the outcomes of the study, the challenges and possibilities of process management and modeling, as well as possible reasons for prejudices against them, are discussed on the basis of the case study and the modeling project. From the research of this thesis, a framework for successful process modeling is established. The framework highlights the four most important sectors that an organization should evaluate before and during a modeling project.
Abstract:
This project is a deconstructive discourse analysis of smart girlhood. From a feminist poststructural framework, with a focus on discourse and performative identity, I scrutinize three dominant discourses of smartness that are prevalent in the academic and popular press. These constructions frame smart girls as being either Losers, Have-It-All Girls, or Imposters. By conducting semi-structured group interviews with six self-identified smart girls, I explore the question of how smart girls perform their smart-girl identities in their current sociocultural context. After analyzing the data from the group interviews, I outline five themes that seem to be prevalent in the stories told by the smart girls in this thesis. Finally, I discuss how the performative identities of the smart girls in my thesis appear to be much more complex, multiple and rhizomatic than the discourses under review allow.
Abstract:
This study has two main objectives. First, the phlebotomy process at the St. Catharines Site of the Niagara Health System is investigated; the process starts when an order for a blood test is placed and ends when the specimen arrives at the lab. The performance measure is the flow time of the process, which reflects the concerns and interests of both the hospital and the patients. Three popular operational methodologies are applied to reduce the flow time and improve the process: DMAIC from Six Sigma, lean principles and simulation modeling. Potential suggestions are provided for the St. Catharines Site, which could result in an average reduction of seven minutes in the flow time. The second objective addresses the fact that these three methodologies have not previously been combined in a process improvement effort. A structured framework combining them is developed to benefit future studies of phlebotomy and other hospital processes.
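The simulation-modeling leg of such a study can be sketched as a single-server FIFO queue measuring flow time from order placement to completion. All parameters below (arrival rate, service times, the size of the improvement) are assumed illustrative numbers, not the St. Catharines data.

```python
import random

# Single-server FIFO model of a specimen-collection step: orders arrive at
# random, wait for the phlebotomist, are served, and we record flow time
# (time from order placement to completion) averaged over many orders.
def mean_flow_time(service_mean, n=20_000, arrival_mean=6.0, seed=7):
    rng = random.Random(seed)
    t = 0.0            # arrival clock
    server_free = 0.0  # when the server next becomes available
    total = 0.0
    for _ in range(n):
        t += rng.expovariate(1.0 / arrival_mean)   # order placed
        start = max(t, server_free)                # may have to queue
        finish = start + rng.expovariate(1.0 / service_mean)
        server_free = finish
        total += finish - t                        # this order's flow time
    return total / n

before = mean_flow_time(service_mean=5.0)   # current process (assumed)
after = mean_flow_time(service_mean=4.0)    # after a lean improvement (assumed)
print(before, after, before - after)
```

Because waiting time grows nonlinearly with server utilization, a modest cut in service time can yield a disproportionately large drop in average flow time, which is what makes simulation a useful companion to DMAIC and lean analysis.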