970 results for Univariate Analysis Box-Jenkins methodology
Abstract:
Nowadays, the upwind three-bladed horizontal-axis wind turbine is the leading player on the market, having proved to be the best industrial compromise among the range of different turbine constructions. Current wind-industry innovation is concentrated in the development of individual turbine components. The blade accounts for 20-25% of the overall turbine budget, so its optimal operation under particular local economic and wind conditions is worth investigating. The blade geometry, namely the chord, twist, and airfoil-type distributions along the span, determines the output measures of blade performance; an optimal blade geometry can therefore improve overall turbine performance. The objectives of the dissertation are the development of a methodology and a specific tool for investigating possible adjustments to existing wind blade geometries. The novelty of the methodology presented in the thesis is its multiobjective perspective on wind blade geometry optimization, taking into account simultaneously the local wind conditions and the issue of aerodynamic noise emissions; this optimization-objective approach has not previously been investigated for implementation in wind blade design. The possibilities of using different theories for the analysis and search procedures are investigated, and sufficient arguments are derived for the use of the proposed theories. The tool is applied to the test optimization of a particular wind turbine blade. The sensitivity analysis shows the dependence of the outputs on the provided inputs, as well as their relative and absolute divergences and instabilities. The pros and cons of the proposed technique emerge from the practical implementation, which is documented in the results, analysis, and conclusion sections.
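[Editor's note: the abstract above does not show the optimization tool itself. As a hedged illustration of the multiobjective step it describes, a search over candidate blade geometries ultimately reduces to keeping the Pareto-optimal trade-offs between, say, energy yield and noise emission. A minimal Python sketch of that selection step, with hypothetical candidate tuples:]

```python
# Minimal sketch of the Pareto-selection step in a multiobjective
# blade-geometry search: maximize energy yield, minimize noise emission.
# The (energy_kwh, noise_db) tuples are hypothetical placeholders.

def dominates(a, b):
    """True if design a is at least as good as b on both objectives
    and strictly better on at least one (maximize energy, minimize noise)."""
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

def pareto_front(designs):
    """Return the non-dominated subset of (energy, noise) tuples."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

candidates = [(4200.0, 98.5), (4350.0, 101.2), (4100.0, 97.0), (4350.0, 99.9)]
print(pareto_front(candidates))  # the trade-off set handed to the designer
```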
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke the CRUD interface (create, retrieve, update, and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, facilitating scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis, and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and postconditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the Web Ontology Language, OWL 2, and can thus be part of the Semantic Web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
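[Editor's note: the thesis's own translation tool is not shown here. As a hedged sketch of what the consistency-analysis step looks like in practice, assuming the third-party owlready2 package and a hypothetical ontology file emitted by such a tool, an OWL 2 reasoner can be invoked to surface unsatisfiable concepts:]

```python
# Hedged sketch of OWL 2 consistency analysis, assuming the third-party
# owlready2 package (which drives the HermiT reasoner; a Java runtime is
# needed) and a hypothetical ontology file produced by the translation tool.
from owlready2 import get_ontology, sync_reasoner, default_world

onto = get_ontology("file://./rest_interface.owl").load()  # hypothetical path

with onto:
    sync_reasoner()  # classify the ontology with the reasoner

# Classes inferred to be equivalent to owl:Nothing are unsatisfiable;
# any implementation of such an interface concept is bound to fail.
for cls in default_world.inconsistent_classes():
    print("unsatisfiable concept:", cls)
```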
The third contribution of this thesis is the verification and validation of REST web services. We have used model-checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and verified for basic characteristics such as deadlock freedom, liveness, reachability, and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is the implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and postconditions. The preconditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
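[Editor's note: the thesis's generated skeletons and their target language are not reproduced in the abstract. As a hedged Python sketch of the idea, with hypothetical resource and state names, a generated method stub can enforce its pre- and postconditions around a developer-filled body:]

```python
# Hedged sketch of a REST method skeleton with contract checks, in the
# spirit of the partial code generation described above. The resource,
# states, and conditions are hypothetical, not the thesis's actual output.

class PreconditionError(Exception): pass
class PostconditionError(Exception): pass

class BookingResource:
    def __init__(self):
        self.state = "created"   # state-machine state of the stateful resource

    def put_confirm(self, payment_ok: bool):
        # Precondition (from the behavioral interface): the booking must be
        # in state "created" and payment must have been authorized.
        if not (self.state == "created" and payment_ok):
            raise PreconditionError("PUT /booking/confirm not allowed here")

        # --- developer fills in the actual functionality here ---
        self.state = "confirmed"
        # ---------------------------------------------------------

        # Postcondition: the resource must now be in state "confirmed".
        if self.state != "confirmed":
            raise PostconditionError("service did not reach state 'confirmed'")

booking = BookingResource()
booking.put_confirm(payment_ok=True)  # a second call would violate the precondition
```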
Abstract:
The goal of the thesis is to analyze the strengths and weaknesses of the solar PV business model and to point out the key factors that affect the model's efficiency; the results are expected to help in creating a new business strategy. Case study research methodology is chosen as the theoretical background to structure the design of the thesis, indicating how to choose the right research method and how to conduct a case study. The business model canvas is adopted as the tool for analyzing the case studies of SolarCity and Sungevity, and the results are presented through a comparison between the two case studies. Solar services and products, customer acquisition cost, intellectual resources, and powerful sales channels are identified as the major factors for the TPO (third-party ownership) model.
Abstract:
Two different pathogenetic mechanisms are proposed for colorectal cancers. The first, the so-called "classic pathway", is the most common and depends on multiple additive mutational events (germline and/or somatic) in tumor suppressor genes and oncogenes, frequently involving chromosomal deletions in key genomic regions; methodologically, this pathway is recognizable by the phenomenon of loss of heterozygosity. The "mutator pathway", on the other hand, depends on early mutational loss of the mismatch repair system (germline and/or somatic), leading to accelerated accumulation of gene mutations in critical target genes and progression to malignancy; methodologically, this second pathway is recognizable by the phenomenon of microsatellite instability. The distinction between these pathways seems to be more than academic, since there is evidence that tumors emerging from the mutator pathway have a better prognosis. We report here a very simple methodology, based on a set of tri-, tetra-, and pentanucleotide repeat microsatellites, that allows the simultaneous study of microsatellite instability and loss of heterozygosity and could allocate 70% of colorectal tumors to the classic or the mutator pathway. The ease of execution of the methodology makes it suitable for routine clinical typing.
Abstract:
Laser additive manufacturing (LAM), also known as 3D printing, is a powder bed fusion (PBF) type of additive manufacturing (AM) technology used to manufacture metal parts layer by layer with the assistance of a laser beam. The technology has developed from building just prototype parts to building functional parts thanks to its design flexibility and to the possibility of manufacturing components tailored and optimised for performance and for the strength-to-weight ratio of the final parts. Studying energy and raw material consumption in LAM is essential, as it might facilitate the adoption and use of the technique in manufacturing industries. The objective of this thesis was to determine the environmental and economic impact of LAM and to conduct a life cycle inventory (LCI) of CNC machining and LAM in terms of energy and raw material consumption during the production phase. The literature overview in this thesis covers sustainability issues in manufacturing industries, with a focus on environmental and economic aspects; life cycle assessment and its applicability in the manufacturing industry were also studied. The UPLCI-CO2PE! initiative was identified as the most widely applied existing methodology for conducting LCI analyses of discrete manufacturing processes like LAM. Much of the reviewed literature focused on PBF of polymeric materials, and only a few studies considered metallic materials. The studies that did include metallic materials only measured the input and output energy or materials of the process and compared different AM systems, without comparing them to any competing process; nor did any include the effect of process variation when building metallic parts with LAM. In this thesis, experimental tests were carried out to make dissimilar samples with CNC machining and LAM. The test samples were designed to include part complexity and weight reduction. A PUMA 2500Y lathe was used for the CNC machining, whereas a modified research machine representing the EOSINT M series was used for the LAM. The raw materials used for the test pieces were stainless steel 316L bar (CNC-machined parts) and stainless steel 316L powder (LAM-built parts). An analysis of the power, time, and energy consumed in each manufacturing process during the production phase showed that LAM uses more energy than CNC machining; the high energy consumption resulted from the duration of production. The energy consumption profile of CNC machining fluctuated between high and low power ranges, whereas LAM energy usage within each specific mode (standby, heating, process, sawing) remained relatively constant throughout production. CNC machining was limited in terms of manufacturing freedom, as it was not possible to manufacture all the designed samples by machining, and the sample that could be machined required a large amount of material to be removed as waste. The planning phase in LAM was shorter than in CNC machining, as the latter required many preparation steps. The specific energy consumption (SEC) of LAM was estimated from the practical results and an assumed platform utilisation; the estimate showed that the SEC could be reduced by placing more parts in one build than in the empirical tests of this thesis (six parts).
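[Editor's note: the SEC figure described above follows from integrating the measured power profile over the build and dividing by the mass of material consolidated. A minimal sketch of that arithmetic, with made-up power samples rather than the thesis's measurements:]

```python
# Minimal sketch of a specific-energy-consumption (SEC) estimate for a LAM
# build: integrate the power profile over time and divide by the part mass.
# All numbers below are illustrative, not the thesis's data.
import numpy as np

t = np.array([0, 600, 1200, 1800, 2400])          # sample times, s
p = np.array([1.2, 3.5, 3.4, 3.6, 1.1]) * 1000    # power, W (standby/heating/process/sawing)

# Trapezoidal integration of power over time gives the energy in joules.
energy_j = float(np.sum((p[1:] + p[:-1]) / 2 * np.diff(t)))
part_mass_kg = 0.45                                # mass of the built parts, kg

sec = energy_j / part_mass_kg / 3.6e6              # convert J/kg to kWh/kg
print(f"SEC ≈ {sec:.2f} kWh/kg")

# Platform utilisation: energy per part falls as more parts share one build,
# since heating and standby overheads are amortised across the batch.
```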
Abstract:
Findings on the effects of weather on health, especially the effects of ambient temperature on overall morbidity, remain inconsistent. We conducted a time series study to examine the acute effects of meteorological factors (mainly air temperature) on daily hospital outpatient admissions for cardiovascular disease (CVD) in Zunyi City, China, from January 1, 2007 to November 30, 2009. We used a generalized additive model with penalized splines to analyze hospital outpatient admissions, climatic parameters, and covariate data. The results show that, in Zunyi, air temperature was associated with hospital outpatient admissions for CVD: when the air temperature was below 10°C, admissions for CVD increased 1.07-fold with each 1°C increase, and when the air temperature was above 10°C, each 1°C increase was associated with a 0.99-fold change in admissions relative to the previous year. Our analyses provide statistically significant evidence that meteorological factors have adverse effects on the health of the general population in China. Further research with consistent methodology is needed to clarify the magnitude of these effects and to show which populations and individuals are vulnerable.
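[Editor's note: as a hedged sketch of the model class named above — a generalized additive model with penalized splines — assuming the third-party pygam package and entirely synthetic data (the study's actual covariates and software are not given in the abstract):]

```python
# Hedged sketch of a GAM of daily CVD admission counts on air temperature,
# assuming the third-party pygam package; all data below are synthetic.
import numpy as np
from pygam import PoissonGAM, s

rng = np.random.default_rng(0)
days = 1000
temp = 15 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
trend = np.arange(days, dtype=float)            # smooth term for the long-term trend
lam = np.exp(3.0 + 0.02 * np.abs(temp - 10))    # admissions rise away from ~10 °C
y = rng.poisson(lam)                            # synthetic daily admission counts

X = np.column_stack([temp, trend])
gam = PoissonGAM(s(0) + s(1)).fit(X, y)         # penalized splines on temp and trend
gam.summary()                                   # smoothing terms and significance
```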
Abstract:
In this work, the separation of multicomponent mixtures in counter-current columns with supercritical carbon dioxide was investigated using a process design methodology. First, the separation task is defined; then phase equilibria experiments are carried out, and the data obtained are correlated with thermodynamic models or empirical functions. Mutual solubilities, Ki values, and separation factors αij are determined. Based on these data, possible operating conditions for further extraction experiments can be determined. Separation analysis using graphical methods is performed to optimize the process parameters. Hydrodynamic experiments are carried out to determine the flow capacity diagram. Laboratory-scale extraction experiments are planned and carried out in order to determine HETP values, to validate the simulation results, and to provide new materials for the additional phase equilibria experiments needed to determine the dependence of the separation factors on concentration. Numerical simulation of the separation process and auxiliary systems is carried out to optimize the number of stages, solvent-to-feed ratio, product purity, yield, and energy consumption. Scale-up and cost analysis close the process design. The separation of palmitic acid and (oleic + linoleic) acids from PFAD (Palm Fatty Acid Distillate) was used as a case study.
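[Editor's note: the Ki values and separation factors mentioned above follow directly from the correlated phase-equilibrium compositions, using the standard definitions Ki = yi/xi and αij = Ki/Kj. A short worked sketch with illustrative numbers, not the study's data:]

```python
# Distribution coefficients K_i = y_i / x_i (solute mass fraction in the
# CO2-rich extract phase over that in the raffinate phase) and the
# separation factor alpha_ij = K_i / K_j. Numbers are illustrative only.

def k_value(y_extract: float, x_raffinate: float) -> float:
    return y_extract / x_raffinate

K_palmitic = k_value(0.012, 0.10)   # hypothetical equilibrium compositions
K_oleic    = k_value(0.006, 0.15)

alpha = K_palmitic / K_oleic
print(f"K_palmitic = {K_palmitic:.3f}, K_oleic = {K_oleic:.3f}, alpha = {alpha:.2f}")
# alpha > 1 indicates palmitic acid is preferentially extracted by the solvent.
```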
Abstract:
Graphite furnace atomic absorption spectrometry (GF AAS) was the technique chosen by the inorganic contamination laboratory (INCQ/FIOCRUZ) to be validated and applied in routine analysis for arsenic detection and quantification. Selectivity, linearity, sensitivity, and detection and quantification limits, as well as accuracy and precision, were studied and optimized under Stabilized Temperature Platform Furnace (STPF) conditions. The limit of detection obtained was 0.13 µg.L-1 and the limit of quantification was 1.04 µg.L-1, with an average precision for total arsenic of less than 15% and an accuracy of 96%. To quantify the chemical species As(III) and As(V), an ion-exchange resin (Dowex 1X8, Cl- form) was used, and the physical-chemical parameters were optimized, resulting in recoveries of 98% for As(III) and 90% for As(V). The method was applied to groundwater, mineral water, and hemodialysis purified water samples. All results obtained were lower than the maximum limits established by the Brazilian regulations in effect: 50, 10, and 5 µg.L-1 for total As, As(III), and As(V), respectively. All results were statistically evaluated.
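[Editor's note: the abstract does not state how its detection and quantification limits were derived; a common IUPAC-style estimate uses the blank standard deviation and the calibration slope (LOD = 3s/m, LOQ = 10s/m). A minimal sketch of that calculation with illustrative numbers, not the laboratory's data:]

```python
# Common IUPAC-style estimates of detection and quantification limits from
# the blank standard deviation (s_blank) and the calibration slope (m).
# All numbers below are illustrative, not the laboratory's actual data.
import numpy as np

blanks = np.array([0.0041, 0.0038, 0.0044, 0.0040, 0.0039, 0.0043])  # blank absorbances
m = 0.0105                   # calibration slope, absorbance per (µg/L)

s_blank = blanks.std(ddof=1)
lod = 3 * s_blank / m        # limit of detection, µg/L
loq = 10 * s_blank / m       # limit of quantification, µg/L
print(f"LOD ≈ {lod:.2f} µg/L, LOQ ≈ {loq:.2f} µg/L")
```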
Abstract:
The objectives of this study were to develop an isotope analysis method to quantify the carbon of the C3 photosynthetic cycle in pulpy whole apple juice and to establish legal limits based on Brazilian legislation in order to identify beverages that do not conform to the standards of the Ministry of Agriculture, Livestock and Food Supply (MAPA). This beverage was produced in the laboratory according to Brazilian law; pulpy juices adulterated by the addition of sugarcane sugar were also produced. The isotope analyses measured the relative isotope enrichment of the juices, their pulpy fractions (internal standard), and the purified sugar. From those results, the quantity of the C3 source was estimated by means of the isotope dilution equation. To detect adulteration in commercial juices, it was necessary to establish a legal limit according to Brazilian law. Three brands of commercial juice were analyzed, and one was classified as adulterated. The legal limit made it possible to clearly identify the juice that was not in conformity with Brazilian law. The methodology developed proved efficient for quantifying the carbon of C3 origin in commercial pulpy apple juices.
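[Editor's note: the isotope dilution step described above rests on a two-source mixing balance between the C3 (fruit) and C4 (sugarcane) carbon pools. A hedged sketch of that calculation with illustrative δ13C values, not the study's measurements:]

```python
# Two-source isotope mixing balance used to estimate the C3-derived carbon
# fraction of a juice's sugar: the sample's delta13C lies between the pure
# C3 and C4 end members. All delta values below are illustrative.

def c3_fraction(delta_sample: float, delta_c3: float, delta_c4: float) -> float:
    """Fraction of carbon from the C3 source (0..1) by linear mixing."""
    return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

delta_c3 = -27.0      # typical C3 plant signature, per mil (illustrative)
delta_c4 = -12.0      # typical C4 (sugarcane) signature, per mil (illustrative)
delta_sugar = -18.0   # measured purified sugar of a suspect juice

pct_c3 = 100 * c3_fraction(delta_sugar, delta_c3, delta_c4)
print(f"C3-derived carbon: {pct_c3:.1f}%")  # a low C3 share points to added cane sugar
```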
Abstract:
This study sought to evaluate the acceptance of "dulce de leche" made with coffee and whey. The results were analyzed through response surface methodology, ANOVA, mean tests, histograms, and a preference map correlating the global impression data with the results of the physical, physicochemical, and sensory analyses. Response surface methodology by itself was not enough to find the best formulation. The ANOVA, mean tests, and preference map showed that the consumers' favorite "dulce de leche" formulations were formulation 1 (10% whey and 1% coffee) and formulation 2 (30% whey and 1% coffee), followed by formulation 9 (20% whey and 1.25% coffee). The acceptance of samples 1 and 2 was driven by their higher acceptability in terms of flavor and by their higher pH, L*, and b* values. Samples 1 and 2 also had higher purchase approval scores and higher percentages of responses in the 'ideal' category for sweetness and coffee flavor. Consumers preferred the samples with low concentrations of coffee regardless of the concentration of whey, thus enabling the use of whey and coffee in the manufacture of dulce de leche and the creation of a new product.
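[Editor's note: the comparison of mean acceptance across formulations described above is a standard one-way ANOVA. A minimal sketch with fabricated hedonic scores, not the study's data:]

```python
# One-way ANOVA comparing mean acceptance scores across formulations,
# as in the abstract's mean tests. All scores below are fabricated.
from scipy import stats

form1 = [8, 7, 9, 8, 7, 8]   # 10% whey, 1% coffee (hypothetical hedonic scores)
form2 = [8, 8, 7, 9, 8, 7]   # 30% whey, 1% coffee
form9 = [6, 7, 6, 5, 7, 6]   # 20% whey, 1.25% coffee

f_stat, p_value = stats.f_oneway(form1, form2, form9)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # a small p: the means differ somewhere
```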
Abstract:
This study aims to optimize an alternative method for the extraction of carrageenan without prior alkaline treatment or ethanol precipitation, using response surface methodology (RSM). To introduce an innovation in the isolation step, spray drying was used, reducing the time needed to obtain dry carrageenan powder. The effects of extraction time and temperature on yield, gel strength, and viscosity were evaluated. Furthermore, the extracted material was subjected to structural analysis by infrared spectroscopy and nuclear magnetic resonance spectroscopy (¹H-NMR), and to chemical composition analysis. The results showed that the generated regression models adequately explained the data variation. Carrageenan yield and gel viscosity were influenced only by the extraction temperature, whereas gel strength was influenced by both extraction time and extraction temperature. The optimal extraction conditions were 74 ºC and 4 hours. Under these conditions, the carrageenan extract properties predicted by the polynomial model were 31.17%, 158.27 g.cm-2, and 29.5 cP for yield, gel strength, and viscosity, respectively, while the experimental values were 35.8 ± 4.68%, 112.50 ± 4.96 g.cm-2, and 16.01 ± 1.03 cP, respectively. The chemical composition, nuclear magnetic resonance spectroscopy, and infrared spectroscopy analyses showed that the crude carrageenan extracted is composed mainly of κ-carrageenan.
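[Editor's note: the regression models behind an RSM optimization of this kind are second-order polynomials in the factors, here extraction time and temperature. A hedged sketch of fitting such a surface by least squares, with synthetic data rather than the study's:]

```python
# Sketch of a second-order response-surface fit: yield as a function of
# extraction time t (h) and temperature T (degC), on a 3x3 factorial design.
# All data points are synthetic, not the study's measurements.
import numpy as np

t = np.repeat([2.0, 3.0, 4.0], 3)                    # extraction time, h
T = np.tile([60.0, 70.0, 80.0], 3)                   # temperature, degC
y = np.array([24, 26, 28, 27, 30, 31, 27, 31, 33.])  # yield, %

# Design matrix for y = b0 + b1*t + b2*T + b3*t^2 + b4*T^2 + b5*t*T
X = np.column_stack([np.ones_like(t), t, T, t**2, T**2, t * T])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(t_new, T_new):
    return coef @ np.array([1.0, t_new, T_new, t_new**2, T_new**2, t_new * T_new])

print(f"predicted yield at 4 h, 74 degC: {predict(4.0, 74.0):.1f}%")
```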
Abstract:
The aims of this study were to use the isotope analysis method to quantify the carbon of the C3 photosynthetic cycle in commercial apple nectars and to determine the legal limit for identifying beverages that do not conform to the safety standards established by the Brazilian Ministry of Agriculture, Livestock and Food Supply. These beverages (apple nectars) were produced in the laboratory according to Brazilian legislation; adulterated nectars were also produced, with an amount of pulp juice below the permitted threshold limit value. The δ13C values of the apple nectars and their fractions (pulp and purified sugar) were measured to quantify the percentage of the C3 source. In order to detect adulteration, the values found were compared to the limit values established by Brazilian law. All commercial apple nectars analyzed were within the legal limits, which made it possible to identify the nectars as being in conformity with Brazilian law. The isotopic methodology developed proved efficient for quantifying the carbon of C3 origin in commercial apple nectars.
Abstract:
The objective of this study was to analyze the retinol equivalent and iron content in different food composition tables and nutritional evaluation software programs. A literature search was conducted to identify tables and software available in Brazil that contain information about retinol equivalent and iron content and are currently used by nutritionists. Ten tables and five software programs were selected for this study. The methodology used to present the retinol equivalent and iron content was evaluated, and no consistent pattern for deriving this content was found in the tables and software programs analyzed. Only one of the tables had enough information for the calculation of retinol equivalents; this table is recommended for use throughout Latin America. As for the iron content, three of the analyzed tables stand out and should therefore be used; two of them are based on national foods, and the other is recommended for use in all Latin American countries. None of the software programs evaluated uses the conversion factors suggested by IVACG to assess the vitamin A content of foods. Special attention should be given to the iron content provided by the software programs, since they rely on international source tables and fortified foods.
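[Editor's note: for context, a retinol-equivalent calculation combines preformed retinol with provitamin A carotenoids weighted by conversion factors. The sketch below uses the classical FAO/WHO-style factors (1 RE = 1 µg retinol = 6 µg β-carotene = 12 µg other provitamin A carotenoids); these are explicitly not asserted to be the IVACG factors the abstract refers to:]

```python
# Sketch of a retinol-equivalent (RE) calculation using classical
# FAO/WHO-style factors (1 RE = 1 ug retinol = 6 ug beta-carotene
# = 12 ug other provitamin A carotenoids). NOT asserted to be the
# IVACG factors cited in the abstract; values below are illustrative.

def retinol_equivalents(retinol_ug: float, beta_carotene_ug: float,
                        other_carotenoids_ug: float) -> float:
    return retinol_ug + beta_carotene_ug / 6 + other_carotenoids_ug / 12

print(retinol_equivalents(50.0, 120.0, 60.0))  # 50 + 20 + 5 = 75.0 ug RE per 100 g
```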
Abstract:
Assessing fish consumption is complex and involves several factors; however, the use of questionnaires in surveys and of the Internet as a data collection tool have been considered promising approaches. Therefore, the objective of this research was to design a data collection technique to assess fish consumption using a questionnaire made available on a dedicated home page on the Internet. A bibliographical review was carried out to identify the features of the instrument; pre-tests were then conducted with previous instruments, followed by the focus group technique. Specialists then performed an analysis and conducted an online pre-test. Multivariate data analysis was applied using the SmartPLS software. A total of 1,966 participants belonging to the University of São Paulo (USP) community took part in the test, and after the exclusion of some variables, statistically significant results were obtained. The final constructs comprised consumption, quality, and general characteristics. The instrument consisted of behavioral statements on a 5-point Likert scale and multiple-choice questions. Cronbach's alpha reliability coefficient was 0.66 for general characteristics, 0.98 for quality, and 0.91 for consumption, indicating good reliability of the instrument. In conclusion, the results showed that Internet-based assessment is efficient. The instrument allowed us to better understand the process of buying and consuming fish in the country, and it can be used as a basis for further research.
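[Editor's note: the reliability coefficients quoted above are Cronbach's alpha, computable directly from the respondent-by-item score matrix. A minimal sketch with synthetic Likert responses, not the survey's data:]

```python
# Cronbach's alpha from a respondents-by-items score matrix:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
# The 5-point Likert responses below are synthetic.
import numpy as np

scores = np.array([[4, 5, 4, 3],      # each row: one respondent's item scores
                   [3, 4, 4, 3],
                   [5, 5, 4, 4],
                   [2, 3, 3, 2],
                   [4, 4, 5, 4]])

k = scores.shape[1]                            # number of items in the construct
item_vars = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
total_var = scores.sum(axis=1).var(ddof=1)     # variance of respondents' totals
alpha = k / (k - 1) * (1 - item_vars / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```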
Abstract:
The cellular structure of healthy food products with added dietary fiber and low calorie content is an important factor in the assessment of quality, and it can be quantified by image analysis of visual texture. This study compares image analysis techniques (binarization using Otsu's method and the default ImageJ algorithm, a variation of the iterative intermeans method) for quantifying differences in the crumb structure of breads made with different percentages of whole-wheat flour and fat replacer, and discusses the behavior of the parameters number of cells, mean cell area, cell density, and circularity using response surface methodology. Comparative analysis of the results obtained with the Otsu and default ImageJ algorithms showed a significant difference between the studied parameters. The Otsu method represented the crumb structure of the analyzed breads more reliably than the default ImageJ algorithm and is thus the most suitable in terms of structural representation of the crumb texture.
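[Editor's note: as a hedged sketch of the comparison described above, using scikit-image, whose threshold_isodata implements the iterative intermeans method that ImageJ's default thresholding is a variation of. The image is synthetic; the study used scanned bread-slice images and its own pipeline:]

```python
# Sketch comparing Otsu and iterative-intermeans (isodata) binarization of a
# crumb image and extracting cell parameters, using scikit-image.
# The grayscale image here is a synthetic stand-in.
import numpy as np
from skimage.filters import threshold_otsu, threshold_isodata
from skimage.measure import label, regionprops

rng = np.random.default_rng(1)
img = rng.normal(120, 30, (256, 256)).clip(0, 255)   # stand-in crumb image

for name, thresh_fn in [("Otsu", threshold_otsu), ("isodata", threshold_isodata)]:
    t = thresh_fn(img)
    cells = img < t                          # dark regions taken as crumb cells
    regions = regionprops(label(cells))
    areas = [r.area for r in regions]
    circ = [4 * np.pi * r.area / r.perimeter**2 for r in regions if r.perimeter > 0]
    print(f"{name}: threshold={t:.1f}, cells={len(regions)}, "
          f"mean area={np.mean(areas):.1f}px, mean circularity={np.mean(circ):.2f}")
```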