929 results for Source Code Analysis
Abstract:
We apply the Bogoliubov averaging method to the study of the vibrations of an elastic foundation forced by a non-ideal energy source. The model considered consists of a portal plane frame with quadratic nonlinearities and a 1:2 internal resonance, supporting a direct-current motor with limited power. The non-ideal excitation is in primary resonance, of order one-half, with the second-mode frequency. The results of the averaging method, plotted as time-evolution curves and phase diagrams, are compared with those obtained by numerical integration of the original differential equations. The presence of the saturation phenomenon is verified by analytical procedures.
Abstract:
Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey-register data: waves 1–5 of the Finnish subset of the European Community Household Panel (FI ECHP) survey were linked at person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data.
Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to cause bias in estimates from event history models. Low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters until the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers who develop methods to correct for non-sampling biases in event history data.
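The core idea of the IPCW correction evaluated in the simulation study can be sketched as a weighted Kaplan-Meier estimator: subjects are weighted by the inverse of their estimated probability of remaining uncensored, so that informative (dependent) censoring is compensated for. The sketch below is a minimal illustration, not the study's code; the function name `km_ipcw` and the toy data are hypothetical.

```python
# Minimal sketch of an IPCW-weighted Kaplan-Meier estimator.
# All names and data are illustrative, not taken from the study.
from collections import defaultdict

def km_ipcw(times, events, weights):
    """Weighted Kaplan-Meier survival estimate.

    times   - observed spell durations
    events  - 1 if the event (e.g. exit from unemployment) was observed,
              0 if the spell was censored (e.g. by attrition)
    weights - inverse-probability-of-censoring weights, one per subject
    """
    # Aggregate weighted events and censorings at each observed time
    d = defaultdict(float)   # weighted events at time t
    c = defaultdict(float)   # weighted censorings at time t
    for t, e, w in zip(times, events, weights):
        (d if e else c)[t] += w

    surv, s = {}, 1.0
    at_risk = sum(weights)   # weighted risk set
    for t in sorted(set(times)):
        if at_risk > 0 and d[t] > 0:
            s *= 1.0 - d[t] / at_risk
        surv[t] = s
        at_risk -= d[t] + c[t]
    return surv

# Toy data: with all weights equal to 1 this reduces to the ordinary
# (unweighted) Kaplan-Meier estimator.
times  = [2, 3, 3, 5, 8]
events = [1, 1, 0, 1, 1]
surv = km_ipcw(times, events, [1.0] * 5)
print(surv[2])  # 0.8: four of five subjects "survive" the first event time
```

In practice the weights would come from a model for the censoring (attrition) process, e.g. an inverse of the fitted probability of still being under observation at each wave.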
Abstract:
The aim of this study is to find out how game companies perceive the three traditional funding sources and how well their opinions and needs are reflected in the choices they make. To accomplish this, 20 game companies were questioned on multiple topics with the help of Tekes and Neogames. The results of this study show that game developers clearly differentiate the three major funding sources and that the public sector ends up being the most significant source of external funding. The study also points out that most game companies are indeed facing difficulties in acquiring funding as well as various other resources.
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
In Lamium album, sucrose and raffinose-family oligosaccharides are the major products of photosynthesis that are stored in leaves. Using gas analysis and 14CO2 feeding, we compared photosynthesis and the partitioning of recently fixed carbon in plants whose sink activity was lowered by excision of flowers and chilling of roots with plants whose sink activity was not modified. Reduction in sink activity led to a reduction in the maximum rate of photosynthesis, to retention of fixed carbon in source leaves and to the progressive accumulation of raffinose-family oligosaccharides. This ultimately affected the extractable activities of invertase and sucrose phosphate synthase. At the end of the light period, invertase activity was significantly higher in treated plants; by contrast, sucrose phosphate synthase activity was significantly lower. We propose that reducing sink activity in L. album is associated with a shift in metabolism away from starch and sucrose synthesis and towards sucrose catabolism, galactinol utilisation and the synthesis of raffinose-family oligosaccharides.
Abstract:
Six brachytic maize varieties were crossed in a diallel mating scheme. Both varieties and crosses were grown hydroponically in a greenhouse, in randomized complete blocks with three replications in two seasons. Four brachytic double-cross hybrids were used as checks. Twenty-eight days after planting, data were taken for eight traits: the weights of the total plant (TPW), top plant (TOW), total roots (TRW), seminal roots (SRW) and nodal roots (NRW), and the numbers of total roots (TRN), seminal roots (SRN) and nodal roots (NRN). Ten plants were measured in each plot and all analyses were carried out on plot means. In the diallel cross, the top plant contributed 57.6% of the total plant weight, the seminal roots 15.4% and the nodal roots 27.0%. Root number was distributed as 36.7% seminal roots and 63.3% nodal roots. Approximately the same ratios were observed in the checks. The average heterosis effects were nonsignificant for all traits; the other components of heterosis (variety and specific heterosis) were also not important sources of variation in young plants. The overall results suggest that nonadditive gene action is not an important source of variation for the plant and root system of young plants. The positive correlation coefficients for combinations of traits indicated that they are under the control of a polygenic system.
Abstract:
Concentration-response curves in isometric tension studies on isolated blood vessels are traditionally obtained by hand. Although parameters such as Imax, EC50 and pA2 may be readily calculated, this method provides no information on the temporal profile of the responses or on the actual nature of the reaction curves. Computerized data acquisition systems can be used to obtain averaged data that represent a new source of otherwise inaccessible information, since early and late responses may be observed separately and in detail.
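As an illustration of the parameters mentioned above, the sketch below estimates EC50 from a digitized concentration-response curve by log-linear interpolation at the half-maximal response. It is a simplified, hypothetical example (names, data and the interpolation shortcut are ours); real analyses typically fit the full Hill equation by nonlinear regression.

```python
# Hedged sketch: locating EC50 on a concentration-response curve.
# Data and function names are illustrative, not from the study.
import math

def hill(conc, emax, ec50, n):
    """Hill equation: response at concentration `conc`."""
    return emax * conc**n / (ec50**n + conc**n)

def estimate_ec50(concs, responses):
    """EC50 via log-linear interpolation at half-maximal response.

    Assumes `responses` increase monotonically with `concs`.
    """
    half = max(responses) / 2.0
    pairs = zip(zip(concs, responses), zip(concs[1:], responses[1:]))
    for (c0, r0), (c1, r1) in pairs:
        if r0 <= half <= r1:
            # Interpolate on the log-concentration axis between the
            # two points that bracket the half-maximal response.
            f = (half - r0) / (r1 - r0)
            return 10 ** (math.log10(c0) + f * (math.log10(c1) - math.log10(c0)))
    raise ValueError("half-maximal response not bracketed by the data")

# Synthetic curve generated with EC50 = 1e-6 M and Hill slope 1
concs = [10**e for e in range(-9, -2)]          # 1 nM .. 1 mM
resp = [hill(c, emax=1.0, ec50=1e-6, n=1) for c in concs]
est = estimate_ec50(concs, resp)
print(est)  # close to the true EC50 of 1e-6 M
```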
Abstract:
Innovative gas-cooled reactors, such as the pebble bed reactor (PBR) and the gas-cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas-cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. These results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. The Russian ASTRA criticality experiments were calculated: pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided.
A novel method was developed and implemented as a MATLAB code to calculate the porosities of the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between discrete-based reactor physics and continuum-based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air-cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single GFR fuel rod. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared with the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to this specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely underpredicted by the realisable k-epsilon turbulence model used. An additional calculation with a v2-f turbulence model showed a significant improvement in the heat transfer results, most likely due to the better performance of that model in separated flow problems. Further investigations are suggested before CFD is used to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries.
It is suggested that the viewpoints of numerical modelling be included in the planning of experiments, to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, the multi-physical aspects of experiments should also be considered and documented in reasonable detail.
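The cell-porosity calculation described above (implemented in the thesis as a MATLAB code) can be illustrated with a simple Monte Carlo sketch: sample random points in a mesh cell and count the fraction falling inside any pebble. Geometry, names and sample counts here are hypothetical, not from the thesis, which may use an analytical cell-sphere intersection instead.

```python
# Hedged sketch: Monte Carlo estimate of the porosity (void fraction)
# of one rectangular CFD mesh cell overlapping a sphere packing.
# All geometry and parameters are illustrative.
import random

def cell_porosity(cell_min, cell_max, spheres, samples=100_000, seed=1):
    """Fraction of the cell volume NOT occupied by any sphere.

    cell_min, cell_max - opposite corners (x, y, z) of the cell
    spheres            - list of ((cx, cy, cz), radius) pebbles
    """
    rng = random.Random(seed)
    solid = 0
    for _ in range(samples):
        # Uniform random point inside the cell
        p = [rng.uniform(lo, hi) for lo, hi in zip(cell_min, cell_max)]
        # Count the point as solid if it lies inside any pebble
        if any(sum((a - b) ** 2 for a, b in zip(p, c)) <= r * r
               for c, r in spheres):
            solid += 1
    return 1.0 - solid / samples

# Sanity check: a unit cube with one centred pebble of radius 0.5 has
# a solid fraction of pi/6 ~ 0.524, hence a porosity of ~0.476.
por = cell_porosity((0, 0, 0), (1, 1, 1), [((0.5, 0.5, 0.5), 0.5)])
print(round(por, 3))
```

Looping this estimator over all cells of a mesh yields the porosity field needed to map DEM packings onto a continuum thermal-hydraulics model.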
Abstract:
This thesis concentrates on the validation of the generic thermal-hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal-hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the major difficulty there lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 speciality, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.
Abstract:
Almost identical polyglutamine-containing proteins with unknown structures have been found in the human, mouse and rat genomes (GenBank AJ277365, AF525300, AY879229). We infer that an identical RING (really interesting new gene) finger domain is located in each C-terminal segment. A three-dimensional (3-D) model was generated by remote homology modeling and its functional implications are discussed. The model consists of 65 residues, from position 707 to 772 of the human protein, whose total length is 796 residues. The 3-D model predicts a ubiquitin-protein ligase (E3) with a binding site for a ubiquitin-conjugating enzyme (E2). Both enzymes are part of the ubiquitin pathway, which labels unwanted proteins for subsequent enzymatic degradation. Molecular contact specificities are suggested both for substrate recognition and for the residues at the possible E2-binding surface. The predicted structure of a ubiquitin-protein ligase (E3, enzyme class number 6.3.2.19, CATH code 3.30.40.10.4) may help to explain the process of ubiquitination. The 3-D model supports the idea of a C3HC4 RING finger with a partially new pattern. The putative E2-binding site is formed by a shallow hydrophobic groove on the surface adjacent to the helix and one zinc finger (L722, C739, P740, P741, R744). Solvent-exposed hydrophobic amino acids lie around both zinc fingers (I717, L722, F738, or P765, L766, V767, V733, P734). The 3-D structure was deposited in the theoretical model repository of the protein data bank (2B9G, RCSB Protein Data Bank, NJ).
Abstract:
Acylcarnitine profiling by electrospray ionization tandem mass spectrometry (ESI-MS/MS) is a potent tool for the diagnosis and screening of fatty acid oxidation and organic acid disorders. Few studies have analyzed free carnitine and acylcarnitines in dried blood spots (DBS) of umbilical cord blood (CB) and the postnatal changes in the concentrations of these analytes. We investigated these metabolites in healthy, exclusively breastfed neonates and examined possible effects of birth weight and gestational age. DBS of CB were collected from 162 adequate-for-gestational-age neonates. Paired DBS of heel-prick blood were collected 4-8 days after birth from 106 of these neonates, the majority exclusively breastfed. Methanol extracts of DBS with deuterium-labeled internal standards were derivatized before analysis by ESI-MS/MS. Most of the analytes were measured using a full-scan method. The levels of the major long-chain acylcarnitines, palmitoylcarnitine, stearoylcarnitine and oleoylcarnitine, increased by 27, 12 and 109%, respectively, in the first week of life. Free carnitine and acetylcarnitine showed modest increases of 8 and 11%, respectively. Propionylcarnitine behaved differently, decreasing by 9% during the period. The correlations between birth weight or gestational age and the concentrations of the analytes in DBS were weak (r ≤ 0.20) or nonsignificant. Adaptation to breast milk as the sole source of nutrients can explain the increase in these metabolites during the early neonatal period. Acylcarnitine profiling in CB should have a role in the early detection of metabolic disorders in high-risk neonates.
Abstract:
Cookies were prepared with the replacement of 20% of the wheat flour by oat hulls treated chemically (alkaline hydrogen peroxide) or physically (extrusion), with the objective of investigating the possible use of this modified material. Cookies elaborated with untreated hulls were used as the control. The cookies were evaluated for their physical (spread ratio, specific volume and color) and sensory characteristics, and no difference was detected (p < 0.05) among the cookies in relation to the physical properties. A triangle test, used to verify differences (p < 0.05) between cookies with treated and untreated hulls, confirmed the efficiency of the treatment at the sensory level. The acceptance of the cookies with treated fiber was evaluated by potential consumers of the product, reaching 91% acceptance. The cookies contained 10.6 g of dietary fiber per 100 g of product.
Abstract:
A small-break loss-of-coolant accident (SBLOCA) is one of the problems investigated in nuclear power plant (NPP) operation. Such an accident can be analyzed using an experimental facility and the TRACE thermal-hydraulic system code. A series of SBLOCA experiments was carried out on the Parallel Channel Test Loop (PACTEL) facility, operated jointly by the Technical Research Centre of Finland (VTT Energy) and Lappeenranta University of Technology (LUT), in order to investigate two-phase phenomena related to a VVER-type reactor. The experiments and a TRACE model of the PACTEL facility are described in the paper, together with a description of the TRACE code and its main field equations. In this work, calculations of the SBLOCA series are performed; the thesis then discusses the validation of TRACE and concludes with an assessment of the usefulness and accuracy of the code in calculating small breaks.
Abstract:
The objectives of this study were to develop an isotope analysis method to quantify the carbon of the C3 photosynthetic cycle in pulpy whole apple juice and to establish legal limits based on Brazilian legislation in order to identify beverages that do not conform to the requirements of the Ministry of Agriculture, Livestock and Food Supply (MAPA). This beverage was produced in the laboratory according to Brazilian law; pulpy juices adulterated by the addition of sugarcane sugar were also produced. The isotope analyses measured the relative isotope enrichment of the juices, their pulpy fractions (internal standard) and the purified sugar. From those results, the quantity of C3 source was estimated by means of the isotope dilution equation. To determine the existence of adulteration in commercial juices, it was necessary to establish a legal limit in accordance with Brazilian law. Three brands of commercial juice were analyzed, and one was classified as adulterated. The legal limit made it possible to clearly identify the juice that was not in conformity with Brazilian law. The methodology developed proved efficient for quantifying the carbon of C3 origin in commercial pulpy apple juices.
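The isotope dilution estimate mentioned above follows the standard two-source carbon isotope mixing form, sketched here under the assumption of a simple linear mixing model; the study's exact formulation (using the pulpy fraction as the C3 internal standard) may differ in detail:

```latex
% Hedged sketch: two-source delta-13C mixing (isotope dilution) model.
% f_{C3} is the fraction of sugar carbon from the C3 source (apple);
% the C4 end-member corresponds to the sugarcane adulterant.
\[
  \delta^{13}\mathrm{C}_{\text{sugar}}
    = f_{C3}\,\delta^{13}\mathrm{C}_{C3}
      + \bigl(1 - f_{C3}\bigr)\,\delta^{13}\mathrm{C}_{C4}
  \quad\Longrightarrow\quad
  f_{C3}
    = \frac{\delta^{13}\mathrm{C}_{\text{sugar}} - \delta^{13}\mathrm{C}_{C4}}
           {\delta^{13}\mathrm{C}_{C3} - \delta^{13}\mathrm{C}_{C4}}
\]
```

A measured δ13C of the juice sugar lying closer to the C4 end-member than the legal limit allows would then flag the sample as adulterated.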