991 results for code source
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
In Lamium album, sucrose and raffinose-family oligosaccharides are the major products of photosynthesis that are stored in leaves. Using gas analysis and ¹⁴CO₂ feeding, we compared photosynthesis and the partitioning of recently fixed carbon in plants where sink activity was lowered by excision of flowers and chilling of roots with plants where sink activity was not modified. Reduction in sink activity led to a reduction in the maximum rate of photosynthesis, to retention of fixed carbon in source leaves, and to the progressive accumulation of raffinose-family oligosaccharides. This ultimately affected the extractable activities of invertase and sucrose phosphate synthase. At the end of the light period, invertase activity was significantly higher in treated plants. By contrast, sucrose phosphate synthase activity was significantly lower in treated plants. We propose that reducing sink activity in L. album is associated with a shift in metabolism away from starch and sucrose synthesis and towards sucrose catabolism, galactinol utilisation and the synthesis of raffinose-family oligosaccharides.
Abstract:
The purpose of this thesis is to study, investigate and compare the usability of open source CMSs. The thesis examines and compares the usability of selected open source content management systems. The research is divided into two complementary parts: a theoretical part and an analytical part. The theoretical part mainly describes open source web content management systems, usability, and the evaluation methods. The analytical part compares and analyzes the results of the empirical research. The heuristic evaluation method was used to identify usability problems in the interfaces. The study is fairly limited in scope; six tasks were designed and carried out in each interface to discover defects. Usability problems were rated according to their level of severity. The time taken by each task, the severity of each problem, and the type of heuristic violated were recorded, analyzed, and compared. The results of this study indicate that the compared systems all provide usable interfaces, with WordPress recognized as the most usable system.
Abstract:
The main generator source of a longitudinal muscle contraction was identified as an M (mechanical-stimulus-sensitive) circuit composed of a presynaptic M-1 neuron and a postsynaptic M-2 neuron in the ventral nerve cord of the earthworm, Amynthas hawayanus, by simultaneous intracellular response recording and Lucifer Yellow-CH injection with two microelectrodes. Five-peaked responses were evoked in both neurons by a mechanical, but not by an electrical, stimulus to the mechanoreceptor in the shaft of a seta at the opposite side of an epidermis-muscle-nerve-cord preparation. This response was correlated to 84% of the amplitude, 73% of the rising rate and 81% of the duration of a longitudinal muscle contraction recorded by a mechano-electrical transducer after eliminating the other possible generator sources by partitioning the epidermis-muscle piece of this preparation. The pre- and postsynaptic relationship between these two neurons was determined by alternately stimulating and recording with two microelectrodes. Images of the Lucifer Yellow-CH-filled M-1 and M-2 neurons showed that both of them are composed of bundles of longitudinal processes situated on the side of the nerve cord opposite to stimulation. The M-1 neuron has an afferent process (A1) in the first nerve at the stimulated side of this preparation and the M-2 neuron has two efferent processes (E1 and E3) in the first and third nerves at the recording side where their effector muscle cell was identified by a third microelectrode.
Abstract:
A constant facilitation of responses evoked in the earthworm muscle contraction generator neurons by responses evoked in neurons of its peripheral nervous system was demonstrated. It is based on the proposal that these two responses are bifurcations of an afferent response evoked by the same peripheral mechanical stimulus but converging again on this central neuron. A single-peaked generator response without facilitation was demonstrated by sectioning the afferent route of the peripheral facilitatory modulatory response, or conditioning response (CR). The multi-peaked response could be restored by restimulating the sectioned modulatory neuron with an intracellular substitutive conditioning stimulus (SCS). These multi-peaked responses were proposed to be the result of reverberation of the original single-peaked unconditioned response (UR) through a parallel (P) neuronal circuit which receives the facilitation of the peripheral modulatory neuron. This peripheral modulatory neuron was named the "Peri-Kästchen" (PK) neuron because it has about 20 peripheral processes distributed on the surface of a Kästchen of longitudinal muscle cells on the body wall of this preparation, as revealed by the Lucifer Yellow-CH filling method.
Abstract:
Innovative gas cooled reactors, such as the pebble bed reactor (PBR) and the gas cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. The Russian ASTRA criticality experiments were calculated. Pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided.
A novel method was developed and implemented as a MATLAB code to calculate porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between discrete-based reactor physics and continuum-based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to this specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely underpredicted by the realisable k-epsilon turbulence model used. An additional calculation with a v2-f turbulence model showed significant improvement in the heat transfer results, most likely due to the better performance of the model in separated flow problems. Further investigations are suggested before using CFD to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries.
It is suggested that the viewpoints of numerical modelling be included in the planning of experiments to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physical aspects of experiments should also be considered and documented in reasonable detail.
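The porosity-mapping idea described above (assigning a porosity to each cell of a CFD mesh laid over a DEM pebble bed) can be conveyed with a minimal Monte Carlo sketch. The thesis implements this in MATLAB; the Python function and names below are hypothetical illustrations of the principle, not the actual code: sample random points in a cell and count the fraction that fall outside every pebble.

```python
import numpy as np

def cell_porosity(cell_min, cell_max, centers, radius, n_samples=10_000, rng=None):
    """Estimate the porosity (void fraction) of one axis-aligned mesh cell
    by Monte Carlo sampling: the fraction of random points inside the cell
    that lie outside every pebble of the given radius."""
    if rng is None:
        rng = np.random.default_rng(0)
    pts = rng.uniform(cell_min, cell_max, size=(n_samples, 3))
    # squared distance from each sample point to every pebble centre
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    inside_any_pebble = (d2 <= radius ** 2).any(axis=1)
    return 1.0 - inside_any_pebble.mean()

# sanity check: one pebble of radius 0.5 centred in a unit cell,
# so the true porosity is 1 - pi/6 (about 0.476)
centers = np.array([[0.5, 0.5, 0.5]])
phi = cell_porosity(np.zeros(3), np.ones(3), centers, 0.5)
```

A real implementation would loop this over all mesh cells (or use an analytic sphere-box intersection for speed), but the sampling estimate above is the simplest way to see how DEM sphere coordinates turn into a continuum porosity field.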
Abstract:
This thesis concentrates on the validation of the generic thermal hydraulic computer code TRACE against the challenges of the VVER-440 reactor type. The capability of the code to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator. The major difficulty here is not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440 specialty, the hot leg loop seals, challenges system codes functionally in general, but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of different experiments. The validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; they need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This method essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.
Abstract:
The ability to use the exact coordinates of pebbles and fuel particles in Monte Carlo reactor physics calculations is an important development step for pebble bed reactor modelling. It allows exact modelling of pebble bed reactors with realistic pebble beds, without placing the pebbles in regular lattices. In this study, the multiplication coefficient of the HTR-10 pebble bed reactor is calculated with the Serpent reactor physics code and, from this multiplication coefficient, the number of pebbles required for the critical load of the reactor. The multiplication coefficient is calculated using pebble beds produced with the discrete element method and three different material libraries in order to compare the results. The obtained results are lower than those measured at the experimental reactor and somewhat lower than those obtained with other codes in earlier studies.
Abstract:
Extant research on exchange-listed firms has acknowledged that the concentration of ownership and the identity of owners make a difference. In addition, studies indicate that firms with a dominant owner outperform firms with dispersed ownership. During the last few years, scholars have identified one group of owners in particular whose ownership stake in publicly listed firms is positively related to performance: the business family. While acknowledging that family firms represent a unique organizational form, scholars have drawn on various concepts and theories to understand how the family influences organizational processes and firm performance. Despite a multitude of research, scholars have not been able to present clear results on how firm performance is actually affected by the family. In other words, studies comparing the performance of listed family and other types of firms have remained descriptive in nature, since they lack empirical data and confirmation from family business representatives. What seems to be missing is a convincing theory that links involvement and its behavioral consequences. Accordingly, scholars have not yet come to a mutual understanding of what precisely constitutes a family business. The variety of different definitions and theories has, for instance, made it difficult to compare results. These two issues have hampered the development of a rigorous theory of family business. The overall objective of this study is to describe and understand how the family as a dominant owner can enhance firm performance and act as a source of sustainable success in listed companies. In more detail, in order to develop an understanding of the unique factors that can act as competitive advantages for listed family firms, this study is based on a qualitative approach and aims at theory development, not theory verification.
The data in this study consist of 16 thematic interviews with CEOs, board members, supervisory board chairs, and founders of Finnish listed family firms. The study consists of two parts. The first part introduces the research topic, research paradigm, methods, and publications, and also discusses the overall outcomes and contributions of the publications. The second part consists of four publications that address the research questions from different viewpoints. The analyses of this study indicate that family ownership in listed companies represents a structure that differs from the traditional agency and stewardship views, as well as from resource-based and stakeholder views. As opposed to these theories and to shareholder capitalism, which consider humans individualistic, opportunistic, and self-serving, and assume that an investor's behavior is driven by incentives and motivations to maximize private profit, family owners form a collective social unit motivated to act together toward a mutual purpose or benefit. In addition, the socio-emotional and psychological elements of ownership, rather than its legal and financial dimensions, define the family members as owners. That is, the collective psychological ownership of the family over the business (F-CPO) can be seen as a construct that comprehensively captures the fusion between the family and the business. Moreover, it captures the realized, rather than merely potential, family influence on and interaction with the business, thereby bringing more theoretical clarity to the nature of this fusion and offering a solution to the problem of defining a family business. This doctoral dissertation provides academics, policy-makers, family business practitioners, and society at large with many implications concerning family and business relationships.
Abstract:
We have developed software called pp-Blast that uses the publicly available Blast package and PVM (parallel virtual machine) to partition a multi-sequence query across a set of nodes with replicated or shared databases. Benchmark tests show that pp-Blast running on a cluster of 14 PCs outperformed conventional Blast running on large servers. In addition, using pp-Blast and the cluster, we were able to map all human cDNAs onto the draft of the human genome in less than 6 days. We propose that the cost/benefit ratio of pp-Blast makes it appropriate for large-scale sequence analysis. The source code and configuration files for pp-Blast are available at http://www.ludwig.org.br/biocomp/tools/pp-blast.
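The core idea above, partitioning a multi-sequence query across worker nodes, can be sketched in a few lines. pp-Blast itself is built on PVM and its actual implementation is not shown here; the Python function below is a hypothetical illustration of a naive round-robin split of a FASTA query (it assumes ">" only appears at record starts).

```python
def partition_queries(fasta_text, n_nodes):
    """Round-robin split of a multi-sequence FASTA query into one
    chunk per worker node, so each node searches a subset of queries
    against its local copy of the database."""
    records = []
    for block in fasta_text.split(">"):
        if block.strip():
            records.append(">" + block.rstrip("\n") + "\n")
    chunks = [[] for _ in range(n_nodes)]
    for i, record in enumerate(records):
        chunks[i % n_nodes].append(record)
    return ["".join(chunk) for chunk in chunks]

# three query sequences spread over two hypothetical nodes
fasta = ">q1\nACGT\n>q2\nGGCC\n>q3\nTTAA\n"
parts = partition_queries(fasta, 2)
```

Because each query sequence is independent, the per-node result files can simply be concatenated afterwards, which is what makes this embarrassingly parallel scheme attractive for clusters of commodity PCs.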
Abstract:
We transplanted 47 patients with Fanconi anemia using an alternative source of hematopoietic cells. The patients were assigned to the following groups: group 1, unrelated bone marrow (N = 15); group 2, unrelated cord blood (N = 17); and group 3, related non-sibling bone marrow (N = 15). Twenty-four patients (51%) had complete engraftment, which was not influenced by gender (P = 0.87), age (P = 0.45), dose of cyclophosphamide (P = 0.80), nucleated cell dose infused (P = 0.60), or use of anti-T serotherapy (P = 0.20). Favorable factors for superior engraftment were full HLA compatibility (independent of the source of cells; P = 0.007) and use of a fludarabine-based conditioning regimen (P = 0.046). Unfavorable factors were ≥25 transfusions pre-transplant (P = 0.011) and degree of HLA disparity (P = 0.007). Intensity of mucositis (P = 0.50) and use of androgen prior to transplant (P = 0.80) had no influence on survival. Acute graft-versus-host disease (GVHD) grade II-IV and chronic GVHD were diagnosed in 47 and 23% of available patients, respectively, and infections prevailed as the main cause of death, associated or not with GVHD. Eighteen patients are alive, the Kaplan-Meier overall survival is 38% at ~8 years, and the best results were obtained in the related non-sibling bone marrow group. Three recommendations emerged from the present study: fludarabine as part of conditioning, transplant in patients with <25 transfusions, and avoidance of HLA disparity. In addition, an extended family search (even when consanguinity is not present) seeking a related non-sibling donor is highly recommended.
Abstract:
The pipeline for macro- and microarray analyses (PMmA) is a set of scripts with a web interface developed to analyze DNA array data generated by array image quantification software. PMmA is designed for use with single- or double-color array data and works as a pipeline with five classes (data format, normalization, data analysis, clustering, and array maps). It can also be used as a plugin in the BioArray Software Environment, an open-source database for array analysis, or in a local version of the web service. All scripts in PMmA were developed in the PERL programming language, and the statistical analysis functions were implemented in the R statistical language. Consequently, our package is platform-independent software. Our algorithms can correctly select almost 90% of the differentially expressed genes, showing superior performance compared to other methods of analysis. The pipeline software has been applied to public macroarray data of 1536 expressed sequence tags of sugarcane exposed to cold for 3 to 48 h. PMmA identified thirty cold-responsive genes previously unidentified in this public dataset. Fourteen genes were up-regulated, two had variable expression, and the other fourteen were down-regulated in the treatments. These new findings are most likely a consequence of using a superior statistical analysis approach, since the original study did not take into account the dependence of data variability on the average signal intensity of each gene. The web interface, supplementary information, and the package source code are available, free, to non-commercial users at http://ipe.cbmeg.unicamp.br/pub/PMmA.
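The statistical point made above, that the variability of array data depends on each gene's average signal intensity, can be sketched with a simple intensity-binned Z-score: genes are grouped by average intensity, and each gene's log-ratio is judged against the spread of its own intensity bin rather than a single global threshold. PMmA itself is written in PERL and R; the Python function below is a hypothetical illustration of the general idea, not the package's actual algorithm.

```python
import numpy as np

def intensity_dependent_z(log_ratio, avg_intensity, n_bins=5, z_cut=2.0):
    """Flag differentially expressed genes with a Z-score computed within
    intensity bins, so the variability estimate used for each gene depends
    on its average signal intensity."""
    order = np.argsort(avg_intensity)          # sort genes by intensity
    bins = np.array_split(order, n_bins)       # equal-sized intensity bins
    flagged = np.zeros(log_ratio.shape, dtype=bool)
    for idx in bins:
        mu, sd = log_ratio[idx].mean(), log_ratio[idx].std()
        if sd > 0:
            flagged[idx] = np.abs(log_ratio[idx] - mu) > z_cut * sd
    return flagged

# toy data: 100 genes, one clear outlier in the low-intensity bin
log_ratio = np.zeros(100)
log_ratio[0] = 2.0
flags = intensity_dependent_z(log_ratio, np.arange(100.0))
```

A global cutoff would treat all genes alike; binning by intensity lets low-intensity genes, which are typically noisier, face a wider tolerance than bright, stable ones.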
Abstract:
In the present study, we investigated the in vitro antitumoral activities of fractions from aqueous extracts of the husk fiber of the typical A and common varieties of Cocos nucifera (Palmae). Cytotoxicity against leukemia cells was determined by the 3-[4,5-dimethylthiazol-2-yl]-2,5-diphenyltetrazolium bromide (MTT) assay. Cells (2 × 10⁴/well) were incubated with 0, 5, 50 or 500 µg/mL high- or low-molecular weight fractions for 48 h, treated with MTT, and absorbance was measured with an ELISA reader. The results showed that the two varieties have similar antitumoral activity against the leukemia cell line K562 (60.1 ± 8.5 and 47.5 ± 11.9% for the typical A and common varieties, respectively). Separation of the crude extracts with Amicon membranes yielded fractions with molecular weights ranging from 1-3 kDa (fraction A) and 3-10 kDa (fraction B) to more than 10 kDa (fraction C). Cells were treated with 500 µg/mL of these fractions and cytotoxicity was evaluated by MTT. Fractions ranging in molecular weight from 1-10 kDa had higher cytotoxicity. Interestingly, C. nucifera extracts were also active against Lucena 1, a multidrug-resistant leukemia cell line. Their cytotoxicity against this cell line was about 50% (51.9 ± 3.2 and 56.3 ± 2.9 for the typical A and common varieties, respectively). Since the common C. nucifera variety is extensively cultured in Brazil and its husk fiber is an industrial by-product, the results obtained in the present study suggest that it might be a very inexpensive source of new antineoplastic and anti-multidrug-resistance drugs that warrants further investigation.