15 results for Setup

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

The question of what a business-to-business (B2B) collaboration setup and enactment application system should look like remains open. An important element of such collaboration is the inter-organizational disclosure of business-process details in a way that lets the collaborating parties protect their business secrets. For that purpose, eSourcing [37] has been developed as a general business-process collaboration concept in the framework of the EU research project CrossWork. The eSourcing characteristics guide the design and evaluation of an eSourcing Reference Architecture (eSRA) that serves as a starting point for software developers of B2B-collaboration systems. In this paper we present the results of a scenario-based evaluation method conducted with the earlier specified eSourcing Architecture (eSA); the evaluation yields risks, sensitivity points, and tradeoff points that must be attended to if eSA is implemented. Additionally, the evaluation method detects shortcomings of eSA in terms of the integrated components required for electronic B2B collaboration. The evaluation results are used for the specification of eSRA, which comprises, on three refinement levels, all extensions needed to incorporate the results of the scenario-based evaluation.
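The kind of output such a scenario-based evaluation produces (risks, sensitivity points, and tradeoff points attached to architectural components) can be pictured with a small data-structure sketch. This is a hypothetical illustration; the class names, fields, and example content are invented and not taken from the paper.

```python
# Hypothetical sketch of how scenario-based evaluation findings could be
# recorded; names and example content are illustrative, not from the paper.
from dataclasses import dataclass, field

@dataclass
class Finding:
    scenario: str        # collaboration scenario that was walked through
    component: str       # architectural component the finding refers to
    kind: str            # "risk", "sensitivity" or "tradeoff"
    description: str

@dataclass
class EvaluationReport:
    architecture: str
    findings: list = field(default_factory=list)

    def of_kind(self, kind: str):
        return [f for f in self.findings if f.kind == kind]

report = EvaluationReport("eSA")
report.findings.append(Finding(
    scenario="cross-organizational process monitoring",
    component="process-view projection",
    kind="risk",
    description="too detailed a projection may expose business secrets"))
print([f.component for f in report.of_kind("risk")])
```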

Relevance:

10.00%

Publisher:

Abstract:

Recent epidemiological studies have shown a consistent association of the mass concentration of urban air thoracic (PM10) and fine (PM2.5) particles with mortality and morbidity among cardiorespiratory patients. However, the chemical characteristics of different particulate size ranges and the biological mechanisms responsible for these adverse health effects are not well known. The principal aims of this thesis were to validate a high volume cascade impactor (HVCI) for the collection of particulate matter for physicochemical and toxicological studies, and to make an in-depth chemical and source characterisation of samples collected during different pollution situations. The particulate samples were collected with the HVCI, virtual impactors and a Berner low pressure impactor in six European cities: Helsinki, Duisburg, Prague, Amsterdam, Barcelona and Athens. The samples were analysed for particle mass, common ions, total and water-soluble elements, as well as elemental and organic carbon. Laboratory calibration and field comparisons indicated that the HVCI can provide unique large-capacity, high-efficiency sampling of size-segregated aerosol particles. The cutoff sizes of the recommended HVCI configuration were 2.4, 0.9 and 0.2 μm. The HVCI mass concentrations were in good agreement with the reference methods, but the chemical composition of especially the fine particulate samples showed some differences. This implies that the chemical characterisation of the exposure variable in toxicological studies needs to be done on the same HVCI samples as are used in cell and animal studies. The data from parallel, low volume reference samplers provide valuable additional information for chemical mass closure and source assessment. The major components of PM2.5 in the virtual impactor samples were carbonaceous compounds, secondary inorganic ions and sea salt, whereas those of coarse particles (PM2.5-10) were soil-derived compounds, carbonaceous compounds, sea salt and nitrate. The major and minor components together accounted for 77-106% and 77-96% of the gravimetrically measured masses of fine and coarse particles, respectively. Relatively large differences between sampling campaigns were observed in the organic carbon content of the PM2.5 samples as well as in the mineral composition of the PM2.5-10 samples. A source assessment based on chemical tracers suggested clear differences in the dominant sources (e.g. traffic, residential heating with solid fuels, metal industry plants, regional or long-range transport) between the sampling campaigns. In summary, the field campaigns exhibited different profiles with regard to particulate sources, size distribution and chemical composition, thus providing a highly useful setup for toxicological studies on the size-segregated HVCI samples.
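As a rough illustration of the chemical mass closure mentioned above, the analysed components are summed and compared with the gravimetrically measured mass; the closure percentage then indicates how completely the measured chemistry accounts for the weighed mass. The component names and concentrations below are illustrative only, not data from the thesis.

```python
# A minimal sketch of a chemical mass closure calculation: the analysed
# components are summed and compared with the gravimetric mass.
# Values are illustrative, not data from the thesis.
analysed = {
    "organic matter": 6.1,        # µg/m³
    "elemental carbon": 1.2,
    "secondary inorganic ions": 4.5,
    "sea salt": 0.8,
    "soil-derived": 0.9,
}
gravimetric_mass = 14.0           # µg/m³, from weighing the filter

closure = 100.0 * sum(analysed.values()) / gravimetric_mass
print(f"mass closure: {closure:.0f}% of the gravimetric PM2.5 mass")
```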

Relevance:

10.00%

Publisher:

Abstract:

This thesis contains five experimental spectroscopic studies that probe the vibration-rotation energy level structure of acetylene and some of its isotopologues. The emphasis is on the development of laser spectroscopic methods for high-resolution molecular spectroscopy. Three of the experiments use cavity ringdown spectroscopy. One is a standard setup that employs a non-frequency-stabilised continuous-wave laser as a source. In the other two experiments, the same laser is actively frequency-stabilised to the ringdown cavity. This development allows an increased repetition rate of the experimental signal and thus improves the spectroscopic sensitivity of the method. These setups are applied to the recording of several vibration-rotation overtone bands of both H¹²C¹²CH and H¹³C¹³CH. An intra-cavity laser absorption spectroscopy setup that uses a commercial continuous-wave ring laser and a Fourier transform interferometer is presented. The configuration of the laser is found to be sub-optimal for high-sensitivity work, but the spectroscopic results are good and show the viability of this type of approach. Several ro-vibrational bands of carbon-13-substituted acetylenes are recorded and analysed. Compared with earlier work, the signal-to-noise ratio of a laser-induced dispersed infrared fluorescence experiment is enhanced by more than one order of magnitude by exploiting the geometric characteristics of the setup. The higher sensitivity of the spectrometer leads to the observation of two new symmetric vibrational states of H¹²C¹²CH. The precision of the spectroscopic parameters of some previously published symmetric states is also improved. An interesting collisional energy transfer process is observed for the excited vibrational states, and this phenomenon is explained by a simple step-down model.
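In cavity ringdown spectroscopy the sample absorption follows from the change in the decay time of light stored in the cavity. The sketch below applies this standard textbook relation with made-up ringdown times; it is not code from the thesis.

```python
# A minimal sketch of the standard CRDS relation: the absorption coefficient
# follows from the ringdown times measured with and without the absorber.
c = 2.998e8  # speed of light, m/s

def absorption_coefficient(tau, tau_empty):
    """alpha = (1/c) * (1/tau - 1/tau_empty), in 1/m."""
    return (1.0 / c) * (1.0 / tau - 1.0 / tau_empty)

# illustrative ringdown times in seconds
print(absorption_coefficient(tau=9.5e-6, tau_empty=10.0e-6))
```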

Relevance:

10.00%

Publisher:

Abstract:

Topic detection and tracking (TDT) is an area of information retrieval research that focuses on news events. The problems TDT deals with relate to segmenting news text into cohesive stories, detecting something new and previously unreported, tracking the development of a previously reported event, and grouping together news stories that discuss the same event. The performance of traditional information retrieval techniques based on full-text similarity has remained inadequate for online production systems. It has been difficult to make the distinction between same and similar events. In this work, we explore ways of representing and comparing news documents in order to detect new events and track their development. First, however, we put forward a conceptual analysis of the notions of topic and event. The purpose is to clarify the terminology and align it with the process of news-making and the tradition of story-telling. Second, we present a framework for document similarity that is based on semantic classes, i.e., groups of words with similar meaning. We adopt people, organizations, and locations as semantic classes in addition to general terms. As each semantic class can be assigned its own similarity measure, document similarity can make use of ontologies, e.g., geographical taxonomies. The documents are compared class-wise, and the outcome is a weighted combination of class-wise similarities. Third, we incorporate temporal information into document similarity. We formalize the natural language temporal expressions occurring in the text and use them to anchor the rest of the terms onto the time-line. When comparing documents for event-based similarity, we look not only at matching terms, but also at how near their anchors are on the time-line. Fourth, we experiment with an adaptive variant of the semantic class similarity system. The news reflects changes in the real world, and in order to keep up, the system has to change its behavior based on the contents of the news stream. We put forward two strategies for rebuilding the topic representations and report experimental results. We run experiments with three annotated TDT corpora. The use of semantic classes increased the effectiveness of topic tracking by 10-30% depending on the experimental setup. The gain in spotting new events remained lower, around 3-4%. Anchoring the text to a time-line based on the temporal expressions gave a further 10% increase in the effectiveness of topic tracking. The gains in detecting new events, again, remained smaller. The adaptive systems did not improve the tracking results.
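The class-wise comparison described above can be sketched as follows. The sketch uses plain cosine similarity within each class and fixed class weights; the actual measures and weights in the thesis may differ, and the example documents are invented.

```python
# A minimal sketch (not the thesis implementation) of class-wise document
# similarity: terms are grouped into semantic classes, each class is compared
# with its own measure, and the results are combined with class weights.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    common = set(a) & set(b)
    num = sum(a[t] * b[t] for t in common)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def class_similarity(doc1, doc2, weights):
    """doc = {class_name: Counter of terms}; weights should sum to 1."""
    return sum(w * cosine(doc1.get(c, Counter()), doc2.get(c, Counter()))
               for c, w in weights.items())

d1 = {"locations": Counter({"helsinki": 2}), "terms": Counter({"storm": 3, "damage": 1})}
d2 = {"locations": Counter({"helsinki": 1}), "terms": Counter({"storm": 2, "flood": 1})}
print(class_similarity(d1, d2, {"locations": 0.3, "persons": 0.2,
                                "organizations": 0.2, "terms": 0.3}))
```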

Relevance:

10.00%

Publisher:

Abstract:

In my thesis I have been studying the effects of population fragmentation and extinction-recolonization dynamics on genetic and evolutionary processes in the Glanville fritillary butterfly (Melitaea cinxia). By conducting crosses within and among newly colonized populations and using several fitness measures, I found a strong decrease in fitness following colonization by a few related individuals, and a strong negative relationship between parental relatedness and offspring fitness. Thereafter, I was interested in determining the number and relatedness of individuals colonizing new populations, which I did using a set of microsatellites I had previously developed for this species. Additionally, I am interested in the evolution of key life-history traits. By following the lifetime reproductive success of males emerging at different times in a semi-natural setup, I demonstrated that protandry is adaptive in males, and I was able to rule out, for M. cinxia, alternative incidental hypotheses invoked to explain the evolution of protandry in insects. Finally, in work done together with Prof. Hanna Kokko, I propose bet-hedging as a new mechanism that could explain the evolution of polyandry in M. cinxia.

Relevance:

10.00%

Publisher:

Abstract:

The Transition Radiation Tracker (TRT) of the ATLAS experiment at the LHC is part of the Inner Detector. It is designed as a robust and powerful gaseous detector that provides tracking through individual drift tubes (straws) as well as particle identification via transition radiation (TR) detection. The straw tubes are operated with Xe-CO2-O2 70/27/3, a gas that combines the advantages of efficient TR absorption, a short electron drift time and minimal ageing effects. The modules of the barrel part of the TRT were built in the United States, while the end-cap wheels are assembled at two Russian institutes. Acceptance tests of barrel modules and end-cap wheels are performed at CERN before assembly and integration with the Semiconductor Tracker (SCT) and the Pixel Detector. This thesis first describes simulations of the TRT straw tube. The argon-based acceptance gas mixture as well as two xenon-based operating gases are examined for their properties. Drift velocities and Townsend coefficients are computed with the help of the program Magboltz and used to study electron drift and multiplication in the straw using the software Garfield. The inclusion of Penning transfers in the avalanche process leads to remarkable agreement with experimental data. A high level of cleanliness in the TRT's acceptance-test gas system is indispensable. To monitor gas purity, a small straw tube detector has been constructed and extensively used to study the ageing behaviour of the straw tube in Ar-CO2. A variety of ageing tests are presented and discussed. Acceptance tests for the TRT survey dimensions, wire tension, gas-tightness, high-voltage stability and gas gain uniformity along each individual straw. The thesis gives details on the acceptance criteria and measurement methods in the case of the end-cap wheels. Special focus is put on wire tension and straw straightness. The effect of geometrically deformed straws on gas gain and energy resolution is examined in an experimental setup and compared to simulation studies. An overview of the most important results from the end-cap wheels tested up to this point is presented.
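The gas gain of a straw can be estimated by integrating the first Townsend coefficient over the avalanche region in the cylindrical field of the tube. The sketch below does this numerically; the Townsend parameterisation and the operating values are illustrative stand-ins for quantities that would in practice come from a program such as Magboltz, and the result is not a TRT number.

```python
# A minimal sketch of estimating gas gain in a cylindrical straw from a
# Townsend coefficient alpha(E): G = exp( integral of alpha over the
# avalanche region ). alpha_of_E stands in for Magboltz-computed values;
# the parameterisation and voltage below are illustrative only.
import math

a = 15e-6      # anode wire radius, m (illustrative)
b = 2e-3       # straw inner radius, m (TRT straws are 4 mm in diameter)
V = 1500.0     # anode voltage, V (illustrative)

def field(r):                      # E(r) of a cylindrical counter, V/m
    return V / (r * math.log(b / a))

def alpha_of_E(E):                 # first Townsend coefficient, 1/m (toy model)
    A, B = 5.0e5, 1.0e7            # illustrative constants
    return A * math.exp(-B / E)

# integrate alpha from the wire surface outwards (trapezoidal rule)
n = 2000
rs = [a + (b - a) * i / n for i in range(n + 1)]
integral = sum(0.5 * (alpha_of_E(field(r1)) + alpha_of_E(field(r2))) * (r2 - r1)
               for r1, r2 in zip(rs, rs[1:]))
print("gas gain ~", math.exp(integral))
```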

Relevance:

10.00%

Publisher:

Abstract:

Nucleation is the first step in the formation of a new phase inside a mother phase. Two main forms of nucleation can be distinguished. In homogeneous nucleation, the new phase is formed in a uniform substance. In heterogeneous nucleation, on the other hand, the new phase emerges on a pre-existing surface (nucleation site). Nucleation is the source of about 30% of all atmospheric aerosol, which in turn has noticeable health effects and a significant impact on climate. Nucleation can be observed in the atmosphere and studied experimentally in the laboratory, and it is the subject of ongoing theoretical research. This thesis attempts to be a link between experiment and theory. By comparing simulation results to experimental data, the aim is to (i) better understand the experiments and (ii) determine where the theory needs improvement. Computational fluid dynamics (CFD) tools were used to simulate homogeneous one-component nucleation of n-alcohols in argon and helium as carrier gases, homogeneous nucleation in the water-sulfuric acid system, and heterogeneous nucleation of water vapor on silver particles. In the nucleation of n-alcohols, vapor depletion, the carrier gas effect and the carrier gas pressure effect were evaluated, with a special focus on the pressure effect, whose dependence on vapor and carrier gas properties could be specified. The investigation of nucleation in the water-sulfuric acid system included a thorough analysis of the experimental setup, determining flow conditions, vapor losses, and the nucleation zone. Experimental nucleation rates were compared to various theoretical approaches. We found that none of the considered theoretical descriptions of nucleation captured the role of water in the process at all relative humidities. Heterogeneous nucleation was studied in the activation of silver particles in a TSI 3785 particle counter, which uses water as its working fluid. The role of the contact angle was investigated, and the influence of incoming particle concentrations and homogeneous nucleation on counting efficiency was determined.
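For reference, one standard theoretical baseline against which experimental homogeneous nucleation rates are commonly compared is classical nucleation theory, in which the rate is controlled by the free-energy barrier of forming a critical cluster. The sketch below evaluates that textbook expression with invented, n-alcohol-like numbers and a fixed kinetic prefactor; it is not code from the thesis and is not claimed to be one of the specific approaches examined there.

```python
# A minimal sketch of the classical nucleation theory (CNT) barrier and rate
# for homogeneous one-component nucleation. The kinetic prefactor J0 and all
# input values are illustrative.
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K

def cnt_rate(S, T, sigma, v_mol, J0=1e25):
    """S: saturation ratio, sigma: surface tension (N/m),
    v_mol: molecular volume (m^3), J0: kinetic prefactor (1/(m^3 s))."""
    dG_star = 16.0 * math.pi * sigma**3 * v_mol**2 / (3.0 * (k_B * T * math.log(S))**2)
    return J0 * math.exp(-dG_star / (k_B * T))

# n-butanol-like illustrative numbers
print(cnt_rate(S=5.0, T=265.0, sigma=0.026, v_mol=1.5e-28))
```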

Relevance:

10.00%

Publisher:

Abstract:

When ordinary nuclear matter is heated to a high temperature of ~ 10^12 K, it undergoes a deconfinement transition to a new phase, the strongly interacting quark-gluon plasma. While the color-charged fundamental constituents of the nuclei, the quarks and gluons, are at low temperatures permanently confined inside color-neutral hadrons, in the plasma the color degrees of freedom become dominant over nuclear, rather than merely nucleonic, volumes. Quantum Chromodynamics (QCD) is the accepted theory of the strong interactions, and it confines quarks and gluons inside hadrons. The theory was formulated in the early seventies, but deriving first-principles predictions from it still remains a challenge, and novel methods of studying it are needed. One such method is dimensional reduction, in which the high-temperature dynamics of static observables of the full four-dimensional theory are described using a simpler three-dimensional effective theory, having only the static modes of the various fields as its degrees of freedom. A perturbatively constructed effective theory is known to provide a good description of the plasma at high temperatures, where asymptotic freedom makes the gauge coupling small. In addition, numerical lattice simulations have shown that the perturbatively constructed theory gives a surprisingly good description of the plasma all the way down to temperatures a few times the transition temperature. Near the critical temperature, however, the effective theory ceases to give a valid description of the physics, since it fails to respect the approximate center symmetry of the full theory. The symmetry plays a key role in the dynamics near the phase transition, and thus one expects that the regime of validity of the dimensionally reduced theories can be significantly extended towards the deconfinement transition by incorporating the center symmetry in them. In the introductory part of the thesis, the status of dimensionally reduced effective theories of high-temperature QCD is reviewed, placing emphasis on the phase structure of the theories. In the first research paper included in the thesis, the non-perturbative input required for computing the g^6 term in the weak-coupling expansion of the pressure of QCD is computed in the effective theory framework for an arbitrary number of colors. The last two papers, on the other hand, focus on the construction of the center-symmetric effective theories, and subsequently the first non-perturbative studies of these theories are presented. Non-perturbative lattice simulations of a center-symmetric effective theory for SU(2) Yang-Mills theory show, in sharp contrast to the perturbative setup, that the effective theory accommodates a phase transition in the correct universality class of the full theory. This transition is seen to take place at a value of the effective theory coupling constant that is consistent with the full theory coupling at the critical temperature.
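To place the g^6 term mentioned above, the weak-coupling expansion of the QCD pressure has, schematically, the following structure (coefficients suppressed); the constant part of the order-g^6 term is the first coefficient that cannot be obtained from perturbation theory alone and requires non-perturbative input from the three-dimensional effective theory.

```latex
\[
  \frac{p(T)}{T^{4}} \;\sim\; c_{0} + c_{2}\,g^{2} + c_{3}\,g^{3}
    + \bigl(c_{4}'\ln g + c_{4}\bigr)\,g^{4} + c_{5}\,g^{5}
    + \bigl(c_{6}'\ln g + c_{6}\bigr)\,g^{6} + \mathcal{O}(g^{7})
\]
```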

Relevance:

10.00%

Publisher:

Abstract:

Inter-enterprise collaboration has become essential for the success of enterprises. As competition increasingly takes place between supply chains and networks of enterprises, there is a strategic business need to participate in multiple collaborations simultaneously. Collaborations based on an open market of autonomous actors set special requirements for the computing facilities supporting the setup and management of these business networks of enterprises. Currently, the safeguards against privacy threats in collaborations crossing organizational borders are both insufficient and incompatible with the open market. A broader understanding is needed of the architecture of defense structures, and privacy threats must be detected not only on the level of a private person or enterprise, but on the community and ecosystem levels as well. Control measures must be automated wherever possible in order to keep the cost and effort of collaboration management reasonable. This article contributes to the understanding of the modern inter-enterprise collaboration environment and the privacy threats in it, and presents the automated control measures required to ensure that actors in inter-enterprise collaborations behave correctly to preserve privacy.

Relevance:

10.00%

Publisher:

Abstract:

The liquidity crisis that swept through the financial markets in 2007 triggered multi-billion losses and forced buyouts of some large banks. The resulting credit crunch is sometimes compared to the Great Depression of the early twentieth century. But the crisis also serves as a reminder of the significance of the interbank market and of proper central bank policy in this market. This thesis deals with the implementation of monetary policy in the interbank market and examines how central bank tools affect commercial banks' decisions. I answer the following questions:
• What is the relationship between the policy setup and interbank interest rate volatility? (an averaging reserve requirement reduces the volatility)
• What can explain a weak relationship between market liquidity and the interest rate? (a high reserve requirement buffer)
• What determines banks' decisions on when to satisfy the reserve requirement? (market frictions)
• How did the liquidity crisis that began in 2007 affect interbank market behaviour? (it resulted in higher credit risk and trading frictions as well as an expected liquidity shortage)
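The reserve-requirement buffer referred to in the answers can be illustrated with a small arithmetic sketch: under reserve averaging, only the average of daily holdings over the maintenance period has to meet the requirement, so reserves held in excess early in the period relax what the bank needs later. The numbers are invented and the sketch only illustrates the mechanism; it is not a model from the thesis.

```python
# A minimal sketch of reserve averaging: only the average of daily reserve
# holdings over the maintenance period must meet the requirement, so a bank
# that has built up a buffer can absorb a liquidity shock without borrowing.
# Numbers are illustrative.
requirement = 100.0                 # required average daily reserves
period_length = 10                  # days in the maintenance period
held_so_far = [104.0, 103.0, 105.0, 102.0]   # daily holdings to date

days_left = period_length - len(held_so_far)
needed_rest = (requirement * period_length - sum(held_so_far)) / days_left
print(f"average needed over the remaining {days_left} days: {needed_rest:.1f}")
# result < 100: the buffer lets the bank hold less than the requirement on
# later days, weakening the link between daily liquidity and the interbank rate.
```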

Relevance:

10.00%

Publisher:

Abstract:

The Hodgkin and Huxley (HH) model of the action potential has become a central paradigm of neuroscience. Despite its ability to predict action potentials with remarkable accuracy, it fails to explain several biophysical findings related to the initiation and propagation of the nerve impulse. The isentropic heat release and the optical phenomena demonstrated by various experiments suggest that the action potential is accompanied by a transient phase change in the axonal membrane. In this study a method was developed for preparing a giant axon from the crayfish abdominal cord so that the molecular mechanisms of the action potential can be studied simultaneously by electrophysiological and optical methods. An alternative setup using a single-cell culture of an Aplysia sensory neuron is also presented. In addition to the description of the method, preliminary results on the effect of phloretin, a dipole-potential-lowering compound, on the excitability of a crayfish giant axon are presented.

Relevance:

10.00%

Publisher:

Abstract:

Gene expression is one of the most critical factors influencing the phenotype of a cell. As a result of several technological advances, measuring gene expression levels has become one of the most common molecular biological measurements for studying the behaviour of cells. The scientific community has produced an enormous and constantly increasing collection of gene expression data from various human cells, both from healthy and from pathological conditions. However, while each of these studies is informative and enlightening in its own context and research setup, diverging methods and terminologies make it very challenging to integrate the existing gene expression data into a more comprehensive view of human transcriptome function. On the other hand, bioinformatic science advances only through data integration and synthesis. The aim of this study was to develop biological and mathematical methods to overcome these challenges, to construct an integrated database of the human transcriptome, and to demonstrate its usage. The methods developed in this study can be divided into two distinct parts. First, the biological and medical annotation of the existing gene expression measurements needed to be encoded by systematic vocabularies. There was no single existing biomedical ontology or vocabulary suitable for this purpose, so a new annotation terminology was developed as part of this work. The second part was to develop mathematical methods for correcting the noise and the systematic differences and errors in the data caused by the various array generations. Additionally, there was a need to develop suitable computational methods for sample collection and archiving, unique sample identification, database structures, data retrieval and visualization. Bioinformatic methods were developed to analyze gene expression levels and putative functional associations of human genes using the integrated gene expression data. A method to interpret individual gene expression profiles across all the healthy and pathological tissues of the reference database was also developed. As a result of this work, 9783 human gene expression samples measured with Affymetrix microarrays were integrated to form a unique human transcriptome resource, GeneSapiens. This makes it possible to analyse the expression levels of 17330 genes across 175 types of healthy and pathological human tissues. Application of this resource to interpret individual gene expression measurements allowed identification of the tissue of origin with 92.0% accuracy among 44 healthy tissue types. A systematic analysis of the transcriptional activity levels of 459 kinase genes was performed across 44 healthy and 55 pathological tissue types, and a genome-wide analysis of kinase gene co-expression networks was done. This analysis revealed biologically and medically interesting data on putative kinase gene functions in health and disease. Finally, we developed a method for alignment of gene expression profiles (AGEP) to perform analysis for individual patient samples and to pinpoint gene- and pathway-specific changes in the test sample in relation to the reference transcriptome database. We also showed how large-scale gene expression data resources can be used to quantitatively characterize changes in the transcriptomic program of differentiating stem cells. Taken together, these studies indicate the power of systematic bioinformatic analyses to infer biological and medical insights from existing published datasets as well as to facilitate the interpretation of new molecular profiling data from individual patients.
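The tissue-of-origin identification mentioned above can be pictured with a simple matching scheme: an individual profile is correlated against per-tissue reference profiles and assigned to the best match. This is a toy sketch on simulated data, not the GeneSapiens/AGEP implementation, and the tissue names and profiles are invented.

```python
# A minimal sketch (not the GeneSapiens/AGEP implementation) of identifying
# the tissue of origin of a single expression profile by correlating it
# against per-tissue reference profiles and picking the best match.
import numpy as np

rng = np.random.default_rng(0)
genes = 500
reference = {t: rng.lognormal(size=genes) for t in ("liver", "brain", "kidney")}

# an illustrative "patient" sample resembling liver, with measurement noise
sample = reference["liver"] * rng.lognormal(sigma=0.2, size=genes)

def best_match(sample, reference):
    scores = {t: np.corrcoef(np.log(sample), np.log(prof))[0, 1]
              for t, prof in reference.items()}
    return max(scores, key=scores.get), scores

tissue, scores = best_match(sample, reference)
print(tissue, {t: round(s, 3) for t, s in scores.items()})
```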

Relevance:

10.00%

Publisher:

Abstract:

The management and coordination of business-process collaboration is changing because of globalization, specialization, and innovation. Service-oriented computing (SOC) is a means towards business-process automation, and recently many industry standards have emerged to become part of the service-oriented architecture (SOA) stack. In a globalized world, organizations face new challenges in setting up and carrying out collaborations in semi-automated ecosystems for business services. To be efficient and effective, many companies express their services electronically in what we term business-process as a service (BPaaS). Companies then source BPaaS on the fly from third parties if they are not able to create all service value in-house, for reasons such as a lack of resources, a lack of know-how, or cost- and time-reduction needs. Thus, a need emerges for BPaaS-HUBs that not only store service offers and requests together with information about their issuing organizations and assigned owners, but also allow an evaluation of trust and reputation in an anonymized electronic service marketplace. In this paper, we analyze the requirements, design architecture and system behavior of such a BPaaS-HUB to enable a fast setup and enactment of business-process collaboration. Moving into a cloud-computing setting, the results of this paper allow system designers to quickly evaluate which services they need for instantiating the BPaaS-HUB architecture. Furthermore, the results show the protocol of a backbone service bus that allows communication between the services implementing the BPaaS-HUB. Finally, the paper analyzes where an instantiation must assign additional computing resources to avoid performance bottlenecks.
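A hypothetical sketch of the kind of records such a BPaaS-HUB could hold, and of how offers and requests might be matched with reputation taken into account; the field names and the matching rule are invented for illustration and are not taken from the paper.

```python
# A minimal, hypothetical sketch of BPaaS-HUB records: service offers and
# requests with issuing organization, assigned owner, and a reputation score
# used in the anonymized marketplace. Names and data are illustrative.
from dataclasses import dataclass

@dataclass
class ServiceEntry:
    kind: str               # "offer" or "request"
    process: str            # the business process offered/requested as a service
    organization: str
    owner: str
    reputation: float       # aggregated, anonymised trust score

hub = [
    ServiceEntry("offer", "invoice handling", "org-A", "alice", 4.6),
    ServiceEntry("request", "invoice handling", "org-B", "bob", 4.1),
]

# match requests with offers for the same process, preferring high reputation
offers = sorted((e for e in hub if e.kind == "offer"),
                key=lambda e: e.reputation, reverse=True)
for req in (e for e in hub if e.kind == "request"):
    match = next((o for o in offers if o.process == req.process), None)
    print(req.organization, "->", match.organization if match else None)
```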

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study was to investigate powder and tablet behavior at the level of mechanical interactions between single particles. Various aspects of powder packing, mixing, compression, and bond formation were examined with the aid of computer simulations. The packing and mixing simulations were based on spring forces acting between particles. The packing and breakage simulations included systems in which permanent bonds were formed and broken between particles, based on their interaction strengths. During the process, a new simulation environment based on Newtonian mechanics and elementary interactions between the particles was created, and a new method for evaluating mixing was developed. Powder behavior is a complicated process, and many of its aspects are still unclear. Powders as a whole exhibit some aspects of solids and others of liquids, so their physics is far from simple. However, using relatively simple models based on particle-particle interaction, many powder properties could be replicated during this work. Simulated packing densities were similar to values reported in the literature. The method developed for describing powder mixing correlated well with previous methods. The new method can be applied to determine mixing in completely homogeneous materials, without dividing them into different components. As such, it can describe the efficiency of the mixing method regardless of the powder's initial setup. The mixing efficiency at different vibrations was examined, and we found that certain combinations of amplitude, direction, and frequency resulted in better mixing while using less energy. Simulations using exponential force potentials between particles were able to explain the elementary compression behavior of tablets and to create force distributions similar to the pressure distributions reported in the literature. Tablet-breaking simulations resulted in breaking strengths that were similar to measured tablet breaking strengths. In general, many aspects of powder behavior can be explained by mechanical interactions at the particle level, and single-particle properties can be reliably linked to powder behavior with accurate simulations.
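A minimal sketch of the type of particle-level model described here: overlapping particles repel each other through a linear spring force, and positions are advanced with simple Newtonian (explicit Euler) integration. All parameters are illustrative and not the values used in the thesis.

```python
# A minimal sketch of a spring-force particle simulation: overlapping
# particles repel with a linear spring force; positions are advanced with
# explicit Euler integration. Parameters are illustrative.
import numpy as np

n, radius, k_spring, mass, dt = 20, 0.5, 50.0, 1.0, 1e-3
rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 5.0, size=(n, 2))
vel = np.zeros((n, 2))
gravity = np.array([0.0, -9.81])

for _ in range(500):
    force = np.tile(gravity * mass, (n, 1))
    for i in range(n):
        for j in range(i + 1, n):
            d = pos[i] - pos[j]
            dist = np.linalg.norm(d)
            overlap = 2 * radius - dist
            if overlap > 0 and dist > 0:          # spring force on overlap
                f = k_spring * overlap * d / dist
                force[i] += f
                force[j] -= f
    vel += force / mass * dt
    pos += vel * dt
    on_floor = pos[:, 1] < radius                 # crude floor constraint
    pos[on_floor, 1] = radius
    vel[on_floor, 1] = np.maximum(vel[on_floor, 1], 0.0)

print(pos.mean(axis=0))
```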

Relevance:

10.00%

Publisher:

Abstract:

Powders are essential materials in the pharmaceutical industry, being involved in the majority of all drug manufacturing. Powder flow and particle size are central particle properties addressed by means of particle engineering. The aim of the thesis was to gain knowledge on powder processing with restricted liquid addition, with a primary focus on particle coating and early granule growth. Furthermore, characterisation of this kind of process was performed. A thin coating layer of hydroxypropyl methylcellulose was applied to individual particles of ibuprofen in a fluidised bed top-spray process. The polymeric coating improved the flow properties of the powder. The improvement was strongly related to relative humidity, which can be seen as an indicator of a change in surface hydrophilicity caused by the coating. The ibuprofen used in the present study had a d50 of 40 μm and thus belongs to the Geldart group C powders, which can be considered challenging materials in top-spray coating processes. Ibuprofen was similarly coated using a novel ultrasound-assisted coating method. The results were in line with those obtained from powders coated in the fluidised bed process mentioned above. It was found that the ultrasound-assisted method was capable of coating single particles with a simple and robust setup. Granule growth in a fluidised bed process was inhibited by feeding the liquid in pulses. The results showed that the length of the pulsing cycles is of importance and can be used to adjust granule growth. Moreover, pulsed liquid feed was found to be of greater significance to granule growth at high inlet-air relative humidity. Liquid feed pulsing can thus be used as a tool in particle size targeting in fluidised bed processes and in compensating for changes in the relative humidity of the inlet air. The nozzle function of a two-fluid, external-mixing pneumatic nozzle, typical of small-scale pharmaceutical fluidised bed processes, was studied in situ in an ongoing fluidised bed process with particle tracking velocimetry. It was found that the liquid droplets undergo coalescence as they proceed away from the nozzle head. The coalescence was expected to increase droplet speed, which was confirmed in the study. The spray turbulence was also studied, and the results showed turbulence caused by the event of atomisation and by the oppositely directed fluidising air. It was concluded that particle tracking velocimetry is a suitable tool for in situ spray characterisation. Light transmission through dense particulate systems was found to carry information on particle size and packing density, as expected on the basis of the theory of light scattering by solids. It was possible to differentiate binary blends consisting of components with differences in optical properties. Light transmission showed potential as a rapid, simple and inexpensive tool for the characterisation of particulate systems, giving information on changes in particle systems that could be utilised in basic process diagnostics.
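As a qualitative illustration of why transmitted light carries particle size information, the sketch below uses a dilute, single-scattering (Beer-Lambert-like) approximation: at a fixed solid mass per area, smaller particles present more projected area and therefore transmit less light. This is only an illustration of the trend under stated assumptions, not the model or the data used in the thesis, and it does not capture the dense-system effects studied there.

```python
# A minimal sketch in the dilute single-scattering limit: transmission through
# a particle layer at fixed mass per area depends on particle diameter because
# projected area per unit mass scales as 1/d. Values are illustrative.
import math

def transmission(particle_diameter, mass_per_area, density, Q_ext=2.0):
    """Fraction of light transmitted through a dilute particle layer."""
    # projected area per unit mass ~ 3 / (2 * rho * d); extinction = Q_ext * area
    specific_area = 3.0 / (2.0 * density * particle_diameter)
    return math.exp(-Q_ext * specific_area * mass_per_area)

for d in (10e-6, 40e-6, 160e-6):                    # particle diameters, m
    print(f"d = {d*1e6:>5.0f} µm  T = {transmission(d, 0.05, 1500.0):.3f}")
```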