894 results for Design and Analysis of Computer Experiments (DACE)
Abstract:
During the last half decade the popularity of various peer-to-peer applications has grown tremendously. Traditionally, only desktop-class computers with fixed-line network connections have been powerful enough to run peer-to-peer applications. However, the situation is about to change. The rapid development of wireless terminals will soon enable peer-to-peer applications on these devices as well as on desktops. The possibilities are further enhanced by the upcoming high-bandwidth cellular networks. This thesis investigates the applicability and implementation alternatives of an existing peer-to-peer system for two target platforms: a Linux-powered iPaq and a Symbian OS based smartphone. The result is a peer-to-peer middleware component suitable for mobile terminals. It works on both platforms and uses Bluetooth networking technology. The implementations on the two platforms are compatible with each other, and support for additional network technologies can be added with minimal effort.
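As a rough illustration of the kind of network abstraction such a middleware needs (the names and interfaces below are hypothetical, not taken from the thesis), the peer-to-peer logic can be written against a transport interface so that technologies other than Bluetooth can be plugged in:

    import abc

    class Transport(abc.ABC):
        # Hypothetical transport interface; each network technology implements it.

        @abc.abstractmethod
        def connect(self, peer_address: str) -> None: ...

        @abc.abstractmethod
        def send(self, payload: bytes) -> None: ...

    class BluetoothTransport(Transport):
        def connect(self, peer_address: str) -> None:
            # On a real device this would open an RFCOMM connection.
            print(f"connecting to {peer_address} over Bluetooth")

        def send(self, payload: bytes) -> None:
            print(f"sending {len(payload)} bytes")

    class Middleware:
        # The P2P logic depends only on Transport, not on a concrete technology.
        def __init__(self, transport: Transport):
            self.transport = transport

        def publish(self, peer: str, message: bytes) -> None:
            self.transport.connect(peer)
            self.transport.send(message)

    Middleware(BluetoothTransport()).publish("00:11:22:33:44:55", b"hello")

Adding a new network technology then amounts to providing another Transport implementation.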
Abstract:
The Nokia Push To Talk system offers a new method of communication alongside the ordinary phone call. One of the most important characteristics of the new system is the speed of call setup. In addition, the system must follow the general principles of telecommunication systems and be as stable and scalable as possible, so that it is maximally fault-tolerant and extensible. The main goal of this Master's thesis is to present the design and testing of C++ database libraries. First, the problems of database systems are examined, starting from the selection of a database system and paying particular attention to speed criteria. Then, two technical implementations for two C++ database libraries are presented, and some alternative implementation approaches are discussed.
Abstract:
Industrial applications nowadays increasingly require real-time data processing. Reliability is one of the most important characteristics of a system capable of real-time data processing. To achieve it, both the hardware and the software must be tested. The main subject of this work is hardware testing and hardware testability, because a reliable hardware platform is the foundation for the real-time systems of the future. The thesis presents the design of a processor board suitable for digital signal processing. The processor board is intended for predictive condition monitoring of electrical machines. The latest DFT (Design for Testability) methods are introduced and applied in the design of the processor board together with older methods. Experiences and observations on the applicability of the methods are reported at the end of the work. The goal of the work is to develop a component for a web-based monitoring system being developed at the Department of Electrical Engineering of Lappeenranta University of Technology.
Abstract:
The aim of this study is to define a new statistic, PVL, based on the relative distance between the likelihood associated with the simulation replications and the likelihood of the conceptual model. Our results, coming from several simulation experiments of a clinical trial, show that the range of the PVL statistic can be a good measure of stability for establishing when a computational model verifies the underlying conceptual model. PVL also improves the analysis of simulation replications because a single statistic summarizes all the replications. The study also presents several verification scenarios, obtained by altering the simulation model, that show the usefulness of PVL. Further simulation experiments suggest that a 0 to 20 % range may define adequate limits for the verification problem when considered from the viewpoint of an equivalence test.
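The abstract does not give the formula for PVL, so the following is only a minimal sketch of the idea, assuming PVL is the relative distance, in percent, between the log-likelihood of the simulation replications and that of the conceptual model (all names and the Gaussian likelihood are illustrative):

    import math

    def log_likelihood(data, mean, sd):
        # Gaussian log-likelihood, standing in for the model's likelihood.
        return sum(-0.5 * math.log(2 * math.pi * sd**2)
                   - (x - mean)**2 / (2 * sd**2) for x in data)

    def pvl(replications, conceptual, mean, sd):
        # Hypothetical PVL: relative distance between the two likelihoods (%).
        l_sim = log_likelihood(replications, mean, sd)
        l_con = log_likelihood(conceptual, mean, sd)
        return 100 * abs(l_sim - l_con) / abs(l_con)

    # Verification would pass if PVL falls inside the suggested 0-20 % band.
    print(pvl([9.9, 10.1, 10.2], [10.0, 10.2, 9.9], mean=10.0, sd=0.5) <= 20)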
Abstract:
We have designed and validated a novel generic platform for production of tetravalent IgG1-like chimeric bispecific Abs. The VH-CH1-hinge domains of mAb2 are fused through a peptidic linker to the N terminus of the mAb1 H chain, and paired mutations at the CH1-CL interface of mAb1 are introduced that force the correct pairing of the two different free L chains. Two different sets of these CH1-CL interface mutations, called CR3 and MUT4, were designed and tested, and prototypic bispecific Abs directed against CD5 and HLA-DR were produced (CD5xDR). Two different hinge sequences between mAb1 and mAb2 were also tested in the CD5xDR-CR3 or -MUT4 background, leading to bispecific Abs (BsAbs) with a more rigid or more flexible structure. All four Abs produced bound with good specificity and affinity to CD5 and HLA-DR present either on the same target or on different cells. Indeed, the BsAbs were able to efficiently redirect killing of HLA-DR(+) leukemic cells by human CD5(+) cytokine-induced killer T cells. Finally, all BsAbs had a functional Fc, as shown by their capacity to activate human complement and NK cells and to mediate phagocytosis. CD5xDR-CR3 was chosen as the best format because it had the highest overall functional activity and was very stable in vitro in both neutral buffer and serum. In vivo, CD5xDR-CR3 was shown to have significant therapeutic activity in a xenograft model of human leukemia.
Abstract:
Background: Pulseless electrical activity (PEA) cardiac arrest is defined as a cardiac arrest (CA) presenting with residual organized electrical activity on the electrocardiogram. In the last decades, the incidence of PEA has steadily increased compared to other types of CA such as ventricular fibrillation or pulseless ventricular tachycardia. PEA is frequently induced by reversible conditions. The "4 (or 5) H's" & "4 (or 5) T's" are proposed as a mnemonic to assess for Hypoxia, Hypovolemia, Hypo-/Hyperkalaemia, Hypothermia, Thrombosis (cardiac or pulmonary), cardiac Tamponade, Toxins, and Tension pneumothorax. Other pathologies (intracranial haemorrhage, severe sepsis, myocardial contraction dysfunction) have been identified as potential causes of PEA, but their respective probability and frequency are unclear and they are not yet included in the resuscitation guidelines. The aim of this study was to analyse the aetiologies of out-of-hospital PEA CA in order to evaluate the relative frequency of each cause and thereby improve the management of patients suffering a PEA cardiac arrest. Method: This retrospective study was based on data routinely and prospectively collected for each PEMS intervention. All adult patients treated from January 1st 2002 to December 31st 2012 by the PEMS for out-of-hospital cardiac arrest, with PEA as the first recorded rhythm, and admitted to the emergency department (ED) of the Lausanne University Hospital were included. The aetiologies of PEA cardiac arrest were classified into subgroups based on the classical H's & T's classification, supplemented by four other subgroups: trauma, intra-cranial haemorrhage (ICH), non-ischemic cardiomyopathy (NIC) and undetermined cause. Results: 1866 OHCA were treated by the PEMS. PEA was the first recorded rhythm in 240 adult patients (13.8%). After exclusion of 96 patients, 144 patients with a PEA cardiac arrest admitted to the ED were included in the analysis. The mean age was 63.8 ± 20.0 years, 58.3% were men, and the survival rate at 48 hours was 29%. 32 different causes of OHCA PEA were established for 119 patients. For 25 patients (17.4%), we were unable to attribute a specific cause to the PEA cardiac arrest. Hypoxia (23.6%), acute coronary syndrome (12.5%) and trauma (12.5%) were the three most frequent causes. Pulmonary embolism, hypovolemia, intoxication and hyperkalaemia each occurred in less than 10% of the cases (7.6%, 5.6%, 3.5% and 2.1%, respectively). Non-ischemic cardiomyopathy and intra-cranial haemorrhage occurred in 8.3% and 6.9%, respectively. Conclusions: According to our results, intra-cranial haemorrhage and non-ischemic cardiomyopathy represent noticeable causes of PEA in OHCA, with a prevalence equalling or exceeding the frequency of the classical 4 H's and 4 T's aetiologies. These two pathologies are potentially accessible to simple diagnostic procedures (native CT scan or echocardiography) and should be included in the 4 H's and 4 T's mnemonic.
Abstract:
Network virtualisation is considerably gaining attention as a solution to ossification of the Internet. However, the success of network virtualisation will depend in part on how efficiently the virtual networks utilise substrate network resources. In this paper, we propose a machine learning-based approach to virtual network resource management. We propose to model the substrate network as a decentralised system and introduce a learning algorithm in each substrate node and substrate link, providing self-organization capabilities. We propose a multiagent learning algorithm that carries out the substrate network resource management in a coordinated and decentralised way. The task of these agents is to use evaluative feedback to learn an optimal policy so as to dynamically allocate network resources to virtual nodes and links. The agents ensure that while the virtual networks have the resources they need at any given time, only the required resources are reserved for this purpose. Simulations show that our dynamic approach significantly improves the virtual network acceptance ratio and the maximum number of accepted virtual network requests at any time while ensuring that virtual network quality of service requirements such as packet drop rate and virtual link delay are not affected.
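As a toy illustration of the evaluative-feedback learning described above, the sketch below gives a single substrate-node agent that learns how much capacity to reserve; the actions and reward are invented for the example and are far simpler than the paper's coordinated multiagent setting:

    import random

    ACTIONS = [0.25, 0.5, 0.75, 1.0]   # fraction of node capacity to reserve
    q = {a: 0.0 for a in ACTIONS}      # the agent's action-value estimates
    alpha, epsilon = 0.1, 0.2          # learning rate, exploration rate

    def reward(reserved, demand):
        # Penalize both unmet virtual-node demand and idle reserved capacity.
        return -abs(reserved - demand)

    for step in range(10000):
        demand = random.uniform(0.0, 1.0)           # virtual network load now
        if random.random() < epsilon:
            action = random.choice(ACTIONS)         # explore
        else:
            action = max(q, key=q.get)              # exploit
        q[action] += alpha * (reward(action, demand) - q[action])

    print(max(q, key=q.get))   # reservation level the agent converged to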
Abstract:
In many industrial applications, accurate and fast surface reconstruction is essential for quality control. Variation in surface finishing parameters, such as surface roughness, can reflect defects in a manufacturing process, non-optimal product operational efficiency, and reduced life expectancy of the product. This thesis considers reconstruction and analysis of high-frequency variation, that is, roughness, on planar surfaces. Standard roughness measures in industry are calculated from surface topography. A fast and non-contact method to obtain surface topography is to apply photometric stereo in the estimation of surface gradients and to reconstruct the surface by integrating the gradient fields. Alternatively, visual methods, such as statistical measures, fractal dimension and distance transforms, can be used to characterize surface roughness directly from gray-scale images. In this thesis, the accuracy of distance transforms, statistical measures, and fractal dimension is evaluated in the estimation of surface roughness from gray-scale images and topographies. The results are contrasted with standard industry roughness measures. In distance transforms, the key idea is that distance values calculated along a highly varying surface are greater than distances calculated along a smoother surface. Statistical measures and fractal dimension are common surface roughness measures. In the experiments, skewness and variance of the brightness distribution, fractal dimension, and distance transforms exhibited strong linear correlations with standard industry roughness measures. One of the key strengths of the photometric stereo method is the acquisition of higher-frequency variation of surfaces. In this thesis, the reconstruction of planar high-frequency varying surfaces is studied in the presence of imaging noise and blur. Two Wiener filter-based methods are proposed, of which one is optimal in the sense of surface power spectral density given the spectral properties of the imaging noise and blur. Experiments show that the proposed methods preserve the inherent high-frequency variation in the reconstructed surfaces, whereas traditional reconstruction methods typically handle incorrect measurements by smoothing, which dampens the high-frequency variation.
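A common way to integrate the photometric-stereo gradient fields is in the frequency domain. The sketch below is plain Frankot-Chellappa-style integration; the thesis' Wiener filter-based variants additionally weight the spectrum by the noise and blur statistics, which is omitted here:

    import numpy as np

    def integrate_gradients(p, q):
        # Reconstruct a surface z from its gradient fields p = dz/dx, q = dz/dy.
        h, w = p.shape
        u, v = np.meshgrid(np.fft.fftfreq(w) * 2 * np.pi,
                           np.fft.fftfreq(h) * 2 * np.pi)
        P, Q = np.fft.fft2(p), np.fft.fft2(q)
        denom = u**2 + v**2
        denom[0, 0] = 1.0                        # avoid division by zero at DC
        Z = (-1j * u * P - 1j * v * Q) / denom
        Z[0, 0] = 0.0                            # mean height is unrecoverable
        return np.real(np.fft.ifft2(Z))

    # Synthetic check: gradients of z = sin(x) reconstruct the sinusoid.
    x = np.linspace(0, 4 * np.pi, 128)
    z = np.tile(np.sin(x), (128, 1))
    z_hat = integrate_gradients(np.gradient(z, axis=1), np.gradient(z, axis=0))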
Abstract:
Particulate nanostructures are increasingly used for analytical purposes. Such particles are often generated by chemical synthesis from non-renewable raw materials. Generation of uniform nanoscale particles is challenging, and particle surfaces must be modified to make the particles biocompatible and water-soluble. Usually nanoparticles are functionalized with binding molecules (e.g., antibodies or their fragments) and a label substance (if needed). Overall, producing nanoparticles for use in bioaffinity assays is a multistep process requiring several manufacturing and purification steps. This study describes a biological method of generating functionalized protein-based nanoparticles with specific binding activity on the particle surface and label activity inside the particles. Traditional chemical bioconjugation of the particle and specific binding molecules is replaced with genetic fusion of the binding molecule gene and the particle backbone gene. The entity of the particle shell and binding moieties is synthesized from generic raw materials by bacteria, and fermentation is combined with a simple purification method based on inclusion bodies. The label activity is introduced during the purification. The process results in particles that are ready to use as reagents in bioaffinity assays. Apoferritin was used as the particle body, and the system was demonstrated using three different binding moieties: a small protein, a peptide, and a single-chain Fv antibody fragment that represents a complex protein including a disulfide bridge. If needed, Eu3+ was used as the label substance. The results showed that the production system yielded pure protein preparations, and the particles were of homogeneous size when visualized with transmission electron microscopy. Passively introduced label was stably associated with the particles, and binding molecules genetically fused to the particle specifically bound their target molecules. The functionality of the particles in bioaffinity assays was successfully demonstrated with two types of assays: as labels and in a particle-enhanced agglutination assay. This biological production procedure has many advantages that make the process especially suited for applications with frequent and recurring requirements for homogeneous functional particles. The production process of ready, functional and water-soluble particles follows the principles of "green chemistry" and is upscalable, fast and cost-effective.
Abstract:
An evaluation of the performance of a continuous-flow hydride generator-nebulizer for flame atomic absorption spectrometry was carried out. The effects of nebulizer gas flow rate, sample acid concentration, sample and tetrahydroborate uptake rates, and reductant concentration on the As and Se absorbance signals were optimized. A hydrogen-argon flame was used. An improvement in analytical sensitivity relative to the conventional bead nebulizer used in flame AA was obtained (2 (As) and 4.8 (Se) µg L-1). Detection limits (3σb) of 1 (As) and 1.3 (Se) µg L-1 were obtained. The accuracy of the method was checked by analyzing an oyster tissue reference material.
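For reference, the 3σb convention used for these detection limits follows the standard IUPAC form, with s_b the standard deviation of the blank signal and S the calibration sensitivity (slope):

    \[ c_L = \frac{3\,s_b}{S} \]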
Abstract:
Depression is a major cause of disability and disease, with significant costs to the health system and to society as a whole. Regarding treatment, the effectiveness of antidepressant drugs has been questioned in recent years: it is now recognized that although depressive disorders tend to improve with these treatments, residual symptoms still seem to be the norm, and they are associated with the risk of new episodes or relapses and with their faster appearance. For this reason, many specialized clinical guidelines propose an intervention based on a stepped-care model, prioritizing less intrusive actions, including low-intensity psychosocial interventions.
Abstract:
In the theoretical part, the different polymerisation catalysts are introduced and the phenomena related to mixing in a stirred tank reactor are presented. The advantages and challenges related to scale-up are also discussed. The aim of the applied part was to design and implement an intermediate-sized reactor suitable for scale-up studies. The reactor setup was tested by preparing one batch of Ziegler–Natta polypropylene catalyst. The catalyst preparation with the designed equipment setup succeeded, and the catalyst was analysed so that its properties could be compared with the normal properties of a Ziegler–Natta polypropylene catalyst. The total titanium content of the catalyst was slightly higher than in a normal Ziegler–Natta polypropylene catalyst, but the magnesium and aluminium contents were at the normal level. By adjusting the siphonation tube and adding one washing step, the titanium content of the catalyst could be decreased. The particle size of the catalyst was small, but the activity was in the normal range. The size of the catalyst particles could be increased by decreasing the stirring speed. During the test run it was noticed that some improvements to the designed equipment setup could be made. For example, more valves need to be added to the chemical feed line to ensure inert conditions during catalyst preparation. The nitrogen line for the reactor also needs to be separated from the other nitrogen lines; with this change, the pressure in the reactor can be kept as desired during catalyst preparation. The proposals for improvements are presented in the applied part. After these improvements are made, the equipment setup is ready for start-up. A computational fluid dynamics model of the designed reactor was provided in cooperation with Lappeenranta University of Technology. The experiments showed that for adequate mixing with one impeller, a stirring speed of 600 rpm is needed. The computational fluid dynamics model with two impellers showed no difference in mixing efficiency whether the upper impeller was pumping downwards or upwards.
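A quantity routinely used when scaling up stirred-tank mixing of this kind is the impeller Reynolds number (a standard definition, not taken from the thesis), with \rho the fluid density, N the stirring speed in revolutions per second, D the impeller diameter and \mu the dynamic viscosity:

    \[ \mathrm{Re} = \frac{\rho N D^2}{\mu} \]

Flow in the vessel is typically considered fully turbulent for Re above roughly 10^4.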
Abstract:
IT outsourcing refers to the way companies focus on their core competencies and buy supporting functions from other companies specialized in that area. A service is the total outcome of numerous activities by employees and other resources aimed at providing solutions to customers' problems. Outsourcing and the service business each have their own unique characteristics. Service Level Agreements quantify the minimum acceptable service to the user. The service quality has to be objectively quantified so that its achievement or non-achievement can be monitored. Offshoring usually refers to the transfer of tasks to low-cost countries. Offshoring presents many challenges that require special attention and need to be assessed thoroughly. IT infrastructure management refers to installation and basic usability assistance for operating systems, network and server tools, and utilities. ITIL defines the industry best practices for organizing IT processes. This thesis analysed a server operations service and the customers' perception of the quality of daily operations. The agreed workflows and processes should be followed more closely: the service provider's processes are thoroughly defined, but both the customer and the service provider may disobey them. The service provider should review the workflows concerning customer-facing functions. Customer-facing functions require persistent skill development, as they communicate the quality to the customer. The service provider also needs better organized communication and knowledge-exchange methods between the specialists in different geographical locations.
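As a small illustration of quantifying service quality against an SLA (the metric and the eight-hour target below are invented for the example), monitoring can be as simple as computing the share of tickets resolved within the agreed time:

    from dataclasses import dataclass

    @dataclass
    class Ticket:
        resolution_hours: float

    def sla_attainment(tickets, target_hours=8.0):
        # Share of tickets resolved within the agreed resolution time.
        met = sum(t.resolution_hours <= target_hours for t in tickets)
        return met / len(tickets)

    tickets = [Ticket(2.5), Ticket(9.0), Ticket(7.9)]
    print(f"{sla_attainment(tickets):.0%} within SLA")   # -> 67% within SLA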
Abstract:
The development of software tools began when the first computers were built. The current generation of development environments offers a common interface for accessing multiple software tools and often also provides the possibility to build custom tools as extensions to the existing development environment. Eclipse is an open-source development environment that offers a good starting point for developing custom extensions. This thesis presents a software tool to aid the development of context-aware applications on the Multi-User Publishing Environment (MUPE) platform. The tool is implemented as an Eclipse plug-in. It allows developers to include external server-side contexts in their MUPE applications, and additional context sources can be added through Eclipse's extension point mechanism. The thesis describes how the tool was designed and implemented. The implementation consists of a tool core component and an additional context source extension. The tool core component is responsible for the actual context addition and also provides the needed user interface elements to the Eclipse workbench. The context source component provides the needed context-source-related information to the core component. As part of the work, an update site feature was also implemented for distributing the tool through the Eclipse update mechanism.
Abstract:
Parameters such as tolerance, scale, and agility used in data sampling for Precision Agriculture have required an extensive body of research and the development of techniques and instruments for automation. Noteworthy is the employment of remote sensing methodologies coupled to a Geographic Information System (GIS), adapted or developed for agricultural use. To this end, the application of Agricultural Mobile Robots is a strong tendency, mainly in the European Union, the USA and Japan. In Brazil, research is needed on the development of robotic platforms to serve as a basis for semi-autonomous and autonomous navigation systems. The aim of this work is to describe the design of an experimental platform for in-field data acquisition, for the study of spatial variability and the development of agricultural robotics technologies for operating in agricultural environments. The proposal is based on a systematization of scientific work to choose the design parameters used in the construction of the model. The kinematic study of the mechanical structure was carried out through virtual prototyping, based on modelling and simulation of the stresses applied to the frame, using the.
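As an aside on the kinematic side of such platforms, the forward kinematics of a differential-drive base (a common layout for agricultural mobile robots; the track width and speeds below are illustrative, not the platform's actual parameters) can be sketched as:

    import math

    def step(x, y, theta, v_left, v_right, track=1.2, dt=0.1):
        # One Euler step of differential-drive forward kinematics.
        v = (v_left + v_right) / 2.0           # linear velocity (m/s)
        omega = (v_right - v_left) / track     # angular velocity (rad/s)
        return (x + v * math.cos(theta) * dt,
                y + v * math.sin(theta) * dt,
                theta + omega * dt)

    pose = (0.0, 0.0, 0.0)
    for _ in range(100):                       # gentle left turn for 10 s
        pose = step(*pose, v_left=0.9, v_right=1.0)
    print(pose)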