10 results for design automation of robots

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Telomeres have emerged as crucial cellular elements in aging and various diseases including cancer. To measure the average length of telomere repeats in cells, we describe our protocols that use fluorescence in situ hybridization (FISH) with labeled peptide nucleic acid (PNA) probes specific for telomere repeats, in combination with fluorescence measurements by flow cytometry (flow FISH). Flow FISH analysis can be performed on commercially available flow cytometers and has the unique advantage over other telomere length measurement methods of providing multi-parameter information on the length of telomere repeats in thousands of individual cells. The accuracy and reproducibility of the measurements are improved by the automation of most pipetting (aspiration and dispensing) steps and by including an internal standard (control cells) with a known telomere length in every tube. The basic protocol for the analysis of nucleated blood cells from 22 different individuals takes about 12 h spread over 2-3 days.

Relevance:

100.00%

Publisher:

Abstract:

In this paper we follow a theory-based approach to study the assimilation of compliance software in highly regulated multinational enterprises. These relatively new software products support the automation of controls associated with mandatory compliance requirements. We use institutional and success factor theories to explain the assimilation of compliance software. A framework for analyzing the assimilation of Access Control Systems (ACS), a special type of compliance software, is developed and used to reflect the experiences obtained in four in-depth case studies. One result is that coercive, mimetic, and normative pressures significantly affect ACS assimilation. Quality aspects, on the other hand, have only a moderate impact at the beginning of the assimilation process; in later phases their impact may increase as performance and improvement objectives become more relevant. In addition, it turns out that the position of the enterprises and compatibility heavily influence the assimilation process.

Relevance:

100.00%

Publisher:

Abstract:

Two-dimensional (2D) crystallisation of membrane proteins reconstitutes them into their native environment, the lipid bilayer. Electron crystallography allows the structural analysis of these regular protein–lipid arrays up to atomic resolution. The crystal quality depends on the protein's purity, its stability, and on the crystallisation conditions. The basics of 2D crystallisation and various recent advances are reviewed, and electron crystallography approaches are summarised. Progress in 2D crystallisation, sample preparation, image detectors, and automation of the data acquisition and processing pipeline makes 2D electron crystallography particularly attractive for the structural analysis of membrane proteins that are too small for single-particle analysis and too unstable to form three-dimensional (3D) crystals.

Relevance:

100.00%

Publisher:

Abstract:

In 2010, more than 600 radiocarbon samples were measured with the gas ion source at the MIni CArbon DAting System (MICADAS) at ETH Zurich, and the number of measurements is rising quickly. While most samples currently contain less than 50 μg C, the gas ion source is also attractive for larger samples because the time-consuming graphitization step is omitted. Additionally, modern samples can now be measured down to 5 per mil counting statistics in less than 30 min with the recently improved gas ion source. In the versatile gas handling system, a stepping-motor-driven syringe presses a mixture of helium and sample CO2 into the gas ion source, allowing continuous and stable measurements of different kinds of samples. CO2 can be provided to the versatile gas interface in four different ways. As a primary method, CO2 is delivered in glass or quartz ampoules; in this case, the CO2 is released in an automated ampoule cracker with 8 positions for individual samples. Secondly, OX-1 and blank gas in helium can be provided to the syringe by connecting gas bottles directly to the gas interface at the stage of the cracker. Thirdly, solid samples can be combusted in an elemental analyzer or in a thermo-optical OC/EC aerosol analyzer, with the produced CO2 transferred to the syringe via a zeolite trap for gas concentration. As a fourth method, CO2 is released from carbonates with phosphoric acid in septum-sealed vials and loaded onto the same trap used for the elemental analyzer. All four methods allow complete automation of the measurement, even though minor user input is presently still required. Details on the setup, versatility, and applications of the gas handling system are given.

Relevance:

100.00%

Publisher:

Abstract:

Cloud computing services have emerged as an essential component of the enterprise IT infrastructure. Migration towards a full-range, large-scale convergence of Cloud and network services has become the current trend for addressing the requirements of the Cloud environment. Our approach takes the infrastructure-as-a-service paradigm to build converged virtual infrastructures, which allow offering tailored performance and enable multi-tenancy over a common physical infrastructure. Thanks to virtualization, new exploitation activities of the physical infrastructures may arise for both transport network and Data Centre services. This approach makes the network and Data Centre resources dedicated to Cloud computing converge on the same flexible and scalable level. The work presented here is based on the automation of the virtual infrastructure provisioning service. On top of the virtual infrastructures, a coordinated operation and control of the different resources is performed with the objective of automatically tailoring connectivity services to the Cloud service dynamics. Furthermore, in order to support elasticity of the Cloud services through the optical network, dynamic re-planning features have been added to the virtual infrastructure service, allowing existing virtual infrastructures to be scaled up or down to optimize resource utilisation and dynamically adapt to users' demands. The dynamic re-planning of the service thus becomes a key component for coordinating Cloud and optical network resources optimally in terms of resource utilisation. The presented work is complemented with a use case of the virtual infrastructure service being adopted in a distributed Enterprise Information System that scales up and down as a function of the application requests.

Relevance:

100.00%

Publisher:

Abstract:

A CE system featuring an array of 16 contactless conductivity detectors was constructed. The detectors are arranged along 70 cm of a capillary with a total length of 100 cm and allow the monitoring of separation processes. As the detectors cannot be accommodated on a conventional commercial instrument, a purpose-built set-up employing a sequential injection manifold had to be used to automate the fluid handling. Conductivity measurements can be considered universal for electrophoresis, and thus any changes in ionic composition can be monitored. The progress of the separation of Na(+) and K(+) is demonstrated. The potential of the system for the study of processes in CZE is shown with two examples. The first demonstrates the differences in the development of peaks originating from a sample plug with a purely aqueous background compared to a plug containing the analyte ions in the buffer. The second example visualizes the opposite migration of cations and anions from a sample plug placed in the middle of the capillary.

Relevance:

100.00%

Publisher:

Abstract:

Organizing and archiving statistical results and processing a subset of those results for publication are important and often underestimated aspects of conducting statistical analyses. Because automation of these tasks is often poor, processing results produced by statistical packages is quite laborious and vulnerable to error. I will therefore present a new package called estout that facilitates and automates some of these tasks. This new command can be used to produce regression tables for use with spreadsheets, LaTeX, HTML, or word processors. For example, the results for multiple models can be organized in spreadsheets and can thus be archived in an orderly manner. Alternatively, the results can be saved directly as a publication-ready table for inclusion in, for example, a LaTeX document. estout is implemented as a wrapper for estimates table but has many additional features, such as support for mfx. However, despite its flexibility, estout is, I believe, still very straightforward and easy to use. Furthermore, estout can be customized via so-called defaults files. A companion tool called estadd, which makes supplementary statistics available, is also provided.
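
A minimal sketch of a typical estout workflow (the dataset, variable names, and output file here are illustrative assumptions, not taken from the abstract):

    * Illustrative sketch only: dataset, variables, and file name are assumptions
    sysuse auto, clear

    * Fit two regression models and store their results
    quietly regress price weight mpg
    estimates store m1
    quietly regress price weight mpg foreign
    estimates store m2

    * Write a LaTeX table: coefficients with significance stars,
    * standard errors in parentheses, plus N and R-squared
    estout m1 m2 using mytable.tex, style(tex) ///
        cells(b(star fmt(3)) se(par fmt(2))) ///
        stats(N r2) replace

The same call with style(tab) or style(html) would target spreadsheets or web pages instead of LaTeX.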

Relevance:

100.00%

Publisher:

Abstract:

Postestimation processing and formatting of regression estimates for input into document tables are tasks that many of us have to perform. Processing results by hand, however, is laborious and vulnerable to error. There are therefore many benefits to automating these tasks while retaining user flexibility in the output format. The estout package meets these needs. estout assembles a table of coefficients, "significance stars", summary statistics, standard errors, t/z statistics, p-values, confidence intervals, and other statistics calculated for up to twenty models previously fitted and stored by estimates store. It then writes the table to the Stata log and/or to a text file. The estimates can optionally be formatted in several styles: HTML, LaTeX, or tab-delimited (for input into MS Excel or Word). A large number of options control which output is formatted and how. This talk will take users through a range of examples, from relatively simple applications to complex ones.
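
As a hedged sketch of the kind of basic application such a talk might open with (again, the dataset and file name are illustrative assumptions):

    * Illustrative sketch only: dataset and file name are assumptions
    sysuse auto, clear
    quietly regress price weight mpg

    * estadd attaches extra statistics to the current estimates; ysumm,
    * for example, adds the mean and s.d. of the dependent variable
    estadd ysumm
    estimates store m1

    * Export a tab-delimited table with t statistics in parentheses,
    * ready for input into MS Excel or Word
    estout m1 using results.txt, style(tab) ///
        cells(b(star fmt(3)) t(par fmt(2))) ///
        stats(N r2 ymean) replace

Because estout simply reads the stored e() results, statistics added with estadd become available to stats() just like the built-in ones.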

Relevance:

40.00%

Publisher:

Abstract:

Localization is of fundamental importance for carrying out various tasks in mobile robotics. The exact degree of precision required depends on the nature of the task. GPS provides global position estimation but is restricted to outdoor environments and has an inherent imprecision of a few meters. In indoor spaces, other sensors such as lasers and cameras are commonly used for position estimation, but these require landmarks (or maps) in the environment and a fair amount of computation to run complex algorithms. These sensors also have a limited field of view. Wireless Networks (WN) are now widely available in indoor environments and can allow efficient global localization with relatively low computing resources. However, the inherent instability of the wireless signal prevents it from being used for very accurate position estimation. Growth in the number of Access Points (AP) increases the areas of signal overlap, which could be a useful means of improving localization precision. In this paper we evaluate the impact of the number of Access Points on mobile node localization using Artificial Neural Networks (ANN). We use three to eight APs as signal sources and show how the ANNs learn and generalize from the data. In addition, we evaluate the robustness of the ANNs and a heuristic that attempts to decrease the localization error. To validate our approach, several ANN topologies were evaluated in experimental tests conducted with a mobile node in an indoor space.

Relevance:

40.00%

Publisher:

Abstract:

Early intervention and intensive therapy improve the outcome of neuromuscular rehabilitation. There are indications that recovery is more effective when the patient is motivated and plans their movements deliberately. A strategy for patient-cooperative control of rehabilitation devices for the upper extremities is therefore proposed and evaluated. The strategy is based on the minimal intervention principle, allowing efficient exploitation of task-space redundancies and resulting in user-driven movement trajectories. The patient's effort is taken into consideration by enabling the machine to comply with forces exerted by the user. The interaction is enhanced through a multimodal display and a virtually generated environment that includes haptic, visual, and sound modalities.