16 results for Process control -- Data processing

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Methodological evaluation of the proteomic analysis of cardiovascular-tissue material has been performed, with special emphasis on establishing examinations that allow reliable quantitative analysis of silver-stained readouts. Reliability, reproducibility, robustness and linearity were addressed and clarified. In addition, several types of normalization procedures were evaluated and new approaches are proposed. The silver-stained readout was found to offer a convenient approach for quantitation if a linear range for gel loading is defined. A broad, 10-fold input range (loading 20-200 µg per gel) fulfills the linearity criteria, although at the lowest input (20 µg) a portion of protein species remains undetected. The method is reliable and reproducible within a range of 65-200 µg input. Normalization using the sum of all spot intensities from a silver-stained 2D pattern proved less reliable than other approaches, namely normalization by the median or by the interquartile range. A refinement of the normalization through virtual segmentation of the pattern, with a normalization factor calculated for each stratum, provides highly satisfactory results. The presented results not only provide evidence for the usefulness of silver-stained gels for quantitative evaluation, but are also directly applicable to the research endeavor of monitoring alterations in cardiovascular pathophysiology.
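The median- and interquartile-range-based normalizations discussed above can be sketched as follows. This is an illustrative reconstruction with NumPy, not the authors' implementation; the choice of a reference pattern and the function names are assumptions.

```python
import numpy as np

def normalize_by_median(intensities, reference):
    """Rescale spot intensities so their median matches that of a
    reference pattern (illustrative; the reference choice is assumed)."""
    return intensities * (np.median(reference) / np.median(intensities))

def normalize_by_iqr(intensities, reference):
    """Rescale so the interquartile range matches the reference's."""
    q25, q75 = np.percentile(intensities, [25, 75])
    r25, r75 = np.percentile(reference, [25, 75])
    return intensities * ((r75 - r25) / (q75 - q25))
```

The stratified refinement mentioned above would apply such a factor separately within each virtual segment of the 2D pattern rather than once for the whole gel.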

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Short-acting agents for neuromuscular block (NMB) require frequent dosing adjustments to individual patients' needs. In this study, we verified a new closed-loop controller for mivacurium dosing in clinical trials. METHODS: Fifteen patients were studied. T1% measured with electromyography was used as the input signal for the model-based controller. After induction of propofol/opiate anaesthesia, stabilization of the baseline electromyography signal was awaited, and a bolus of 0.3 mg kg-1 mivacurium was then administered to facilitate endotracheal intubation. Closed-loop infusion was started thereafter, targeting a neuromuscular block of 90%. Setpoint deviation, the number of manual interventions and surgeons' complaints were recorded. Drug use and its variability between and within patients were evaluated. RESULTS: Median time of closed-loop control for the 11 patients included in the data processing was 135 [89-336] min (median [range]). Four patients had to be excluded because of sensor problems. Mean absolute deviation from the setpoint was 1.8 +/- 0.9 T1%. Neither manual interventions nor complaints from the surgeons were recorded. The mean required mivacurium infusion rate was 7.0 +/- 2.2 µg kg-1 min-1. Intrapatient variability of mean infusion rates over 30-min intervals showed differences of up to a factor of 1.8 between the highest and lowest requirement in the same patient. CONCLUSIONS: Neuromuscular block can be precisely controlled with mivacurium using our model-based controller. The amount of mivacurium needed to maintain T1% at defined constant levels differed largely between and within patients. Closed-loop control therefore seems advantageous for automatically maintaining neuromuscular block at constant levels.
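The closed-loop idea can be illustrated with a toy simulation. The sketch below uses a simple PI loop and an assumed first-order drug-effect model, not the study's model-based controller; all gains, time constants and limits are invented for illustration.

```python
def simulate_pi_control(target=90.0, steps=300, dt=1.0,
                        kp=0.4, ki=0.05, u_max=20.0):
    """Illustrative PI dosing loop on a toy first-order effect model.
    `block` is the neuromuscular block in %, `u` the infusion rate.
    All parameters are assumptions, not values from the paper."""
    block, integral = 0.0, 0.0
    trace = []
    for _ in range(int(steps)):
        error = target - block
        u_raw = kp * error + ki * integral
        u = min(max(u_raw, 0.0), u_max)   # clamp infusion rate
        if u == u_raw:                    # crude anti-windup: integrate
            integral += error * dt        # only when the rate is unclamped
        # toy plant: block relaxes toward 5*u with a 10-step time constant
        block += dt / 10.0 * (min(100.0, 5.0 * u) - block)
        trace.append(block)
    return trace
```

After the initial saturated phase, the integral term settles at the infusion rate needed to hold the block at the setpoint, which is the behavior a clinical controller must reproduce despite inter-patient variability.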

Relevance:

100.00%

Publisher:

Abstract:

The problem of re-sampling spatially distributed data organized into regular or irregular grids to finer or coarser resolution is a common task in data processing. This procedure is known as 'gridding' or 're-binning'. Depending on the quantity the data represents, the gridding algorithm has to meet different requirements. For example, histogrammed physical quantities such as mass or energy have to be re-binned in order to conserve the overall integral. Moreover, if the quantity is positive definite, negative sampling values should be avoided. The gridding process requires a re-distribution of the original data set to a user-requested grid according to a distribution function. The distribution function can be determined on the basis of the given data by interpolation methods. In general, accurate interpolation with respect to multiple boundary conditions of heavily fluctuating data requires polynomial interpolation functions of second or even higher order. However, this may result in unrealistic deviations (overshoots or undershoots) of the interpolation function from the data. Accordingly, the re-sampled data may overestimate or underestimate the given data by a significant amount. The gridding algorithm presented in this work was developed to overcome these problems. Instead of a straightforward interpolation of the given data using high-order polynomials, a parametrized Hermitian interpolation curve was used to approximate the integrated data set. A single parameter allows the user to control the behavior of the interpolation function, i.e. the amount of overshoot and undershoot. Furthermore, it is shown how the algorithm can be extended to multidimensional grids. The algorithm was compared to commonly used gridding algorithms using linear and cubic interpolation functions.
It is shown that such interpolation functions may overestimate or underestimate the source data by about 10-20%, while the new algorithm can be tuned to significantly reduce these interpolation errors. The accuracy of the new algorithm was tested on a series of x-ray CT-images (head and neck, lung, pelvis). The new algorithm significantly improves the accuracy of the sampled images in terms of the mean square error and a quality index introduced by Wang and Bovik (2002 IEEE Signal Process. Lett. 9 81-4).
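The conservative ("integral-preserving") re-binning described above can be sketched by interpolating the cumulative integral of the data and differencing it at the new bin edges. The sketch below uses plain linear interpolation of the cumulative sum for brevity, where the paper uses a parametrized Hermitian curve; function and variable names are illustrative.

```python
import numpy as np

def rebin_conservative(edges_in, values, edges_out):
    """Re-sample a histogrammed quantity onto new bin edges while
    conserving the overall integral. `values` are bin densities
    defined on `edges_in` (illustrative sketch, not the paper's code)."""
    widths = np.diff(edges_in)
    # cumulative integral evaluated at the original bin edges
    cumulative = np.concatenate(([0.0], np.cumsum(values * widths)))
    # interpolate the monotone cumulative curve at the new edges;
    # the paper would use a parametrized Hermite interpolant here
    cum_new = np.interp(edges_out, edges_in, cumulative)
    return np.diff(cum_new) / np.diff(edges_out)
```

Because the cumulative integral is monotone for positive-definite data, a monotonicity-preserving Hermite interpolant of this curve cannot produce negative re-binned values, which is exactly the overshoot and undershoot control the single parameter provides.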

Relevance:

100.00%

Publisher:

Abstract:

The article proposes granular computing as a theoretical, formal and methodological basis for the newly emerging research field of human–data interaction (HDI). We argue that the ability to represent and reason with information granules is a prerequisite for data legibility. As such, it allows the research agenda of HDI to be extended to encompass the topic of collective intelligence amplification, which is seen as an opportunity afforded by today's increasingly pervasive computing environments. As an example of collective intelligence amplification in HDI, we introduce a collaborative urban-planning use case in a cognitive city environment and show how an iterative process of user input and human-oriented automated data processing can support collective decision making. As a basis for automated human-oriented data processing, we use the spatial granular calculus of granular geometry.

Relevance:

100.00%

Publisher:

Abstract:

Navigation of deep-space probes is most commonly operated using the spacecraft Doppler tracking technique. Orbital parameters are determined from a series of repeated measurements of the frequency shift of a microwave carrier over a given integration time. Currently, both ESA and NASA operate antennas at several sites around the world to ensure the tracking of deep-space probes. Only a small number of software packages are currently used to process Doppler observations. The Astronomical Institute of the University of Bern (AIUB) has recently started the development of Doppler data processing capabilities within the Bernese GNSS Software. This software has been extensively used for Precise Orbit Determination of Earth-orbiting satellites using GPS data collected by on-board receivers, and for subsequent determination of the Earth gravity field. In this paper, we present the current status of the Doppler data modeling and orbit determination capabilities in the Bernese GNSS Software using GRAIL data. In particular, we focus on the implemented orbit determination procedure used for the combined analysis of Doppler and intersatellite Ka-band data. We show that even at this early stage of development we can achieve an accuracy of a few mHz on two-way S-band Doppler observations and of 2 µm/s on KBRR data from the GRAIL primary mission phase.
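To give a sense of scale for the quoted few-mHz accuracy, the first-order two-way Doppler relation can be sketched as follows; the 2.2 GHz S-band carrier frequency is an assumed representative value, not taken from the paper.

```python
C = 299_792_458.0  # speed of light in m/s

def two_way_doppler_shift(radial_velocity_mps, carrier_hz=2.2e9):
    """First-order two-way Doppler shift: a probe receding at v_r
    lowers the received frequency by roughly 2*v_r/c * f_carrier."""
    return -2.0 * radial_velocity_mps / C * carrier_hz

def radial_velocity_from_shift(shift_hz, carrier_hz=2.2e9):
    """Invert the relation: the velocity resolution implied by a
    given frequency-shift measurement accuracy."""
    return -shift_hz * C / (2.0 * carrier_hz)
```

At such a carrier, a 1 mHz frequency shift corresponds to roughly 7 × 10^-5 m/s of two-way radial velocity, which illustrates why mHz-level Doppler accuracy matters for orbit determination.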

Relevance:

100.00%

Publisher:

Abstract:

Tick-borne encephalitis (TBE), a viral infection of the central nervous system, is endemic in many Eurasian countries. In Switzerland, TBE risk areas have been characterized by geographic mapping of clinical cases. Since mass vaccination should significantly decrease the number of TBE cases, alternative methods for exposure risk assessment are required. We established a new PCR-based test for the detection of TBE virus (TBEV) in ticks. The protocol involves an automated, high-throughput nucleic acid extraction method (QIAsymphony SP system) and a one-step duplex real-time reverse transcription-PCR (RT-PCR) assay for the detection of European-subtype TBEV, including an internal process control. High usability, reproducibility, and equivalent performance for virus concentrations down to 5 x 10^3 viral genome equivalents/µl favor the automated protocol over the modified guanidinium thiocyanate-phenol-chloroform extraction procedure. The real-time RT-PCR allows fast, sensitive (limit of detection, 10 RNA copies/µl), and specific (no false-positive test results for other TBEV subtypes, other flaviviruses, or other tick-transmitted pathogens) detection of European-subtype TBEV. The new detection method was applied in a national surveillance study, in which 62,343 Ixodes ricinus ticks were screened for the presence of TBE virus. A total of 38 foci of endemicity could be identified, with a mean virus prevalence of 0.46%. The foci do not fully agree with those defined by disease mapping. Therefore, the proposed molecular test procedure constitutes a prerequisite for appropriate TBE surveillance. Our data are a unique complement to human TBE disease case mapping in Switzerland.

Relevance:

100.00%

Publisher:

Abstract:

Background: The release of quality data from acute-care hospitals to the general public aims to inform the public, provide transparency and foster quality-based competition among providers. Owing to the expected mechanisms of action, and possibly adverse consequences, of public quality comparison, it is a controversial topic. The perspective of physicians and nurses is of particular importance in this context: they are mainly responsible for the collection of quality-control data and are directly confronted with the results of public comparison. The focus of this qualitative study was to discover the views and opinions of Swiss physicians and nurses regarding these issues, and to investigate how the two professional groups appraised the opportunities as well as the risks of the release of quality data in Switzerland. Methods: A qualitative approach was chosen to answer the research question. For data collection, four focus groups were conducted with physicians and nurses employed in Swiss acute-care hospitals. Qualitative content analysis was applied to the data. Results: Both occupational groups had a very critical and negative attitude regarding the recent developments; the perceived risks dominated their view. In summary, their main concerns were: the reduction of complexity, the one-sided focus on measurable quality variables, risk selection, the threat of data manipulation and the abuse of published information by the media. An additional concern was that the impression is given that the complex construct of quality can be reduced to a few key figures, conveying a false message which then influences society and politics. This critical attitude is associated with the value system and professional self-concept of physicians and nurses, which contrast with the underlying principles of a market-based economy and the economic orientation of the health-care business. Conclusions: The critical and negative attitude of Swiss physicians and nurses must be heeded and investigated regarding its impact on work motivation and identification with the profession. At the same time, the two professional groups are obligated to reflect upon their critical attitude and take a proactive role in the development of appropriate quality indicators for the publication of quality data in Switzerland.

Relevance:

100.00%

Publisher:

Abstract:

Surgery and other invasive therapies are complex interventions, the assessment of which is challenged by factors that depend on operator, team, and setting, such as learning curves, quality variations, and perception of equipoise. We propose recommendations for the assessment of surgery based on a five-stage description of the surgical development process. We also encourage the widespread use of prospective databases and registries. Reports of new techniques should be registered as a professional duty, anonymously if necessary when outcomes are adverse. Case series studies should be replaced by prospective development studies for early technical modifications and by prospective research databases for later pre-trial evaluation. Protocols for these studies should be registered publicly. Statistical process control techniques can be useful in both early and late assessment. Randomised trials should be used whenever possible to investigate efficacy, but adequate pre-trial data are essential to allow power calculations, clarify the definition and indications of the intervention, and develop quality measures. Difficulties in doing randomised clinical trials should be addressed by measures to evaluate learning curves and alleviate equipoise problems. Alternative prospective designs, such as interrupted time series studies, should be used when randomised trials are not feasible. Established procedures should be monitored with prospective databases to analyse outcome variations and to identify late and rare events. Achievement of improved design, conduct, and reporting of surgical research will need concerted action by editors, funders of health care and research, regulatory bodies, and professional societies.
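As one concrete example of the statistical process control techniques mentioned above, surgical outcomes are often monitored with a Bernoulli CUSUM chart. The sketch below is illustrative: the acceptable rate p0, the unacceptable rate p1 and the signalling threshold are all chosen arbitrarily, not taken from the paper.

```python
import math

def bernoulli_cusum(outcomes, p0=0.05, p1=0.10, threshold=4.5):
    """Bernoulli CUSUM over a sequence of binary outcomes
    (True/1 = adverse event). Signals when the running statistic
    crosses `threshold`; returns (trace, index_of_first_signal).
    p0, p1 and threshold are illustrative assumptions."""
    w_fail = math.log(p1 / p0)                  # weight for an adverse event
    w_ok = math.log((1.0 - p1) / (1.0 - p0))    # (negative) weight otherwise
    s, signal_at, trace = 0.0, None, []
    for i, adverse in enumerate(outcomes):
        s = max(0.0, s + (w_fail if adverse else w_ok))
        trace.append(s)
        if signal_at is None and s >= threshold:
            signal_at = i
    return trace, signal_at
```

The statistic stays near zero while the adverse-event rate is acceptable and climbs steadily once it rises toward p1, which makes such charts useful both for early learning-curve assessment and for late monitoring of established procedures.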

Relevance:

100.00%

Publisher:

Abstract:

In the long run, the widespread use of slide scanners by pathologists requires an adaptation of teaching methods in histology and cytology in order to exploit these new possibilities of image processing and presentation via the internet. Accordingly, we were looking for a tool for teaching the microscopic anatomy, histology, and cytology of tissue samples that can combine image data from light and electron microscopes independently of microscope suppliers. Using the example of a section through a villus of the jejunum, we describe here how to process image data from light and electron microscopes in order to obtain one image stack which allows a correlation of structures from the microscopic-anatomic to the cytological level. With commercially available image-presentation software that we adapted to our needs, we present a platform which allows for the presentation of this new, but also of older, material independently of microscope suppliers.

Relevance:

100.00%

Publisher:

Abstract:

Volunteers are the most important resource for non-profit sport clubs seeking to bolster their viability (e.g. sporting programs). Although many people do voluntary work in sport clubs, stable voluntary engagement can no longer be taken for granted. This difficulty is confirmed by existing research across various European countries. From a club-management point of view, a detailed understanding of how to attract volunteers and retain them in the long term is becoming a high priority. The purpose of this study is (1) to analyse the influence of individual characteristics and corresponding organisational conditions on volunteering in sport clubs, and (2) to examine the decision-making processes involved in implementing effective strategies for recruiting volunteers. For the first perspective, a multi-level framework for investigating the factors of voluntary engagement in sport clubs is developed. The individual and context factors are estimated in different multi-level models based on a sample of n = 1,434 sport club members from 36 sport clubs in Switzerland. Results indicate that volunteering is not just an outcome of individual characteristics such as lower workloads, higher income, children belonging to the sport club, longer club memberships, or a strong commitment to the club. It is also influenced by club-specific structural conditions: volunteering is more probable in rural sport clubs, whereas growth-oriented goals in clubs have a destabilising effect. Concerning decision-making processes, an in-depth analysis of recruitment practices for volunteers was conducted in nine selected sport clubs (case-study design) based on the garbage can model. Results show that the decision-making processes are generally characterised by a reactive approach in which dominant actors try to handle personnel problems of recruitment in the administration and sport domains through routine formal committee work and informal networks. In addition, it proved possible to develop a typology that delivers an overview of different decision-making practices in terms of the specific interplay of the relevant components of process control (top-down vs. bottom-up) and problem processing (situational vs. systematic). Based on the findings, some recommendations for volunteer management in sport clubs are worked out.

Relevance:

100.00%

Publisher:

Abstract:

Effective strategies for recruiting volunteers who are prepared to make a long-term commitment to formal positions are essential for the survival of voluntary sport clubs. This article examines the decision-making processes in relation to these efforts. Under the assumption of bounded rationality, the garbage can model is used to grasp these decision-making processes theoretically and access them empirically. Based on a case-study framework, an in-depth analysis of recruitment practices was conducted in nine selected sport clubs. Results showed that the decision-making processes are generally characterized by a reactive approach in which dominant actors try to handle personnel problems of recruitment in the administration and sport domains through routine formal committee work and informal networks. In addition, it proved possible to develop a typology that delivers an overview of different decision-making practices in terms of the specific interplay of the relevant components of process control (top-down vs. bottom-up) and problem processing (situational vs. systematic).

Relevance:

100.00%

Publisher:

Abstract:

As the amount of space debris in the geostationary ring increases, it becomes mandatory for any satellite operator to avoid collisions. Space debris in geosynchronous orbits may be observed with optical telescopes. Unlike radar, which requires very large dishes and transmission powers for sensing high-altitude objects, optical observations do not depend on active illumination from the ground and may be performed with notably smaller apertures. The detectable size of an object depends on the aperture of the telescope, the sky background and the exposure time. With a telescope of 50 cm aperture, objects down to approximately 50 cm may be observed. This size is regarded as a threshold for the identification of hazardous objects and the prevention of potentially catastrophic collisions in geostationary orbits. In collaboration with the Astronomical Institute of the University of Bern (AIUB), the German Space Operations Center (GSOC) is building a small-aperture telescope to demonstrate the feasibility of optical surveillance of the geostationary ring. The telescope will be located in the southern hemisphere and complement an existing telescope in the northern hemisphere already operated by AIUB. These two telescopes provide optimum coverage of European GEO satellites and enable continuous monitoring independent of seasonal limitations. The telescope will be operated fully automatically; the automated operations will be demonstrated over the full range of activities, including scheduling of observations, telescope and camera control, and data processing.