913 results for Monitoring, SLA, JBoss, Middleware, J2EE, Java, Service Level Agreements
Abstract:
Data from 59 farms with complaints of udder health problems and insufficient quality of delivered milk that had been assessed by the Swiss Bovine Health Service (BHS) between 1999 and 2004 were retrospectively analysed. Data evaluated included farm characteristics such as farm size, herd size, average milk yield, milking system and housing system, deficits of the milking equipment and the milking practices, and bacteriological results of milk samples from all cows in lactation. The average size of the farms assessed by the BHS was larger than the Swiss average. Of the milking installations that were evaluated, 42 showed obvious failures which the farm managers themselves could have noticed. Only 5 of the 57 milkers carried out their work according to the generally valid guidelines of the National Mastitis Council. More than 2 basic mistakes were observed in the milking practices of 36 milkers. In 51 farms, mixed infections with several problem bacteria (those present in at least 20% of the tested cows on a farm) were found. Staphylococcus aureus proved to be the most common problem pathogen. As the sole problem pathogen detectable on a particular farm, and thus the organism responsible for the herd problem, Staphylococcus aureus was detected in 4 farms. The current study revealed that farmers' education in milking technique and milking practices should be improved in order to reduce the incidence of udder health problems at the herd level. Staphylococcus aureus is the most important problem pathogen involved in herds with udder health problems in Switzerland, and it might be used in practice as an indicator organism for early recognition of management problems on dairy farms.
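As a small illustration of the abstract's operational definitions (a problem pathogen is one cultured from at least 20% of the lactating cows on a farm, and a herd-problem pathogen is the sole problem pathogen detectable on that farm), the following hypothetical sketch classifies per-cow culture results; the data layout and names are assumptions, not the BHS dataset.

```python
from collections import Counter

def classify_problem_pathogens(cow_cultures):
    """Hypothetical sketch of the abstract's definitions.

    cow_cultures: one set per lactating cow, containing the pathogens
    cultured from that cow's milk sample.
    Returns (problem_pathogens, herd_problem_pathogen_or_None).
    """
    n_cows = len(cow_cultures)
    counts = Counter(p for cultures in cow_cultures for p in cultures)

    # "Problem" pathogen: present in at least 20% of the tested cows on the farm.
    problem = {p for p, c in counts.items() if c / n_cows >= 0.20}

    # "Herd problem" pathogen: the sole problem pathogen detectable on the farm.
    herd_problem = next(iter(problem)) if len(problem) == 1 else None
    return problem, herd_problem

# Toy example: 10 cows, S. aureus found in 4 of them (40%).
farm = [{"S. aureus"}, {"S. aureus", "Str. uberis"}, set(), {"S. aureus"},
        set(), set(), {"S. aureus"}, set(), set(), set()]
print(classify_problem_pathogens(farm))
```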
Abstract:
PURPOSE: The aim of this study was to evaluate the 3-year success rates of wide-body implants with a regular- or wide-neck configuration and a sandblasted, large grit, acid-etched (SLA) surface. MATERIALS AND METHODS: A total of 151 implants were consecutively placed in posterior sites of 116 partially edentulous patients in a referral clinic at the School of Dental Medicine, University of Bern. All implants were restored with cemented crowns or fixed partial dentures after a healing period of 6 to 8 weeks (for implants placed without simultaneous bone augmentation) or 10 to 14 weeks (for implants with simultaneous bone augmentation). All patients were recalled 36 months following implant placement for a clinical and radiographic examination. RESULTS: One implant failed to integrate during healing, and 11 implants were lost to follow-up and considered dropouts. The remaining 139 implants showed favorable clinical and radiographic findings and were considered successfully integrated at the 3-year examination. This resulted in a 3-year success rate of 99.3%. Radiographic evaluation of 134 implants indicated stability of the crestal bone levels: During the study period, the crestal bone level changed less than 0.5 mm for 129 implants. CONCLUSION: Successful tissue integration was achieved with wide-body implants with a regular or a wide-neck configuration and an SLA surface with high predictability. This successful tissue integration was well maintained for up to 3 years of follow-up.
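The reported success rate follows directly from the counts given in the abstract: of 151 placed implants, 11 dropouts were excluded and 1 failed to integrate, leaving 139 of 140 evaluable implants classified as successful. A quick check of that arithmetic (a sketch, not part of the study):

```python
placed = 151
dropouts = 11                        # lost to follow-up, excluded from the denominator
failures = 1                         # failed to integrate during healing
evaluable = placed - dropouts        # 140 implants with a known outcome
successful = evaluable - failures    # 139 implants judged successfully integrated
success_rate = successful / evaluable
print(f"{success_rate:.1%}")         # -> 99.3%
```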
Abstract:
Multiparameter cerebral monitoring has been widely applied in traumatic brain injury to study posttraumatic pathophysiology and to manage head-injured patients (e.g., combining O2 and pH sensors with cerebral microdialysis). Because a comprehensive approach towards understanding injury processes will also require functional measures, we have added electrophysiology to these monitoring modalities by attaching a recording electrode to the microdialysis probe. These dual-function (microdialysis/electrophysiology) probes were placed in rats following experimental fluid percussion brain injuries, and in a series of severely head-injured human patients. Electrical activity (cell firing, EEG) was monitored concurrently with microdialysis sampling of extracellular glutamate, glucose and lactate. Electrophysiological parameters (firing rate, serial correlation, field potential occurrences) were analyzed offline and compared to dialysate concentrations. In rats, these probes demonstrated an injury-induced suppression of neuronal firing (from a control level of 2.87 to 0.41 spikes/sec postinjury), which was associated with increases in extracellular glutamate and lactate, and decreases in glucose levels. When placed in human patients, the probes detected sparse and slowly firing cells (mean = 0.21 spike/sec), with most units (70%) exhibiting a lack of serial correlation in the spike train. In some patients, spontaneous field potentials were observed, suggesting synchronously firing neuronal populations. In both the experimental and clinical application, the addition of the recording electrode did not appreciably affect the performance of the microdialysis probe. The results suggest that this technique provides a functional monitoring capability which cannot be obtained when electrophysiology is measured with surface or epidural EEG alone.
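As a rough illustration of the offline spike-train measures mentioned (firing rate and serial correlation), here is a minimal sketch that assumes spike timestamps in seconds and uses the lag-1 autocorrelation of interspike intervals as the serial-correlation measure; it is not the authors' actual analysis code.

```python
import numpy as np

def spike_train_measures(spike_times_s):
    """Firing rate (spikes/sec) and lag-1 serial correlation of interspike intervals."""
    spike_times_s = np.asarray(spike_times_s)
    duration = spike_times_s[-1] - spike_times_s[0]
    rate = (len(spike_times_s) - 1) / duration   # mean firing rate over the recording
    isi = np.diff(spike_times_s)                 # interspike intervals
    # Lag-1 serial correlation: do long intervals tend to follow long intervals?
    r = np.corrcoef(isi[:-1], isi[1:])[0, 1]
    return rate, r

# Toy example: a slowly, irregularly firing unit (~0.2 spikes/sec).
rng = np.random.default_rng(0)
times = np.cumsum(rng.exponential(scale=5.0, size=200))   # Poisson-like spike train
print(spike_train_measures(times))
```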
Abstract:
OBJECT: Disturbed ionic and neurotransmitter homeostasis are now recognized as probably the most important mechanisms contributing to the development of secondary brain swelling after traumatic brain injury (TBI). Evidence obtained in animal models indicates that posttraumatic neuronal excitation by excitatory amino acids leads to an increase in extracellular potassium, probably due to ion channel activation. The purpose of this study was therefore to measure dialysate potassium in severely head injured patients and to correlate these results with measurements of intracranial pressure (ICP), patient outcome, levels of dialysate glutamate and lactate, and cerebral blood flow (CBF) to determine the role of ischemia in this posttraumatic ion dysfunction. METHODS: Eighty-five patients with severe TBI (Glasgow Coma Scale Score < 8) were treated according to an intensive ICP management-focused protocol. All patients underwent intracerebral microdialysis. Dialysate potassium levels were analyzed using flame photometry, and dialysate glutamate and dialysate lactate levels were measured using high-performance liquid chromatography and an enzyme-linked amperometric method in 72 and 84 patients, respectively. Cerebral blood flow studies (stable xenon computerized tomography scanning) were performed in 59 patients. In approximately 20% of the patients, dialysate potassium values were increased (dialysate potassium > 1.8 mM) for 3 hours or more. A mean dialysate potassium level greater than 2 mM throughout the entire monitoring period was associated with ICP above 30 mm Hg and fatal outcome, as were progressively rising levels of dialysate potassium. Dialysate potassium levels correlated positively with dialysate glutamate (p < 0.0001) and lactate (p < 0.0001) levels, and were significantly inversely correlated with CBF (p = 0.019). CONCLUSIONS: Dialysate potassium was increased after TBI in 20% of measurements. High levels of dialysate potassium were associated with increased ICP and poor outcome. The simultaneous increase in dialysate potassium, together with dialysate glutamate and lactate, supports the concept that glutamate induces ionic flux and consequently increases ICP, which the authors speculate may be due to astrocytic swelling. Reduced CBF was also significantly correlated with increased levels of dialysate potassium. This may be due to either cell swelling or altered vasoreactivity in cerebral blood vessels caused by higher levels of potassium after trauma. Additional studies in which potassium-sensitive microelectrodes are used are needed to validate these ionic events more clearly.
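A sketch of the kind of post-hoc screening described, flagging sustained dialysate potassium elevations above 1.8 mM and correlating potassium with a metabolite trace; the thresholds come from the abstract, but the hourly sampling, the Pearson correlation, and all numbers below are assumptions.

```python
import numpy as np

def sustained_elevation(k_mM, threshold=1.8, min_hours=3, samples_per_hour=1):
    """True if dialysate potassium stays above `threshold` for at least `min_hours`.

    k_mM: sequence of dialysate potassium values, assumed equally spaced in time.
    """
    needed = min_hours * samples_per_hour
    run = 0
    for value in k_mM:
        run = run + 1 if value > threshold else 0
        if run >= needed:
            return True
    return False

# Toy hourly trace: a 4-hour excursion above 1.8 mM, with a made-up glutamate trace.
potassium = [1.2, 1.4, 1.9, 2.1, 2.3, 2.0, 1.5]
glutamate = [4, 5, 18, 25, 30, 22, 9]                 # arbitrary units, invented values
print(sustained_elevation(potassium))                 # -> True
print(np.corrcoef(potassium, glutamate)[0, 1])        # positive correlation
```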
Abstract:
The amount and type of ground cover is an important characteristic to measure when collecting soil disturbance monitoring data after a timber harvest. Estimates of ground cover and bare soil can be used for tracking changes in invasive species, plant growth and regeneration, woody debris loadings, and the risk of surface water runoff and soil erosion. A new method of assessing ground cover and soil disturbance was recently published by the U.S. Forest Service, the Forest Soil Disturbance Monitoring Protocol (FSDMP). This protocol uses the frequency of cover types in small circular (15cm) plots to compare ground surface in pre- and post-harvest condition. While both frequency and percent cover are common methods of describing vegetation, frequency has rarely been used to measure ground surface cover. In this study, three methods for assessing ground cover percent (step-point, 15cm dia. circular and 1x5m visual plot estimates) were compared to the FSDMP frequency method. Results show that the FSDMP method provides significantly higher estimates of ground surface condition for most soil cover types, except coarse wood. The three cover methods had similar estimates for most cover values. The FSDMP method also produced the highest value when bare soil estimates were used to model erosion risk. In a person-hour analysis, estimating ground cover percent in 15cm dia. plots required the least sampling time, and provided standard errors similar to the other cover estimates even at low sampling intensities (n=18). If ground cover estimates are desired in soil monitoring, then a small plot size (15cm dia. circle), or a step-point method can provide a more accurate estimate in less time than the current FSDMP method.
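The contrast between the FSDMP frequency method and percent-cover estimation can be made concrete with a small sketch; the plot data, function names, and cover categories below are hypothetical, chosen only to show why occurrence frequency tends to exceed mean percent cover.

```python
def fsdmp_frequency(plots, cover_type):
    """FSDMP-style frequency: share of small plots in which the cover type occurs at all."""
    return sum(cover_type in plot for plot in plots) / len(plots)

def mean_percent_cover(plots, cover_type):
    """Cover estimate: average of the per-plot percent cover for the cover type."""
    return sum(plot.get(cover_type, 0.0) for plot in plots) / len(plots)

# Toy data: four 15 cm plots, percent cover by type in each plot.
plots = [
    {"bare soil": 10.0, "litter": 90.0},
    {"bare soil": 5.0,  "litter": 95.0},
    {"litter": 100.0},
    {"bare soil": 60.0, "coarse wood": 40.0},
]
# Frequency counts any occurrence, so it tends to exceed mean percent cover.
print(fsdmp_frequency(plots, "bare soil"))     # 0.75
print(mean_percent_cover(plots, "bare soil"))  # 18.75
```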
Abstract:
The U.S. Renewable Fuel Standard mandates that by 2022, 36 billion gallons of renewable fuels must be produced on a yearly basis. Ethanol production is capped at 15 billion gallons, meaning 21 billion gallons must come from other alternative fuel sources. A viable alternative for meeting the remainder of this mandate is iso-butanol. Unlike ethanol, iso-butanol does not phase-separate when mixed with water, meaning it can be transported using traditional pipeline methods. Iso-butanol also has a lower oxygen content by mass, meaning it can displace more petroleum while maintaining the same oxygen concentration in the fuel blend. This research focused on studying the effects of low-level alcohol fuels on marine engine emissions to assess the possibility of using iso-butanol as a replacement for ethanol. Three marine engines were used in this study, representing a wide range of what is currently in service in the United States. Boats powered by two four-stroke engines and one two-stroke engine were tested in the tributaries of the Chesapeake Bay, near Annapolis, Maryland, over two rounds of weeklong testing in May and September. The engines were tested using a standard test cycle, and emissions were sampled using constant volume sampling techniques. Specific emissions for the two-stroke and four-stroke engines were compared to the baseline indolene tests. Because of the nature of the field testing, limited engine parameters were recorded; aside from emissions, the parameters analyzed were the operating relative air-to-fuel ratio and engine speed. Emissions trends from the baseline test to each alcohol fuel for the four-stroke engines were consistent when analyzing a single round of testing. The same trends were not consistent when comparing separate rounds, because of uncontrolled weather conditions and because the four-stroke engines operate without fuel-control feedback during full-load conditions. Emissions trends from the baseline test to each alcohol fuel for the two-stroke engine were consistent for all rounds of testing, because that engine operates open-loop and does not compensate its fueling when fuel composition changes. Changes in emissions with respect to the baseline for iso-butanol were consistent with the changes for ethanol. It was determined that iso-butanol would be a viable replacement for ethanol.
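For readers unfamiliar with the metric, specific emissions are typically reported as pollutant mass per unit of work over a weighted test cycle; the generic sketch below illustrates that calculation and is not the exact procedure or data used in this study.

```python
def brake_specific_emissions(mass_rates_g_per_h, powers_kW, weights):
    """Generic weighted brake-specific emissions in g/kWh.

    mass_rates_g_per_h: pollutant mass flow measured at each test mode.
    powers_kW: engine power at each mode.
    weights: weighting factor of each mode in the test cycle.
    """
    weighted_mass = sum(w * m for w, m in zip(weights, mass_rates_g_per_h))
    weighted_power = sum(w * p for w, p in zip(weights, powers_kW))
    return weighted_mass / weighted_power

# Toy 3-mode cycle (idle, cruise, wide-open throttle); all numbers are made up.
print(brake_specific_emissions([20.0, 150.0, 600.0], [0.5, 15.0, 60.0], [0.3, 0.5, 0.2]))
```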
Abstract:
Water springs are the principal source of water for many localities in Central America, including the municipality of Concepción Chiquirichapa in the Western Highlands of Guatemala. Long-term monitoring records are critical for informed water management as well as resource forecasting, though data are scarce and monitoring in low-resource settings presents special challenges. Spring discharge was monitored monthly in six municipal springs during the author’s Peace Corps assignment, from May 2011 to March 2012, and water level height was monitored in two spring boxes over the same time period using automated water-level loggers. The intention of this approach was to circumvent the need for frequent and time-intensive manual measurement by identifying a fixed relationship between discharge and water level. No such relationship was identified, but the water level record reveals that spring yield increased for four months following Tropical Depression 12E in October 2011. This suggests that the relationship between extreme precipitation events and long-term water spring yields in Concepción should be examined further. These limited discharge data also indicate that aquifer baseflow recession and catchment water balance could be successfully characterized if a long-term discharge record were established. This study also presents technical and social considerations for selecting a methodology for spring discharge measurement and highlights the importance of local interest in conducting successful community-based research in intercultural low-resource settings.
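The fixed discharge-to-water-level relationship that was sought is commonly modeled as a power-law rating curve, Q = a * h**b, fitted by regression in log space; the sketch below shows that approach with made-up numbers (the study itself found no usable relationship).

```python
import numpy as np

def fit_rating_curve(stage_m, discharge_lps):
    """Fit Q = a * h**b by least squares on log-transformed data."""
    log_h = np.log(np.asarray(stage_m))
    log_q = np.log(np.asarray(discharge_lps))
    b, log_a = np.polyfit(log_h, log_q, 1)   # slope, intercept in log space
    return np.exp(log_a), b

# Toy monthly pairs of spring-box water level (m) and measured discharge (L/s).
stage = [0.20, 0.25, 0.30, 0.35, 0.40]
discharge = [1.1, 1.6, 2.2, 2.9, 3.7]
a, b = fit_rating_curve(stage, discharge)
print(f"Q ~ {a:.2f} * h^{b:.2f}")            # fitted rating curve
```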
Abstract:
Enterprise Applications are complex software systems that manipulate large amounts of persistent data and interact with the user through a vast and complex user interface. In particular, applications written for the Java 2 Platform, Enterprise Edition (J2EE) are composed using various technologies such as Enterprise Java Beans (EJB) or Java Server Pages (JSP) that in turn rely on languages other than Java, such as XML or SQL. In this heterogeneous context, applying existing reverse engineering and quality assurance techniques developed for object-oriented systems is not enough. Because those techniques were created to measure quality or provide information about only one aspect of J2EE applications, they cannot properly measure the quality of the entire system. We intend to devise techniques and metrics to measure quality in J2EE applications considering all their aspects, and to aid their evolution. Using software visualization, we also intend to inspect the structure of J2EE applications and all other aspects that can be investigated through this technique. In order to do that, we also need to create a unified meta-model including all elements composing a J2EE application.
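A unified meta-model of the kind proposed would have to relate Java-level entities (classes, EJBs) to non-Java artifacts such as JSP pages, XML deployment descriptors, and SQL statements. The following sketch is a hypothetical, heavily simplified illustration of such a meta-model, not the one the authors built.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class JavaClass:
    name: str
    methods: List[str] = field(default_factory=list)

@dataclass
class EnterpriseBean:              # EJB: a Java class plus its deployment metadata
    implementation: JavaClass
    descriptor_entries: dict = field(default_factory=dict)   # e.g., from ejb-jar.xml

@dataclass
class JspPage:                     # JSP: markup that calls into Java classes
    path: str
    referenced_classes: List[JavaClass] = field(default_factory=list)

@dataclass
class SqlStatement:                # SQL embedded in or issued by the beans
    text: str
    issued_by: Optional[JavaClass] = None

@dataclass
class J2EEApplication:             # the unified model ties all aspects together
    beans: List[EnterpriseBean] = field(default_factory=list)
    pages: List[JspPage] = field(default_factory=list)
    queries: List[SqlStatement] = field(default_factory=list)

# Example of a metric that needs the whole model, not just the Java side:
def classes_reached_from_pages(app: J2EEApplication) -> int:
    return len({c.name for page in app.pages for c in page.referenced_classes})
```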
Abstract:
Virtual machines (VMs) emulating hardware devices are generally implemented in low-level languages for performance reasons. This results in unmaintainable systems that are difficult to understand. In this paper we report on our experience using the PyPy toolchain to improve the portability and reduce the complexity of whole-system VM implementations. As a case study we implement a VM prototype for a Nintendo Game Boy, called PyGirl, in which the high-level model is separated from low-level VM implementation issues. We shed light on the process of refactoring from a low-level VM implementation in Java to a high-level model in RPython. We show that our whole-system VM written with PyPy is significantly less complex than standard implementations, without substantial loss in performance.
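The separation described, keeping a readable high-level machine model apart from low-level VM concerns, can be hinted at with a toy fetch-decode-execute loop in plain Python; this is an illustrative sketch with an invented two-opcode instruction set, not PyGirl's actual code.

```python
class CPU:
    """Toy high-level model of a fetch-decode-execute loop, PyGirl-like only in spirit."""

    def __init__(self, memory):
        self.memory = memory      # flat byte list standing in for the Game Boy bus
        self.pc = 0               # program counter
        self.a = 0                # accumulator register

    def fetch(self):
        opcode = self.memory[self.pc]
        self.pc += 1
        return opcode

    def step(self):
        opcode = self.fetch()
        # Invented two-opcode ISA: 0x01 = load immediate into A, 0x02 = add immediate to A.
        if opcode == 0x01:
            self.a = self.fetch()
        elif opcode == 0x02:
            self.a = (self.a + self.fetch()) & 0xFF
        else:
            raise ValueError(f"unknown opcode {opcode:#x}")

cpu = CPU(memory=[0x01, 0x2A, 0x02, 0x05])   # load 42, add 5
cpu.step(); cpu.step()
print(cpu.a)                                  # -> 47
```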
Abstract:
This paper is focused on the integration of state-of-the-art technologies in the fields of telecommunications, simulation algorithms, and data mining in order to develop a semi- to fully-automated monitoring and management system for Type 1 diabetes patients. The main components of the system are a glucose measurement device, an insulin delivery system (insulin injections or insulin pumps), a mobile phone for the GPRS network, and a PDA or laptop for the Internet. In the medical environment, appropriate infrastructure for the storage, analysis, and visualization of patients' data has been implemented to facilitate treatment design by health care experts.
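A minimal sketch of the kind of record such a system might transmit from the measurement device to the central infrastructure; the field names and JSON transport are assumptions, not the system's actual protocol.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class GlucoseReading:
    patient_id: str
    glucose_mg_dl: float
    insulin_units: float        # most recent bolus, if any
    taken_at: str               # ISO-8601 timestamp

def encode_for_upload(reading: GlucoseReading) -> str:
    """Serialize a reading for transmission over GPRS/Internet to the server side."""
    return json.dumps(asdict(reading))

reading = GlucoseReading(
    patient_id="demo-001",
    glucose_mg_dl=142.0,
    insulin_units=4.0,
    taken_at=datetime.now(timezone.utc).isoformat(),
)
print(encode_for_upload(reading))
```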
Abstract:
Type 1 diabetes mellitus is a chronic disease characterized by blood glucose levels outside the normal range due to the body's inability to produce insulin. This dysfunction leads to many short- and long-term complications. In this paper, a system for tele-monitoring and tele-management of Type 1 diabetes patients is proposed, aiming at reducing the risk of diabetes complications and improving quality of life. The system integrates Wireless Personal Area Networks (WPAN), mobile infrastructure, and Internet technology along with commercially available and novel glucose measurement devices, advanced modeling techniques, and tools for the intelligent processing of the available diabetes patient information. The integration of the above technologies enables intensive monitoring of blood glucose levels, treatment optimisation, continuous medical care, and improvement of quality of life for Type 1 diabetes patients, without restrictions on everyday life activities.
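One small piece of the intelligent-processing layer could be simple range checks that flag readings outside a normoglycaemic band; the thresholds and function below are illustrative assumptions only, not clinical guidance or the system's actual logic.

```python
def classify_glucose(glucose_mg_dl,
                     hypo_threshold=70.0,
                     hyper_threshold=180.0):
    """Crude classification of a blood glucose value (thresholds are illustrative)."""
    if glucose_mg_dl < hypo_threshold:
        return "hypoglycaemia alert"
    if glucose_mg_dl > hyper_threshold:
        return "hyperglycaemia alert"
    return "in range"

for value in (55.0, 110.0, 240.0):
    print(value, "->", classify_glucose(value))
```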
Abstract:
Bovine spongiform encephalopathy (BSE) rapid tests and routine BSE-testing laboratories are subject to strict regulations for approval. Due to the lack of BSE-positive control samples, however, full assay validation at the level of individual test runs and continuous monitoring of test performance on-site are difficult. Most rapid tests use synthetic prion protein peptides as controls, but it is not known to what extent these reflect assay performance on field samples, or whether they are sufficient to indicate on-site assay quality problems. To address this question, we compared the test scores of the kit peptide controls provided with those of standardized weak BSE-positive tissue samples in individual test runs, as well as continuously over time by quality control charts, in two widely used BSE rapid tests. Our results reveal only a weak correlation between the weak-positive tissue control and the peptide control scores. We identified kit-lot-related shifts in assay performance that were not reflected by the peptide control scores. Conversely, not all shifts indicated by the peptide control scores reflected an actual shift in assay performance. In conclusion, these data highlight that the use of the kit peptide controls for continuous quality control purposes may result in unjustified rejection or acceptance of test runs. Standardized weak-positive tissue controls in combination with Shewhart-CUSUM control charts, however, appear to be reliable for continuously monitoring assay performance on-site and identifying undesired deviations.
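A Shewhart-CUSUM scheme of the type mentioned tracks cumulative deviations of the weak-positive tissue control score from its target and signals when the sum drifts past a decision limit. The sketch below implements a standard tabular CUSUM with made-up scores and limits, not the laboratories' actual parameters.

```python
def tabular_cusum(scores, target, k, h):
    """Standard two-sided tabular CUSUM for control scores.

    k: allowance (often about 0.5 SD); h: decision limit (often 4-5 SD).
    Returns the indices of runs that signal an out-of-control shift.
    """
    hi = lo = 0.0
    signals = []
    for i, x in enumerate(scores):
        hi = max(0.0, hi + (x - target) - k)   # accumulates upward drift
        lo = max(0.0, lo + (target - x) - k)   # accumulates downward drift
        if hi > h or lo > h:
            signals.append(i)
            hi = lo = 0.0                      # reset after a signal
    return signals

# Made-up weak-positive control scores with a downward shift from run 6 onward.
scores = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.90, 0.88, 0.87, 0.89, 0.86]
print(tabular_cusum(scores, target=1.00, k=0.02, h=0.20))   # -> [8, 10]
```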
Abstract:
Many Member States of the European Union (EU) currently monitor antimicrobial resistance in zoonotic agents, including Salmonella and Campylobacter. According to Directive 2003/99/EC, Member States shall ensure that the monitoring provides comparable data on the occurrence of antimicrobial resistance. The European Commission asked the European Food Safety Authority to prepare detailed specifications for harmonised schemes for monitoring antimicrobial resistance. The objective of these specifications is to lay down provisions for a monitoring and reporting scheme for Salmonella in fowl (Gallus gallus), turkeys and pigs, and for Campylobacter jejuni and Campylobacter coli in broiler chickens. The current specifications are considered to be a first step towards a gradual implementation of comprehensive antimicrobial resistance monitoring at the EU level. These specifications propose to test a common set of antimicrobial agents against available cut-off values and a specified concentration range to determine the susceptibility of Salmonella and Campylobacter. Using isolates collected through programmes in which the sampling frame covers all epidemiological units of the national production, the target number of Salmonella isolates to be included in the antimicrobial resistance monitoring per Member State per year is 170 for each study population (i.e., laying hens, broilers, turkeys and slaughter pigs). The target number of Campylobacter isolates to be included in the antimicrobial resistance monitoring per Member State per year is 170 for each study population (i.e., broilers). The results of the antimicrobial resistance monitoring are assessed and reported in the yearly national report on trends and sources of zoonoses, zoonotic agents and antimicrobial resistance.
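Under such a monitoring scheme, an isolate's minimum inhibitory concentration (MIC) is compared with an epidemiological cut-off value to label it microbiologically resistant; the cut-off numbers below are placeholders for illustration, not the values specified by EFSA.

```python
# Hypothetical cut-offs (mg/L); the real scheme prescribes specific EFSA/EUCAST values.
CUTOFFS_MG_L = {"ciprofloxacin": 0.064, "tetracycline": 8.0, "ampicillin": 8.0}

def is_resistant(antimicrobial, mic_mg_l, cutoffs=CUTOFFS_MG_L):
    """Microbiologically resistant if the MIC exceeds the epidemiological cut-off."""
    return mic_mg_l > cutoffs[antimicrobial]

# Toy isolate tested against the common antimicrobial panel.
isolate = {"ciprofloxacin": 0.032, "tetracycline": 32.0, "ampicillin": 4.0}
resistant = [drug for drug, mic in isolate.items() if is_resistant(drug, mic)]
print(resistant)   # -> ['tetracycline']
```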
Abstract:
The concept of platform switching has been introduced to implant dentistry based on clinical observations of reduced peri-implant crestal bone loss. However, published data are controversial, and most studies are limited to 12 months. The aim of the present randomized clinical trial was to test the hypothesis that platform switching has a positive impact on crestal bone-level changes after 3 years. Two implants with a diameter of 4 mm were inserted crestally in the posterior mandible of 25 patients. The intraindividual allocation of platform switching (3.3-mm platform) and the standard implant (4-mm platform) was randomized. After 3 months of submerged healing, single-tooth crowns were cemented. Patients were followed up at short intervals for monitoring of healing and oral hygiene. Statistical analysis for the influence of time and platform type on bone levels employed the Brunner-Langer model. At 3 years, the mean radiographic peri-implant bone loss was 0.69 ± 0.43 mm (platform switching) and 0.74 ± 0.57 mm (standard platform). The mean intraindividual difference was 0.05 ± 0.58 mm (95% confidence interval: -0.19, 0.29). Crestal bone-level alteration depended on time (p < .001) but not on platform type (p = .363). The present randomized clinical trial could not confirm the hypothesis of reduced peri-implant crestal bone loss when implants were restored according to the concept of platform switching.
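The reported confidence interval can be reproduced from the summary statistics (mean intraindividual difference 0.05 mm, SD 0.58 mm, n = 25 patients) using a conventional paired t interval; this is only an arithmetic check, not the Brunner-Langer analysis used in the study.

```python
from math import sqrt

mean_diff, sd_diff, n = 0.05, 0.58, 25
t_crit = 2.064                       # two-sided 95% t quantile for 24 degrees of freedom
half_width = t_crit * sd_diff / sqrt(n)
print(f"95% CI: ({mean_diff - half_width:.2f}, {mean_diff + half_width:.2f})")  # -> (-0.19, 0.29)
```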