930 results for Level Control.
Abstract:
In sporting events such as the Olympic Games or World Championships, competitive athletes keep pushing the boundaries of human performance. Compared to team sports, high achievement in many athletic disciplines depends solely on the individual's performance. In contrast to previous research that looked for expertise-related differences in brain anatomy at the group level, we aim to demonstrate changes in individual top athletes' brains that would be averaged out in a group analysis. We compared structural magnetic resonance images (MRI) of three professional track-and-field athletes to age-, gender- and education-matched control subjects. To determine brain features specific to these top athletes, we tested for significant deviations in structural grey matter density between each of the three top athletes and a carefully matched control sample. While total brain volumes were comparable between athletes and controls, we show regional grey matter differences in the striatum and thalamus. The demonstrated brain anatomy patterns remained stable and were detected after two years, with the Olympic Games in between. We also found differences in the fusiform gyrus in two top long jumpers. We interpret our findings in reward-related areas as correlates of top athletes' persistence in reaching top-level skill performance over years.
Abstract:
Background: Vancomycin is a cornerstone antibiotic for the management of severe Gram-positive infections. However, high doses of vancomycin are associated with a risk of nephrotoxicity. This study aimed to evaluate the relationship between the evolution of vancomycin trough concentration and the occurrence of nephrotoxicity, and to identify risk factors for both vancomycin-associated nephrotoxicity and vancomycin overexposure. Methods: A total of 1240 patient records from our hospital therapeutic drug monitoring database between 2007 and 2011 were screened and grouped according to predefined criteria defining vancomycin overexposure (one or more occurrences of a trough level ≥ 20 mg/L) and treatment-related nephrotoxicity (rise of serum creatinine by ≥ 50% over baseline). A representative sample of 150 cases was selected for in-depth analysis. Weighted logistic regression analyses were used to test associations between vancomycin overexposure, nephrotoxicity and other predictors of interest. Results: Patients with high trough concentrations were found to be more likely to develop nephrotoxicity (odds ratio: 4.12; p < 0.001). Specific risk factors, notably concomitant nephrotoxic treatments and comorbid conditions (heart failure), were found to independently increase the risk of either nephrotoxicity or vancomycin overexposure. Finally, the exploration of temporal relationships between variations in vancomycin trough concentrations and creatinine levels was in line with circular causality, with some antecedence of vancomycin changes over creatinine changes. Conclusion: Our results confirm the important nephrotoxic potential of vancomycin and indicate that the use of this drug deserves thorough individualization in conditions liable to increase concentration exposure, with reactive adjustment based on therapeutic drug monitoring.
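The headline association above (odds ratio 4.12) was estimated with weighted logistic regression. As a minimal sketch of what an odds ratio measures, the snippet below computes an unweighted odds ratio from a two-by-two table; the counts are invented for illustration and are not the study's data:

```python
def odds_ratio(exposed_events, exposed_n, unexposed_events, unexposed_n):
    """Odds ratio of an outcome (e.g. nephrotoxicity) in an exposed group
    (e.g. trough >= 20 mg/L) versus an unexposed group, from a 2x2 table."""
    a = exposed_events                  # exposed, outcome occurred
    b = exposed_n - exposed_events      # exposed, no outcome
    c = unexposed_events                # unexposed, outcome occurred
    d = unexposed_n - unexposed_events  # unexposed, no outcome
    return (a * d) / (b * c)

# Hypothetical counts for illustration only:
print(odds_ratio(20, 50, 10, 100))  # odds 20/30 vs 10/90 -> 6.0
```

An adjusted regression-based estimate will generally differ from such a crude ratio, since it accounts for covariates like concomitant nephrotoxic drugs and comorbidities.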
Abstract:
This study examines the possibilities for financial modelling in a business unit of the forest industry. The objective is to design and build a financial model for the business unit that makes it possible to analyse and forecast its results. The study follows a constructive research approach. The theoretical framework examines the shaping of existing information, focusing on the needs of information refinement, the requirements imposed by decision-making, and modelling. Second, the theory presents requirements placed on information from the perspective of organizational control. The empirical data are collected through participant observation, drawing on informal discussions, information systems and management accounting documents. The results show that forecasting operating profit with the model is difficult because of the large number of underlying variables. Consequently, the model must be built to examine operating profit at as detailed a level as possible. In testing, the model proved the more accurate the more detailed the level at which forecasting was done. Testing also showed that the model is useful for short-term business control. In this way it also lays the foundation for long-term forecasting.
Abstract:
This Master's thesis examines policy-based network access control (NAC) solutions from an architectural perspective. The thesis reviews the NAC solutions of the Trusted Computing Group, Microsoft Corporation, Juniper Networks and Cisco Systems. NAC consists of a set of new and existing technologies that, based on a predefined policy, help control the network connections of devices attempting to access a protected network. In addition to user authentication, NAC can restrict network access based on device-specific properties, for example virus signatures and operating system updates, and can, within certain limits, remediate deficiencies in these in order to permit network access. NAC is a relatively new concept that lacks a precise definition; as a result, products with incomplete feature sets are currently sold under the NAC label. Standardization to guarantee interoperability between NAC components from different vendors is under way, and on this basis the solutions can be divided into those following open standards and those following vendor-specific standards. The NAC solutions presented follow the standards either to a limited extent or not at all. None of the reviewed solutions is a complete NAC, but Juniper Networks' solution emerges as the most promising candidate for further development and research for TietoEnator Processing & Networks Oy. One central problem in the NAC concept is that the endpoint may report a falsified security-check result to the network, on which access control is partly based. One solution to this problem, among others, could be the TPM chip already found in current computers, which guarantees the correctness and integrity of the reported information.
Abstract:
Efficiency in the administration of justice is found to have increased over time, while variation in the efficiency of the courts remained low and fell over time. This appears to be good news, at least for the case studied here: the civil courts of first instance in Spain between 2005 and 2009. Apart from the simple passing of time, the percentage of temporary judges in the system also explains some of the differences in efficiency between courts over time: the greater the percentage of temporary judges, the lower the efficiency of the courts. Overall, the average relative efficiency level for the period 2005 to 2009 was 97.46%, suggesting that inefficiency was limited.
Abstract:
The objective of my thesis is to assess the mechanisms of ecological community control in macroalgal communities in the Baltic Sea. In the top-down model, predatory fish feed on invertebrate mesograzers, partly releasing algae from grazing pressure; such a reciprocal relationship is called a trophic cascade. In the bottom-up model, nutrients increase biomass in the food chain: the nutrients are first assimilated by algae and, via the food chain, also increase the abundance of grazers and predators. Previous studies on oceanic shores have described these two regulative mechanisms in the grazer-alga link, but how they interact in trophic cascades from fish to algae is still inadequately known. Because the top-down and bottom-up mechanisms are predicted to depend on environmental disturbances, such as wave stress and light, I studied these models at two distinct water depths. The thesis is based on five factorial field experiments, all conducted in the Finnish Archipelago Sea. In all the experiments, I studied macroalgal colonization, either density, filament length or biomass, on submerged colonization substrates. By excluding predatory fish and mesograzers from the algal communities, the experiments compared the strength of top-down control in natural algal communities. In addition, part of the experimental units were exposed to enriched nitrogen and phosphorus concentrations, which enabled testing of bottom-up control. These two models of community control were further investigated in shallow (<1 m) and deep (ca. 3 m) water. Moreover, the control mechanisms were also expected to depend on grazer species; therefore, different grazer species were enclosed in the experimental units and their impacts on macroalgal communities were followed individually. Community control on Baltic rocky shores was found to follow theoretical predictions that had not previously been confirmed by field studies.
Predatory fish limited grazing impact, which was seen as denser algal communities and longer algal filaments. Nutrient enrichment increased the density and filament length of annual algae and thus changed the species composition of the algal community. The perennial alga Fucus vesiculosus and the red alga Ceramium tenuicorne suffered from the increased nutrient availability. The enriched nutrient conditions led to denser grazer fauna, thereby causing strong top-down control over both the annual and the perennial macroalgae. The strength of the top-down control seemed to depend on the density and diversity of grazers and predators as well as on the species composition of the macroalgal assemblages. Nutrient enrichment, however, weakened the limiting impact of predatory fish on the grazer fauna, because fish stocks did not respond as quickly to the enhanced resources in the environment as the invertebrate fauna did. According to the environmental stress model, environmental disturbances weaken top-down control; on a wave-exposed shore, for example, wave stress causes more stress to animals close to the surface than deeper on the shore. Mesograzers were efficient consumers at both depths, while predation by fish was weaker in shallow water. Thus, the results supported the environmental stress model, which predicts that environmental disturbance affects a species more strongly the higher it is in the food chain. This thesis assessed the mechanisms of community control in three-level food chains and did not take higher predators into account. Such predators in the Baltic Sea include, for example, the cormorant, seals, the white-tailed sea eagle, cod and salmon. All these predatory species were recently, or are currently, under intensive fishing, hunting and persecution, and their stocks have only recently increased in the region. Therefore, it is possible that future densities of top predators may yet alter the strengths of the controlling mechanisms in the Baltic littoral zone.
Abstract:
The fight against doping in sports has been governed since 1999 by the World Anti-Doping Agency (WADA), an independent institution behind the implementation of the World Anti-Doping Code (Code). The intent of the Code is to protect clean athletes through the harmonization of anti-doping programs at the international level, with special attention to detection, deterrence and prevention of doping [1]. A new version of the Code came into force on January 1st 2015, introducing, among other improvements, longer periods of sanctioning for athletes (up to four years) and measures to strengthen the role of anti-doping investigations and intelligence. To ensure optimal harmonization, five International Standards covering different technical aspects of the Code are also currently in force: the List of Prohibited Substances and Methods (List), Testing and Investigations, Laboratories, Therapeutic Use Exemptions (TUE) and Protection of Privacy and Personal Information. Adherence to these standards is mandatory for all anti-doping stakeholders to be compliant with the Code. Among these documents, the eighth version of the International Standard for Laboratories (ISL), which also came into effect on January 1st 2015, includes regulations for WADA and ISO/IEC 17025 accreditations and their application to urine and blood sample analysis by anti-doping laboratories [2]. Specific requirements are also described in several Technical Documents or Guidelines, in which various topics are highlighted, such as the identification criteria for gas chromatography (GC) and liquid chromatography (LC) coupled to mass spectrometry (MS) techniques (IDCR), measurement and reporting of endogenous androgenic anabolic agents (EAAS) and analytical requirements for the Athlete Biological Passport (ABP).
Abstract:
This study analyzed high-density event-related potentials (ERPs) within an electrical neuroimaging framework to provide insights regarding the interaction between multisensory processes and stimulus probabilities. Specifically, we identified the spatiotemporal brain mechanisms by which the proportion of temporally congruent and task-irrelevant auditory information influences stimulus processing during a visual duration discrimination task. The spatial position (top/bottom) of the visual stimulus was indicative of how frequently the visual and auditory stimuli would be congruent in their duration (i.e., context of congruence). Stronger influences of irrelevant sound were observed when contexts associated with a high proportion of auditory-visual congruence repeated and also when contexts associated with a low proportion of congruence switched. Context of congruence and context transition resulted in weaker brain responses at 228 to 257 ms poststimulus to conditions giving rise to larger behavioral cross-modal interactions. Importantly, a control oddball task revealed that both congruent and incongruent audiovisual stimuli triggered equivalent non-linear multisensory interactions when congruence was not a relevant dimension. Collectively, these results are well explained by statistical learning, which links a particular context (here: a spatial location) with a certain level of top-down attentional control that further modulates cross-modal interactions based on whether a particular context repeated or changed. The current findings shed new light on the importance of context-based control over multisensory processing, whose influences multiplex across finer and broader time scales.
Abstract:
INTRODUCTION: Hyperglycemia is a metabolic alteration in major burn patients associated with complications. The study aimed at evaluating the safety of general ICU glucose control protocols applied in major burns receiving prolonged ICU treatment. METHODS: A 15-year retrospective analysis of consecutive adult burn patients admitted to a single specialized centre. EXCLUSION CRITERIA: death or length of stay <10 days, age <16 years. VARIABLES: demographic variables, burned surface (TBSA), severity scores, infections, ICU stay, outcome. Metabolic variables: total energy, carbohydrate and insulin delivery per 24 h, arterial blood glucose and CRP values. Four periods were analysed: 1, before protocol; 2, tight, doctor-driven; 3, tight, nurse-driven; 4, moderate, nurse-driven. RESULTS: 229 patients, aged 45±20 years (mean±SD), with burns over 32±20% TBSA, were analyzed. SAPS II was 35±13. TBSA, Ryan and ABSI scores remained stable; inhalation injury increased. A total of 28,690 blood glucose samples were analyzed: the median value remained unchanged, with a narrower distribution over time. After protocol initiation, the proportion of normoglycemic values increased from 34.7% to 65.9%, with a reduction of hypoglycemic events (no extreme hypoglycemia in period 4). Severe hyperglycemia persisted throughout, with a decrease in period 4 (9.25%). Energy and glucose deliveries decreased in periods 3 and 4 (p<0.0001). Infectious complications increased during the last 2 periods (p=0.01). CONCLUSION: A standardized ICU glucose control protocol improved glycemic control in adult burn patients, reducing glucose variability. Moderate glycemic control in burns was safe with respect to hypoglycemia, reducing the incidence of hypoglycemic events compared to the period before. Hyperglycemia persisted, at a lower level.
Abstract:
Plants synthesize a myriad of isoprenoid products that are required both for essential constitutive processes and for adaptive responses to the environment. The enzyme 3-hydroxy-3-methylglutaryl-CoA reductase (HMGR) catalyzes a key regulatory step of the mevalonate pathway for isoprenoid biosynthesis and is modulated by many endogenous and external stimuli. In spite of that, no protein factor interacting with and regulating plant HMGR in vivo has been described so far. Here, we report the identification of two B″ regulatory subunits of protein phosphatase 2A (PP2A), designated B″α and B″β, that interact with HMGR1S and HMGR1L, the major isoforms of Arabidopsis thaliana HMGR. B″α and B″β are Ca2+ binding proteins of the EF-hand type. We show that HMGR transcript, protein, and activity levels are modulated by PP2A in Arabidopsis. When seedlings are transferred to salt-containing medium, B″α and PP2A mediate the decrease and subsequent increase of HMGR activity, which results from a steady rise of HMGR1-encoding transcript levels and an initial, sharper reduction of HMGR protein level. In unchallenged plants, PP2A is a posttranslational negative regulator of HMGR activity, with the participation of B″β. Our data indicate that PP2A exerts multilevel control on HMGR through the five-member B″ protein family during normal development and in response to a variety of stress conditions.
Abstract:
Behavior-based navigation of autonomous vehicles requires recognition of the navigable areas and the potential obstacles. In this paper we describe a model-based object recognition system which is part of an image interpretation system intended to assist the navigation of autonomous vehicles operating in industrial environments. The recognition system integrates color, shape and texture information together with the location of the vanishing point. The recognition process starts from some prior scene knowledge, that is, a generic model of the expected scene and the potential objects. The recognition system constitutes an approach in which different low-level vision techniques extract a multitude of image descriptors which are then analyzed by a rule-based reasoning system to interpret the image content. The system has been implemented as a rule-based cooperative expert system.
Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray-level images are presented, and as a new application, distance transforms are applied to gray-level image compression. The new distance transforms are both extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been adapted to calculate a chessboard-like distance transform with integer values (DTOCS) and a real-valued distance transform (EDTOCS) on gray-level images. Both distance transforms, the DTOCS and the EDTOCS, require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3 to 10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance function algorithms (GRAYMAT, etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map, in which the weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray-level differences in a different way: it propagates local Euclidean distances inside a kernel.
Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are shown. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
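The two-pass raster scheme described above can be sketched roughly as follows. This is not the thesis's exact DTOCS definition; for illustration it assumes a step cost of one plus the absolute gray-level difference between 8-neighbors, and a seed set standing in for the binary image that defines the region of calculation:

```python
def dtocs(gray, seeds):
    """Two-pass gray-weighted chessboard-style distance map (rough sketch).

    gray:  2D list of gray values.
    seeds: set of (row, col) points assigned distance 0.
    Step cost between 8-neighbors p and q: |gray[p] - gray[q]| + 1,
    so on a flat image this reduces to the plain chessboard distance.
    """
    INF = float("inf")
    rows, cols = len(gray), len(gray[0])
    dist = [[0.0 if (r, c) in seeds else INF for c in range(cols)]
            for r in range(rows)]

    def relax(r, c, nbrs):
        # Update dist[r][c] from already-visited neighbors of this pass.
        for dr, dc in nbrs:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = abs(gray[r][c] - gray[nr][nc]) + 1
                if dist[nr][nc] + step < dist[r][c]:
                    dist[r][c] = dist[nr][nc] + step

    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]   # upper-left half-mask
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]       # lower-right half-mask
    for r in range(rows):                          # forward raster pass
        for c in range(cols):
            relax(r, c, fwd)
    for r in range(rows - 1, -1, -1):              # backward raster pass
        for c in range(cols - 1, -1, -1):
            relax(r, c, bwd)
    return dist
```

As the thesis notes for the real algorithm, complicated images may need the two passes repeated until the map stabilizes; the single forward/backward sweep above suffices for simple cases.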
Abstract:
We present an overview of the long-term adaptation of hippocampal neurotransmission to cholinergic and GABAergic deafferentation caused by excitotoxic lesion of the medial septum. Two months after septal microinjection of 2.7 nmol α-amino-3-hydroxy-5-methylisoxazole-4-propionate (AMPA), a 220% increase of GABA-A receptor labelling in the hippocampal CA3 and the hilus was shown, as well as changes in hippocampal neurotransmission characterised by in vivo microdialysis and HPLC. Basal amino acid and purine extracellular levels were studied in control and lesioned rats. The in vivo effects of 100 mM KCl perfusion and of adenosine A1 receptor blockade with 1,3-dipropyl-8-cyclopentylxanthine (DPCPX) on their release were also investigated. In lesioned animals GABA, glutamate and glutamine basal levels were decreased and taurine, adenosine and uric acid levels increased. A similar response to KCl infusion occurred in both groups, except for GABA and glutamate, whose release decreased in lesioned rats. Only in lesioned rats did DPCPX increase the GABA basal level and KCl-induced glutamate release, and decrease glutamate turnover. Our results evidence that an excitotoxic septal lesion leads to increased hippocampal GABA-A receptors and decreased glutamate neurotransmission. In this situation, a co-ordinated response of hippocampal retaliatory systems takes place to control neuron excitability.
Abstract:
The productivity, quality and cost efficiency of welding work are critical for the metal industry today. Welding processes must become more effective, which can be achieved through mechanization and automation. Such systems are always expensive and have to pay back the investment. It is therefore important to optimize the required intelligence, and thereby the required automation level, so that a company obtains the best profit. This intelligence and automation level was earlier classified in several different ways that are not useful for optimizing the automation or mechanization of welding. In this study, the intelligence of a welding system is defined in a new way, to enable the welding system to produce a weld that is good enough. A new way is developed to classify and select the internal intelligence level of a welding system needed to produce the weld efficiently. This classification covers the possible need for human work and its effect on the weld and its quality, but does not exclude any welding processes or methods. A totally new way is also developed to calculate the optimal intelligence level needed in welding. The target of this optimization is the best possible productivity and quality, combined with an economically optimized solution, for several different cases. The optimization method is based on product type, economic productivity, product batch size, quality, and criteria of usage. Intelligence classification and optimization have never before been based on the product being made. It is now possible to find the best type of welding system needed to weld different types of products. This calculation process is a universal way of optimizing the needed automation or mechanization level when improving the productivity of welding. This study helps industry to improve the productivity, quality and cost efficiency of welding workshops.
Abstract:
Background: Assessing the costs of treating disease is necessary to demonstrate cost-effectiveness and to estimate the budget impact of new interventions and therapeutic innovations. However, there are few comprehensive studies on resource use and costs associated with lung cancer patients in clinical practice in Spain or internationally. The aim of this paper was to assess the hospital cost associated with lung cancer diagnosis and treatment by histology, type of cost and stage at diagnosis in the Spanish National Health Service. Methods: A retrospective, descriptive analysis of resource use and a direct medical cost analysis were performed. Resource utilisation data were collected from patient files at nine teaching hospitals. From a hospital budget impact perspective, the aggregate and mean costs per patient were calculated over the first three years following diagnosis or up to death. Both aggregate and mean costs per patient were analysed by histology, stage at diagnosis and cost type. Results: A total of 232 cases of lung cancer were analysed, of which 74.1% corresponded to non-small cell lung cancer (NSCLC) and 11.2% to small cell lung cancer (SCLC); 14.7% had no cytohistologic confirmation. The mean cost per patient in NSCLC ranged from 13,218 Euros in Stage III to 16,120 Euros in Stage II. The main cost components were chemotherapy (29.5%) and surgery (22.8%). Advanced disease stages were associated with a decrease in the relative weight of surgical and inpatient care costs but an increase in chemotherapy costs. In SCLC patients, the mean cost per patient was 15,418 Euros for limited disease and 12,482 Euros for extensive disease. The main cost components were chemotherapy (36.1%) and other inpatient costs (28.7%). In both groups, the Kruskal-Wallis test did not show statistically significant differences in mean cost per patient between stages.
Conclusions: This study provides the costs of lung cancer treatment based on patient file reviews, with chemotherapy and surgery accounting for the major components of costs. This cost analysis is a baseline study that will provide a useful source of information for future studies on cost-effectiveness and on the budget impact of different therapeutic innovations in Spain.
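The between-stage comparison above relies on the Kruskal-Wallis test. As a minimal illustration of how its H statistic is computed (average ranks for tied values, omitting the tie-correction factor; the sample values below are invented, not the study's cost data):

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples.

    H = 12 / (N (N + 1)) * sum_i R_i^2 / n_i - 3 (N + 1),
    where R_i is the rank sum of group i over the pooled sample of size N.
    Ties get average ranks; the tie-correction divisor is omitted here.
    """
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(pooled)
    rank_of = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j < n and pooled[j][0] == pooled[i][0]:
            j += 1                      # run of tied values: positions i..j-1
        avg_rank = (i + 1 + j) / 2.0    # average of ranks i+1 .. j
        for k in range(i, j):
            rank_of[k] = avg_rank
        i = j
    rank_sums = [0.0] * len(groups)
    for (_, gi), r in zip(pooled, rank_of):
        rank_sums[gi] += r
    return (12.0 / (n * (n + 1))
            * sum(rs * rs / len(g) for rs, g in zip(rank_sums, groups))
            - 3 * (n + 1))
```

The statistic is then compared against a chi-squared distribution with k - 1 degrees of freedom to obtain the p-value; a non-significant result, as reported above, means the rank distributions of cost across stages are statistically indistinguishable.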