830 results for Interface algorithms
Abstract:
The variability observed in drug exposure has a direct impact on the overall response to a drug. The largest part of the variability between dose and drug response resides in the pharmacokinetic phase, i.e. in the dose-concentration relationship. Among the possibilities offered to clinicians, Therapeutic Drug Monitoring (TDM; monitoring of drug concentration measurements) is one of the useful tools to guide pharmacotherapy. TDM aims at optimizing treatment by individualizing dosage regimens based on blood drug concentration measurements. Bayesian calculations, relying on the population pharmacokinetic approach, currently represent the gold-standard TDM strategy. However, they require expertise and computational assistance, thus limiting their wide implementation in routine patient care. The overall objective of this thesis was to implement robust tools to provide Bayesian TDM to clinicians in modern routine patient care. To that end, the aims were (i) to develop an efficient and ergonomic computer tool for Bayesian TDM, EzeCHieL; (ii) to provide algorithms for Bayesian forecasting of drug concentrations and for software validation, relying on population pharmacokinetics; and (iii) to address relevant issues encountered in clinical practice, with a focus on neonates and drug adherence. First, the current state of existing software was reviewed, which allowed specifications to be established for the development of EzeCHieL. Then, in close collaboration with software engineers, a fully integrated software application, EzeCHieL, was developed. EzeCHieL provides population-based predictions, Bayesian forecasting and an easy-to-use interface. It makes it possible to assess how expected an observed concentration is in a patient compared with the whole population (via percentiles), to assess the suitability of the predicted concentration relative to the target concentration, and to provide dosing adjustment. It thus allows a priori and a posteriori Bayesian individualization of drug dosing. Implementation of Bayesian methods requires characterisation of drug disposition and quantification of variability through the population approach. Population pharmacokinetic analyses were performed and Bayesian estimators were provided for candidate drugs in populations of interest: anti-infective drugs administered to neonates (gentamicin and imipenem). The developed models were implemented in EzeCHieL and also served as a validation tool, with EzeCHieL concentration predictions compared against predictions from the reference software (NONMEM®). The models used need to be adequate and reliable; for instance, extrapolation from adults or children to neonates is not possible. Therefore, this work proposes models for neonates based on the concept of developmental pharmacokinetics. Patient adherence is also an important concern for drug model development and for a successful outcome of pharmacotherapy. A final study assessed the impact of routine patient adherence measurement on model definition and TDM interpretation. In conclusion, our results offer solutions to assist clinicians in interpreting blood drug concentrations and to improve the appropriateness of drug dosing in routine clinical practice.
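As a minimal sketch of the a posteriori (MAP) Bayesian step described above, the snippet below fits individual pharmacokinetic parameters from a single measured concentration, assuming a one-compartment IV-bolus model with log-normal priors. All parameter values, drug doses and function names are illustrative assumptions, not the models or code used in EzeCHieL.

```python
# Minimal sketch of a posteriori (MAP) Bayesian forecasting for TDM.
# Assumes a one-compartment IV-bolus model with log-normal priors on CL and V;
# all numbers and names are illustrative, not those used in EzeCHieL.
import numpy as np
from scipy.optimize import minimize

# Hypothetical population priors (typical values and between-subject variances of log-parameters)
POP_LOG_CL, POP_LOG_V = np.log(5.0), np.log(30.0)   # CL in L/h, V in L
OMEGA2_CL, OMEGA2_V = 0.09, 0.04                    # variances of log CL and log V
SIGMA = 0.5                                         # additive residual SD (mg/L)

def predict_conc(log_cl, log_v, dose, times):
    """Concentration after a single IV bolus at the given times (one-compartment model)."""
    cl, v = np.exp(log_cl), np.exp(log_v)
    return (dose / v) * np.exp(-(cl / v) * times)

def map_objective(theta, dose, times, obs):
    """Negative log-posterior: residual misfit plus deviation from the population prior."""
    log_cl, log_v = theta
    pred = predict_conc(log_cl, log_v, dose, times)
    loglik = np.sum((obs - pred) ** 2) / SIGMA ** 2
    prior = (log_cl - POP_LOG_CL) ** 2 / OMEGA2_CL + (log_v - POP_LOG_V) ** 2 / OMEGA2_V
    return loglik + prior

# Example: one observed trough concentration after a hypothetical 500 mg bolus
times = np.array([12.0])          # hours after dose
obs = np.array([1.8])             # measured concentration, mg/L
fit = minimize(map_objective, x0=[POP_LOG_CL, POP_LOG_V], args=(500.0, times, obs))
cl_hat, v_hat = np.exp(fit.x)
print(f"Individual MAP estimates: CL = {cl_hat:.2f} L/h, V = {v_hat:.1f} L")
```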
Abstract:
Background: Nursing terminologies are designed to support nursing practice but, as with any other clinical tool, they should be evaluated. Cross-mapping is a formal method for examining the validity of existing controlled vocabularies. Objectives: The study aims to assess the inclusiveness and expressiveness of the nursing diagnosis axis of a newly implemented interface terminology by cross-mapping it with the NANDA-I taxonomy. Design/Methods: The study applied a descriptive design, using a cross-sectional, bidirectional mapping strategy. The sample included 728 concepts from both vocabularies. Concept cross-mapping was carried out to identify one-to-one, negative, and hierarchical connections. The analysis was conducted using descriptive statistics. Results: Agreement of the raters' mapping reached 97%. More than 60% of the nursing diagnosis concepts in the NANDA-I taxonomy were mapped to concepts in the diagnosis axis of the new interface terminology; 71.1% were reversely mapped. Conclusions: The main results for outcome measures suggest that the diagnosis axis of this interface terminology meets the validity criterion of cross-mapping when mapped from and to the NANDA-I taxonomy.
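The descriptive analysis described above can be illustrated with a small sketch that tallies the mapping categories assigned by the raters. The concept list and category labels below are purely illustrative, not the study's 728-concept sample.

```python
# Minimal sketch of the descriptive cross-mapping analysis described above.
# The mapping table is purely illustrative; real data would come from rater worksheets.
from collections import Counter

# Each entry: (NANDA-I concept, mapping category assigned by the raters)
mappings = [
    ("Acute pain", "one-to-one"),
    ("Risk for falls", "one-to-one"),
    ("Impaired gas exchange", "hierarchical"),
    ("Spiritual distress", "negative"),   # no counterpart found in the interface terminology
]

counts = Counter(category for _, category in mappings)
total = len(mappings)
for category, n in counts.items():
    print(f"{category}: {n} ({100 * n / total:.1f}%)")

# Share of concepts with any match (one-to-one or hierarchical), analogous to the >60% figure
matched = counts["one-to-one"] + counts["hierarchical"]
print(f"Mapped concepts: {100 * matched / total:.1f}%")
```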
Abstract:
Jatkuvasti lisääntyvä matkapuhelinten käyttäjien määrä, internetin kehittyminen yleiseksi tiedon ja viihteen lähteeksi on luonut tarpeen palvelulle liikkuvan työaseman liittämiseksi tietokoneverkkoihin. GPRS on uusi teknologia, joka tarjoaa olemassa olevia matka- puhelinverkkoja (esim. NMT ja GSM) nopeamman, tehokkaamman ja taloudellisemman liitynnän pakettidataverkkoihin, kuten internettiin ja intranetteihin. Tämän työn tavoitteena oli toteuttaa GPRS:n paketinohjausyksikön (Packet Control Unit, PCU) testauksessa tarvittavat viestintäajurit työasemaympristöön. Aidot matkapuhelinverkot ovat liian kalliita, eikä niistä saa tarvittavasti lokitulostuksia, jotta niitä voisi käyttää GPRS:n testauksessa ohjelmiston kehityksen alkuvaihessa. Tämän takia PCU-ohjelmiston testaus suoritetaan joustavammassa ja helpommin hallittavassa ympäristössä, joka ei aseta kovia reaaliaikavaatimuksia. Uusi toimintaympäristö ja yhteysmedia vaativat PCU:n ja muiden GPRS-verkon yksiköiden välisistä yhteyksistä huolehtivien ohjelman osien, viestintäajurien uuden toteutuksen. Tämän työn tuloksena syntyivät tarvittavien viestintäajurien työasemaversiot. Työssä tarkastellaan eri tiedonsiirtotapoja ja -protokollia testattavan ohjelmiston vaateiden, toteutetun ajurin ja testauksen kannalta. Työssä esitellään kunkin ajurin toteuttama rajapinta ja toteutuksen aste, eli mitkä toiminnot on toteutettu ja mitä on jätetty pois. Ajureiden rakenne ja toiminta selvitetään siltä osin, kuin se on oleellista ohjelman toiminnan kannalta.
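The thesis does not publish its driver API here, so the sketch below is only a hypothetical illustration of the kind of interface a workstation-only communication driver might expose, with a local UDP socket standing in for the real transmission medium. The class and method names are assumptions made for the example.

```python
# Hypothetical sketch of a workstation communication driver of the kind described above.
# The class and method names are illustrative; the actual driver interface is not shown here.
import socket
from abc import ABC, abstractmethod

class CommDriver(ABC):
    """Common interface: the unit under test sends and receives opaque message frames."""

    @abstractmethod
    def send(self, frame: bytes) -> None: ...

    @abstractmethod
    def receive(self, timeout: float) -> bytes | None: ...

class UdpLoopbackDriver(CommDriver):
    """Workstation substitute for the real transmission medium, using local UDP sockets."""

    def __init__(self, local_port: int, peer_port: int) -> None:
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.bind(("127.0.0.1", local_port))
        self.peer = ("127.0.0.1", peer_port)

    def send(self, frame: bytes) -> None:
        self.sock.sendto(frame, self.peer)

    def receive(self, timeout: float) -> bytes | None:
        self.sock.settimeout(timeout)
        try:
            frame, _ = self.sock.recvfrom(65535)
            return frame
        except socket.timeout:
            return None
```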
Abstract:
BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident and older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infections among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Utilizing these data, annual incident infection estimates were obtained in two ways: (i) based on the diagnostic performance of the algorithms, utilizing the relationship 'incident = true incident + false incident'; and (ii) based on the window periods of the algorithms, utilizing the relationship 'Prevalence = Incidence x Duration'. From 2008 to 2013, 3'851 HIV notifications were received. Adult HIV-1 infections amounted to 3'809 cases, and 3'636 of them (95.5%) contained Inno-Lia data. The incident infection totals calculated were similar for the performance- and window-based methods, amounting on average to 1'755 (95% confidence interval, 1588-1923) and 1'790 cases (95% CI, 1679-1900), respectively. More than half of these were among men who have sex with men. Both methods showed a continuous decline of annual incident infections over 2008-2013, totaling -59.5% and -50.2%, respectively. The decline of incident infections continued even in 2012, when a 15% increase in HIV notifications was observed; this increase was entirely due to older infections. Overall declines during 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia-based incident HIV-1 infection surveillance proved useful and reliable. It represents a free, additional public health benefit of the use of this relatively costly test for HIV confirmation and type differentiation.
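The two estimation principles quoted above can be made concrete with a short sketch: (i) correcting the raw "incident" classification by the algorithm's diagnostic performance, and (ii) scaling the number of recent cases by the algorithm's window period via Prevalence = Incidence x Duration. The case counts, sensitivity, specificity and window length below are illustrative assumptions, not the Swiss surveillance figures.

```python
# Minimal sketch of the two estimation principles described above, with illustrative numbers
# (not the Swiss surveillance data): (i) correcting the raw classification by the algorithm's
# diagnostic performance, and (ii) scaling by the algorithm's window period.

def performance_based(n_total, n_classified_incident, sensitivity, specificity):
    """Solve 'classified incident = true*sens + (total - true)*(1 - spec)' for the true count."""
    false_rate = 1.0 - specificity
    return (n_classified_incident - n_total * false_rate) / (sensitivity - false_rate)

def window_based(n_classified_incident, window_months):
    """Prevalence = Incidence x Duration, so annual incidence = recent cases / window in years."""
    return n_classified_incident * 12.0 / window_months

n_total = 600    # newly diagnosed HIV-1 cases with Inno-Lia data in one year (illustrative)
n_recent = 290   # cases the algorithm classifies as incident (<= 12 months)
print(performance_based(n_total, n_recent, sensitivity=0.85, specificity=0.95))
print(window_based(n_recent, window_months=12.0))
```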
Abstract:
We present a brief résumé of the history of solidification research and the key factors affecting the solidification of fusion welds. There is general agreement on the basic solidification theory, although differing, even confusing, nomenclatures exist, and Cases 2 and 3 (the Chalmers basic boundary conditions for solidification, categorized by Savage as Cases) are variably emphasized. Model Frame, a tool to help model the continuum of fusion weld solidification from start to end, is proposed. It incorporates the general solidification models, of which the pertinent ones are selected for the actual modeling. The basic models are the main solidification Cases 1…4. These discrete Cases are joined by Sub-Cases: models of Pfann, Flemings and others, bringing the needed Sub-Case variables into the model. Model Frame depicts a grain growing from the weld interface to its centerline. Besides modeling, Model Frame supports education and academic debate. New mathematical modeling techniques will extend its use into multi-dimensional modeling, introducing new variables and increasing modeling accuracy. We propose a model, the melting/solidification model (M/S-model), predicting the solute profile at the start of the solidification of a fusion weld. This Case 3-based Sub-Case takes into account the melting stage, the solute back-diffusion in the solid, and the growth-rate acceleration typical of fusion welds. We propose, based on the works of Rutter & Chalmers, David & Vitek and our experimental results on copper, that the NEGS-EGS transition is not associated only with the cellular-dendritic transition. Solidification was studied experimentally on pure and doped copper over a welding speed range from 0 to 200 cm/min, with one test at 3000 cm/min. Only planar and cellular structures were found; no dendrites, columnar or equiaxed. Cell substructures (rows of cubic elements we call "cubelettes", "cell-bands" and "micro-cells") and an anomalous crack morphology ("crack-eye") were detected, as well as microscopic hot-crack nuclei we call "grain-lag cracks", caused by a grain slightly lagging behind its neighbors in arriving at the weld centerline. The Varestraint test and R-test revealed a change of crack morphologies from centerline cracks to grain- and cell-boundary cracks with increasing welding speed. High speed made the cracks invisible to the naked eye and hardly detectable with a light microscope, while the electron microscope often revealed networks of fine micro-cracks.
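As one example of the kind of Sub-Case solute-redistribution model referred to above, the classical Scheil-Gulliver relation (no back-diffusion in the solid, complete mixing in the liquid) is shown below. It is given only for orientation; the M/S-model proposed in the abstract additionally accounts for the melting stage, solid-state back-diffusion and growth-rate acceleration.

```latex
% Scheil-Gulliver solute redistribution, an example of a Sub-Case-type model:
\[
  C_s = k\,C_0\,(1 - f_s)^{\,k-1},
\]
% where $C_s$ is the solute concentration in the solid forming at solid fraction $f_s$,
% $C_0$ the nominal alloy composition, and $k$ the equilibrium partition coefficient.
```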
Abstract:
BACKGROUND: The lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple-breath washout (MBW) results of the current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF, the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all of which showed LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values using both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was that the end of test was incorrectly recognized too early during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes. Pediatr Pulmonol. 2015; 50:970-977.
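Since the key issue above is where the end of the washout is declared, the sketch below shows a generic LCI calculation using the common criterion that the end-tidal tracer concentration must fall below 1/40th of its starting value for several consecutive breaths. The breath data, threshold confirmation rule and FRC value are illustrative; this is not the algorithm of any particular commercial software package.

```python
# Minimal sketch of end-of-test detection in a multiple-breath washout and the resulting LCI,
# using the common 1/40th-of-starting-concentration criterion; breath data are illustrative
# and this is not the algorithm of any particular software package.
def lung_clearance_index(end_tidal_conc, expired_volumes, frc, n_confirm=3):
    """end_tidal_conc: tracer concentration at the end of each breath (fractional);
    expired_volumes: expired volume per breath (L); frc: functional residual capacity (L)."""
    threshold = end_tidal_conc[0] / 40.0
    below = 0
    cumulative_volume = 0.0
    for conc, vol in zip(end_tidal_conc, expired_volumes):
        cumulative_volume += vol
        below = below + 1 if conc <= threshold else 0
        if below >= n_confirm:      # washout complete only after several confirming breaths
            return cumulative_volume / frc
    return None                     # completion criterion never reached

# Illustrative example
conc = [0.04, 0.03, 0.02, 0.012, 0.006, 0.002, 0.0009, 0.0008, 0.0007]
vols = [0.05] * len(conc)           # infant tidal volumes, L
print(lung_clearance_index(conc, vols, frc=0.10))
```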
Abstract:
CoCo is a collaborative web interface for the compilation of linguistic resources. In this demo we present one of its possible applications: paraphrase acquisition.
Abstract:
The interface of MgO/Ag(001) has been studied with density functional theory applied to slabs. We have found that regular MgO films show small adhesion to the silver substrate; the binding can be increased in off-stoichiometric regimes, either by the presence of O vacancies in the oxide film or by a small excess of O atoms at the interface between the ceramic and the metal. By means of theoretical methods, the scanning tunneling microscopy signatures of these films are also analyzed in some detail. For defect-free deposits containing 1 or 2 ML and at low voltages, tunnelling takes place from the surface of the Ag substrate, and at large positive voltages Mg atoms are imaged. If defects (oxygen vacancies) are present on the surface of the oxide, they introduce much easier channels for tunnelling, resulting in big protrusions and controlling the shape of the image; the extra O stored at the interface can also be detected for very thin films.
Abstract:
Care and follow-up for people with mental disabilities suffering from psychological disorders, therefore at the interface between the socio-educational and psychiatric fields, represent complex challenges in terms of interprofessional collaboration. In the canton of Vaud, the caregivers involved in this issue have been trying for many years to build multidisciplinary networks in order to improve exchange between professionals and develop the skills and knowledge needed to improve the recipients' well-being. This work thus proposes to study and question these working methods in a sociocultural perspective (Vygotski, 1934/1997), so as to understand how they operate, highlight their inherent mechanisms and provide actionable insights to the professionals. It is based on fieldwork conducted among members of the Dispositif de Collaboration Psychiatrie Handicap Mental (DCPHM) of the Psychiatry Department at the CHUV University Hospital in Lausanne, whose main mission is to facilitate collaboration between the socio-educational and psychiatric institutions specialising in the follow-up of people presenting with both mental handicap and psychiatric disorder. The empirical work is based on a qualitative and comprehensive approach to social interactions and proceeds through an in-depth field study. The data collected are varied: field notes and documentation, recordings of team meetings within the DCPHM and of network meetings, and various types of interviews. The analysis shows that the collaborative work that falls to the team consists of obstacles, all of which provide opportunities for professional development and identity construction. The results highlight discursive mechanisms of categorisation which contribute both to the construction of the patients as objects of activity and to building a position that legitimates the team's interventions in the socio-educational and psychiatric landscape of canton Vaud and puts it at the centre of the professional arena.
Abstract:
Problems related to fire hazard and fire management have become, in recent decades, among the most relevant issues in the Wildland-Urban Interface (WUI), that is, the area where human infrastructures meet or intermingle with natural vegetation. In this paper we develop a robust geospatial method for defining and mapping the WUI in the Alpine environment, where most interactions between infrastructures and wildland vegetation concern fire ignition through human activities, whereas no significant threats exist for infrastructures due to contact with burning vegetation. We used the three Alpine Swiss cantons of Ticino, Valais and Grisons as the study area. The features representing anthropogenic infrastructures (the urban or infrastructural components of the WUI) as well as forest-cover-related features (the wildland component of the WUI) were selected from the Swiss Topographic Landscape Model (TLM3D). Georeferenced forest fire occurrences derived from the WSL Swissfire database were used to define suitable WUI interface distances. The Random Forest algorithm was applied to estimate the importance of predictor variables for fire ignition occurrence. This revealed that buildings and drivable roads are the most relevant anthropogenic components with respect to fire ignition. We consequently defined the combination of drivable roads and easily accessible buildings (i.e. within 100 m of the nearest drivable road) as the WUI-relevant infrastructural component. For the definition of the interface (buffer) distance between the WUI infrastructural and wildland components, we computed the empirical cumulative distribution functions (ECDF) of the percentage of ignition points (observed and simulated) arising at increasing distances from the selected infrastructures. The ECDF facilitates the calculation of both the distance at which a given percentage of ignition points occurred and, in turn, the amount of forest area covered at a given distance. Finally, we developed a GIS ModelBuilder routine to map the WUI for the selected buffer distance. The approach was found to be reproducible, robust (based on statistical analyses for evaluating parameters) and flexible (buffer distances depending on the targeted final area covered), so that fire managers may use it to delineate the WUI according to their specific priorities.
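The ECDF-based choice of buffer distance described above can be sketched in a few lines: given the distances from ignition points to the nearest relevant infrastructure, find the smallest buffer capturing a target share of ignitions. The simulated distances and the 90% target below are illustrative assumptions, not the Swiss TLM3D/Swissfire analysis.

```python
# Minimal sketch of the ECDF-based choice of the WUI buffer distance: given distances from
# ignition points to the nearest relevant infrastructure, find the buffer that captures a
# target share of ignitions. Distances are illustrative, not the Swiss data.
import numpy as np

def buffer_distance(ignition_distances_m, target_share=0.9):
    """Smallest distance d such that ECDF(d) >= target_share of ignition points."""
    d = np.sort(np.asarray(ignition_distances_m, dtype=float))
    ecdf = np.arange(1, d.size + 1) / d.size
    return d[np.searchsorted(ecdf, target_share)]

rng = np.random.default_rng(0)
distances = rng.exponential(scale=60.0, size=500)   # metres from the nearest building/road
print(f"Buffer capturing 90% of ignitions: {buffer_distance(distances):.0f} m")
```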
Abstract:
Due to problems of source contamination and wear of instrument components caused by the direct insertion probe technique, a new way of introducing low-volatility compounds into the mass spectrometer was tested. This new scheme comprises the introduction of solutions of the low-volatility compounds via a six-port valve connected to a particle beam interface. Solutions of isatin were injected into this system and the best results were obtained with CH2Cl2, CH3OH and CH3CN. The solution inlet system was shown to be advantageous over the conventional direct insertion probe introduction.
Abstract:
Network virtualisation is considerably gaining attention as a solution to the ossification of the Internet. However, the success of network virtualisation will depend in part on how efficiently the virtual networks utilise substrate network resources. In this paper, we propose a machine learning-based approach to virtual network resource management. We propose to model the substrate network as a decentralised system and introduce a learning algorithm in each substrate node and substrate link, providing self-organization capabilities. We propose a multiagent learning algorithm that carries out the substrate network resource management in a coordinated and decentralised way. The task of these agents is to use evaluative feedback to learn an optimal policy so as to dynamically allocate network resources to virtual nodes and links. The agents ensure that while the virtual networks have the resources they need at any given time, only the required resources are reserved for this purpose. Simulations show that our dynamic approach significantly improves the virtual network acceptance ratio and the maximum number of accepted virtual network requests at any time while ensuring that virtual network quality of service requirements such as packet drop rate and virtual link delay are not affected.
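A hypothetical, much-simplified sketch of the kind of per-substrate-node learning agent described above is given below: a tabular Q-learning agent that chooses how much of the requested capacity to reserve, using evaluative feedback. The state encoding, action set and reward function are illustrative assumptions, not the paper's design.

```python
# Hypothetical, much-simplified sketch of a per-substrate-node learning agent: tabular
# Q-learning over reservation levels, driven by evaluative feedback. States, actions and
# rewards are illustrative, not the design of the paper.
import random
from collections import defaultdict

class SubstrateNodeAgent:
    def __init__(self, actions=(0.25, 0.5, 0.75, 1.0), alpha=0.1, gamma=0.9, epsilon=0.1):
        self.actions = actions                  # fraction of requested capacity actually reserved
        self.q = defaultdict(float)             # Q[(state, action)]
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def choose(self, state):
        if random.random() < self.epsilon:      # explore
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])   # exploit

    def update(self, state, action, reward, next_state):
        best_next = max(self.q[(next_state, a)] for a in self.actions)
        target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (target - self.q[(state, action)])

# Illustrative loop: the reward penalises both over-reservation and under-provisioning.
agent = SubstrateNodeAgent()
state = "load_low"
for step in range(1000):
    action = agent.choose(state)
    utilisation = random.uniform(0.2, 0.9)         # stand-in for the observed virtual-node demand
    reward = -abs(action - utilisation)            # closer reservation to actual need is better
    next_state = "load_high" if utilisation > 0.6 else "load_low"
    agent.update(state, action, reward, next_state)
    state = next_state
```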
Abstract:
In the literature on housing market areas, different approaches to defining them can be found, for example using travel-to-work areas and, more recently, making use of migration data. Here we propose a simple exercise to shed light on which approach performs better. Using regional data from Catalonia, Spain, we have computed housing market areas with both commuting data and migration data. In order to decide which procedure shows superior performance, we have looked at the uniformity of prices within areas. The main finding is that commuting-based algorithms produce more homogeneous areas in terms of housing prices.
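The comparison criterion described above can be sketched as follows: for each alternative delineation, compute the average within-area dispersion of housing prices and prefer the delineation with the lower dispersion. The data frame, column names and coefficient-of-variation measure below are illustrative assumptions, not the Catalan data or the paper's exact statistic.

```python
# Minimal sketch of the comparison criterion described above: for two alternative delineations
# (commuting-based vs migration-based), compute the average within-area dispersion of housing
# prices; the delineation with lower dispersion yields more homogeneous areas.
# The data frame and column names are illustrative.
import pandas as pd

def mean_within_area_cv(prices: pd.DataFrame, area_col: str) -> float:
    """Average coefficient of variation of prices across the areas of one delineation."""
    grouped = prices.groupby(area_col)["price"]
    cv = grouped.std() / grouped.mean()
    return cv.mean()

# Hypothetical municipality-level data: price plus the area each delineation assigns it to
data = pd.DataFrame({
    "price":          [1800, 1750, 2400, 2350, 1200, 1300],
    "commuting_area": ["A", "A", "B", "B", "C", "C"],
    "migration_area": ["A", "B", "B", "C", "C", "A"],
})
print("commuting:", mean_within_area_cv(data, "commuting_area"))
print("migration:", mean_within_area_cv(data, "migration_area"))
```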