931 results for change-point detection


Relevance: 30.00%

Abstract:

In order to improve the efficacy and safety of treatments, drug dosage needs to be adjusted to the actual needs of each patient in a truly personalized medicine approach. Key to widespread dosage adjustment is the availability of point-of-care devices able to measure plasma drug concentration in a simple, automated, and cost-effective fashion. In the present work, we introduce and test a portable, palm-sized transmission-localized surface plasmon resonance (T-LSPR) setup, built from off-the-shelf components and coupled with DNA-based aptamers specific to the antibiotic tobramycin (467 Da). The core of the T-LSPR setup is the biosensor: aptamer-functionalized gold nanoislands (NIs) deposited on a glass slide covered with fluorine-doped tin oxide (FTO). The gold NIs exhibit localized plasmon resonance in the visible range, matching the sensitivity of the complementary metal oxide semiconductor (CMOS) image sensor employed as the light detector. Depositing gold NIs on the FTO substrate causes irregularity in NI size and pattern, which might reduce the overall sensitivity but confers extremely high stability in solutions of high ionic strength, allowing the sensor to withstand numerous regeneration cycles without sensing losses. With this rather simple T-LSPR setup, we show real-time label-free detection of tobramycin in buffer, measuring concentrations down to 0.5 μM. We determined an affinity constant for the aptamer-tobramycin pair consistent with the value obtained using a commercial propagating-wave SPR instrument. Moreover, our label-free system can detect tobramycin in filtered undiluted blood serum, measuring concentrations down to 10 μM with a theoretical detection limit of 3.4 μM. While the association signal of tobramycin onto the aptamer is masked by the serum injection, the captured tobramycin can be quantified during the dissociation phase, yielding a linear calibration curve over the tested concentration range (10-80 μM). The plasmon shift following surface binding is calculated in terms of both plasmon peak location and hue, with the latter allowing faster data processing and real-time display of the results. The presented T-LSPR system shows, for the first time, label-free direct detection and quantification of a small molecule in the complex matrix of filtered undiluted blood serum. Its uncomplicated construction and compact size, together with its remarkable performance, represent a leap forward toward effective point-of-care devices for therapeutic drug concentration monitoring.
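
As an aside on the hue-based readout mentioned above: converting the mean RGB of the sensor's region of interest into a hue angle is a one-line operation, which is why it allows faster processing than spectral peak fitting. A minimal sketch of that idea follows; the RGB values are hypothetical, not taken from the paper.

# Sketch: track the plasmon resonance via hue rather than spectral peak
# fitting. Mean R, G, B of the sensor region of interest -> hue angle,
# which shifts as the resonance moves. All values are hypothetical.
import colorsys

def frame_hue(r, g, b):
    """Hue in [0, 1) from mean 8-bit RGB of the region of interest."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return h

baseline = frame_hue(142, 96, 60)   # buffer only (hypothetical values)
bound = frame_hue(139, 99, 63)      # after tobramycin binding (hypothetical)
print(f"hue shift: {bound - baseline:+.4f}")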

Relevance: 30.00%

Abstract:

CONTEXT: The current standard for diagnosing prostate cancer in men at risk relies on a transrectal ultrasound-guided biopsy test that is blind to the location of the cancer. To increase the accuracy of this diagnostic pathway, a software-based magnetic resonance imaging-ultrasound (MRI-US) fusion targeted biopsy approach has been proposed. OBJECTIVE: Our main objective was to compare the detection rate of clinically significant prostate cancer with software-based MRI-US fusion targeted biopsy against standard biopsy. The two strategies were also compared in terms of detection of all cancers, sampling utility and efficiency, and rate of serious adverse events. The outcomes of different targeted approaches were also compared. EVIDENCE ACQUISITION: We performed a systematic review of PubMed/Medline, Embase (via Ovid), and Cochrane Review databases in December 2013, following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. The risk of bias was evaluated using the Quality Assessment of Diagnostic Accuracy Studies-2 tool. EVIDENCE SYNTHESIS: Fourteen papers reporting the outcomes of 15 studies (n=2293; range: 13-582) were included. We found that MRI-US fusion targeted biopsies detect more clinically significant cancers (median: 33.3% vs 23.6%; range: 13.2-50% vs 4.8-52%) using fewer cores (median: 9.2 vs 37.1) than standard biopsy techniques. Some studies showed a lower detection rate for all cancers (median: 50.5% vs 43.4%; range: 23.7-82.1% vs 14.3-59%). MRI-US fusion targeted biopsy was able to detect some clinically significant cancers that would have been missed by using only standard biopsy (median: 9.1%; range: 5-16.2%). It was not possible to determine which of the two biopsy approaches led to more serious adverse events because standard and targeted biopsies were performed in the same session. Software-based MRI-US fusion targeted biopsy detected more clinically significant disease than visual targeted biopsy in the only study reporting on this outcome (20.3% vs 15.1%). CONCLUSIONS: Software-based MRI-US fusion targeted biopsy seems to detect more clinically significant cancers while deploying fewer cores than standard biopsy. Because there was significant study heterogeneity in patient inclusion, definition of significant cancer, and the protocol used to conduct the standard biopsy, these findings need to be confirmed by further large multicentre validating studies. PATIENT SUMMARY: We compared the ability of standard biopsy to diagnose prostate cancer against a novel approach using software to overlay images from magnetic resonance imaging and ultrasound to guide biopsies towards suspicious areas of the prostate. We found consistent evidence of the superiority of this novel targeted approach, although further high-quality evidence is needed to change current practice.

Relevance: 30.00%

Abstract:

The eventual aim of repeatability and agreement studies is to reach a consensus on the interchangeability and utility (i.e., disease detection/monitoring) of a medical device. The aim of the tolerance and relative utility indices described in this report is to provide a methodology for comparing change in clinical measurement noise between different populations (repeatability) or measurement methods (agreement), so as to highlight problematic areas. No longitudinal data are required to calculate these indices. Both indices establish a least-to-most-affected ranking across all parameters to facilitate comparison. If validated, these indices may prove useful tools when combining reports and forming the consensus required in the validation process for software updates and new medical devices.

Relevance: 30.00%

Abstract:

Background: TILLING (Targeting Induced Local Lesions IN Genomes) is a reverse genetic method that combines chemical mutagenesis with high-throughput genome-wide screening for point mutation detection in genes of interest. However, this mutation discovery approach faces a particular problem: obtaining a mutant population with a sufficiently high mutation density. Furthermore, plant mutagenesis protocols require two successive generations (M1, M2) for mutation fixation before genotype analysis can begin. Results: Here, we describe a new TILLING approach for rice based on ethyl methanesulfonate (EMS) mutagenesis of mature seed-derived calli and direct screening of in vitro regenerated plants. A high mutagenesis rate was obtained (one mutation every 451 kb) when plants were screened for two senescence-related genes. Screening was carried out on 2400 individuals from a mutant population of 6912. Seven of the 15 point mutations identified were sense-change mutations. Conclusions: This new strategy offers significant savings in time (more than eight months), greenhouse space, and labor during the generation of mutant plant populations. Furthermore, this effective chemical mutagenesis protocol ensures high mutagenesis rates while reducing waste-removal costs and the total amount of mutagen needed, thanks to the reduced mutagenesis volume.
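
The reported density is straightforward bookkeeping: one mutation per X bases means total bases screened divided by mutations found. A short worked example follows; the per-individual screened length is back-calculated here, since the abstract does not state the amplicon sizes.

# Mutation-density bookkeeping for a TILLING screen.
# The combined amplicon length per individual is inferred, not given.
individuals = 2400
mutations = 15
density_bp = 451_000            # one mutation every 451 kb (reported)

total_bp_screened = mutations * density_bp          # ~6.77 Mb
bp_per_individual = total_bp_screened / individuals # ~2.8 kb over two genes
print(f"total screened: {total_bp_screened/1e6:.2f} Mb, "
      f"per individual: {bp_per_individual:.0f} bp")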

Relevance: 30.00%

Abstract:

The GH-2000 and GH-2004 projects have developed a method for detecting GH misuse based on measuring insulin-like growth factor-I (IGF-I) and the amino-terminal pro-peptide of type III collagen (P-III-NP). The objectives were to analyze more samples from elite athletes to improve the reliability of the decision limit estimates, to evaluate whether the existing decision limits needed revision, and to validate further non-radioisotopic assays for these markers. The study included 998 male and 931 female elite athletes. Blood samples were collected according to World Anti-Doping Agency (WADA) guidelines at various sporting events, including the 2011 International Association of Athletics Federations (IAAF) World Athletics Championships in Daegu, South Korea. IGF-I was measured by the Immunotech A15729 IGF-I IRMA, the Immunodiagnostic Systems iSYS IGF-I assay, and a recently developed mass spectrometry (LC-MS/MS) method. P-III-NP was measured by the Cisbio RIA-gnost P-III-P, Orion UniQ PIIINP RIA, and Siemens ADVIA Centaur P-III-NP assays. The GH-2000 score decision limits were developed using existing statistical techniques; they were determined at a specificity of 99.99%, with an allowance for the uncertainty due to the finite sample size. The revised decision limit for the Immunotech IGF-I - Orion P-III-NP assay combination did not change significantly after the addition of the new samples. The new decision limits apply to currently available non-radioisotopic assays for measuring IGF-I and P-III-NP in elite athletes, which should allow wider flexibility in implementing the GH-2000 marker test for GH misuse while providing some resilience against manufacturers withdrawing or changing assays.
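
The abstract quotes a 99.99% specificity target with a finite-sample allowance but not the underlying statistics. As a rough illustration of that idea only, and not the GH-2000 procedure itself, a one-sided normal quantile can be pushed outward by a confidence allowance for the estimated mean and SD:

# Rough illustration (not the GH-2000 procedure): a decision limit as a
# one-sided 99.99%-specificity quantile of scores in clean athletes, with
# an allowance for finite-sample uncertainty via a delta-method standard
# error of the quantile estimator. Scores are simulated, for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores = rng.normal(0.0, 1.0, size=931)   # simulated marker scores

spec = 0.9999
z = stats.norm.ppf(spec)                  # 99.99% quantile multiplier
m, s, n = scores.mean(), scores.std(ddof=1), len(scores)

# Var(m + z*s) ~ s^2/n + z^2 * s^2 / (2*(n-1))
se = s * np.sqrt(1/n + z**2 / (2*(n - 1)))
limit = m + z*s + stats.norm.ppf(0.95)*se  # 95% confidence allowance
print(f"decision limit ~ {limit:.2f}")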

Relevance: 30.00%

Abstract:

Characterizing geological features and structures in three dimensions on inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Detailed 3D data, such as LiDAR point clouds, make it possible to study accurately the hazard processes and the structure of geologic features, particularly on vertical and overhanging rock slopes. 3D geological models therefore have great potential for application to a wide range of geological investigations, both in research and in applied projects such as mines, tunnels, and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry, and multispectral / hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remotely sensed data.

During the past ten years, several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea-to-Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain settlements and roads at high risk. It is necessary to understand the main factors that destabilize rocky outcrops even where inventories are lacking and no clear morphological evidence of rockfall activity is observed. To improve the possibilities of forecasting potential future landslides, it is crucial to understand the evolution of rock slope stability. Defining the areas theoretically most prone to rockfalls can be particularly useful for simulating trajectory profiles and generating hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls?

I characterized the fracturing patterns in the field and in LiDAR point clouds. I then developed a model to compute failure mechanisms on terrestrial point clouds in order to assess rockfall susceptibility at the cliff scale. Similar procedures were previously available only for evaluating rockfall susceptibility on aerial digital elevation models. The new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The most probable rockfall source areas computed for granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared with inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because of its particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was chosen for its significant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because their huge vertical and overhanging cliffs are difficult to study with classical methods.
Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling. In particular, I conducted a back-analysis of the large rockfall event of 2005 (265'000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern, and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique for obtaining vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for preferential rockwall retreat. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of unstable rock compartments.

The following points summarize the main outputs of my research. The new model for computing failure mechanisms and rockfall susceptibility on 3D point clouds makes it possible to define accurately the most probable rockfall source areas at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed fracture measurements into geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geologic structures. The integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
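
The abstract does not spell out the algorithm; one common building block for this kind of point-cloud susceptibility computation is estimating each point's local surface orientation by PCA over its neighborhood and applying a simple kinematic threshold. A minimal sketch under those assumptions follows (neighborhood size, friction angle, and the random stand-in cloud are all arbitrary):

# Sketch of one building block for point-cloud rockfall-susceptibility work:
# estimate local surface normals by PCA over k nearest neighbors, convert
# them to dip angles, and flag steeper-than-friction facets.
import numpy as np
from scipy.spatial import cKDTree

def local_dips(points, k=20):
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    dips = np.empty(len(points))
    for i, nb in enumerate(idx):
        nbrs = points[nb] - points[nb].mean(axis=0)
        # direction of least variance (last right-singular vector) = normal
        _, _, vt = np.linalg.svd(nbrs, full_matrices=False)
        normal = vt[-1]
        dips[i] = np.degrees(np.arccos(abs(normal[2])))  # 0 deg = horizontal
    return dips

pts = np.random.rand(500, 3)          # stand-in for a LiDAR cloud
dip = local_dips(pts)
friction_angle = 35.0                 # degrees, assumed
print(f"{(dip > friction_angle).mean():.0%} of facets exceed friction angle")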

Relevance: 30.00%

Abstract:

Monocarboxylates have been implicated in the control of energy homeostasis. Among them, the putative role of ketone bodies, produced notably during a high-fat diet (HFD), has not been thoroughly explored. In this study, we aimed to determine the impact of a specific rise in cerebral ketone bodies on food intake and the regulation of energy homeostasis. A carotid infusion of ketone bodies was performed on mice to stimulate sensitive brain areas for 6 or 12 h. At each time point, food intake and different markers of energy homeostasis were analyzed to reveal the consequences of the brain's detection of increased ketone body levels. First, an increase in food intake appeared over a 12-h period of brain ketone body perfusion. This stimulated food intake was associated with increased expression of the hypothalamic neuropeptides NPY and AgRP, as well as of phosphorylated AMPK, and was attributable to ketone bodies sensed by the brain, since blood ketone body levels did not change at that time. In parallel, gluconeogenesis and insulin sensitivity were transiently altered: a dysregulation of glucose production and insulin secretion was observed after 6 h of ketone body perfusion, which returned to normal by 12 h. Altogether, these results suggest that an increase in brain ketone body concentration leads to hyperphagia and a transient perturbation of peripheral metabolic homeostasis.

Relevance: 30.00%

Abstract:

Objective: To develop procedures to ensure consistent printing quality of digital images, by means of quantitative hardcopy analysis based on a standard image. Materials and Methods: Characteristics of mammography DI-ML and general-purpose DI-HL films were studied through the QC-Test, utilizing different processing techniques on a FujiFilm® DryPix4000 printer. Software was developed for sensitometric evaluation, generating a digital image that includes a gray scale and a bar pattern to evaluate contrast and spatial resolution. Results: Mammography films showed a maximum optical density of 4.11; general-purpose films, 3.22. The digital image was developed with a 33-step wedge scale and a high-contrast bar pattern (1 to 30 lp/cm) for spatial resolution evaluation. Conclusion: Mammographic films presented higher maximum optical density and contrast resolution than general-purpose films. The digital processing technique used could change only the image pixel matrix values and did not affect the printing standard. The proposed digital image standard allows greater control of the relationship between pixel values and optical density in the quality analysis of films and printing systems.
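
A synthetic target of this kind (a stepped gray wedge plus a high-contrast bar pattern) is simple to generate. The following is a minimal sketch with arbitrary geometry and pixel pitch, not the authors' actual layout:

# Sketch of a synthetic print-QC target: a 33-step gray wedge stacked above
# a high-contrast line-pair bar pattern. Dimensions and pitch are arbitrary.
import numpy as np

steps = 33
wedge = np.repeat(np.linspace(0, 255, steps), 10)        # 10 px per step
wedge_img = np.tile(wedge, (100, 1)).astype(np.uint8)    # 100 x 330 wedge

# Square-wave bar pattern: alternating black/white bars, 5 px per bar.
x = np.arange(330)
bars = ((x // 5) % 2) * 255
bar_img = np.tile(bars, (100, 1)).astype(np.uint8)

target = np.vstack([wedge_img, bar_img])                 # 200 x 330 target
print(target.shape, target.min(), target.max())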

Relevance: 30.00%

Abstract:

Freshwater species worldwide are experiencing dramatic declines partly attributable to ongoing climate change. The future effects of climate change are expected to be particularly severe in mediterranean climate (med-) regions, which host many endemic species already under great stress from the high level of human development. In this article, we review the climate and climate-induced changes in streams of med-regions and the responses of stream biota, focusing on both observed and anticipated ecological responses. We also discuss current knowledge gaps and conservation challenges. Expected climate alterations have already been observed in recent decades and include: increased annual average air temperatures; decreased annual average precipitation; hydrologic alterations; and an increase in the frequency, intensity, and duration of extreme events such as floods, droughts, and fires. Recent observations, concordant with model forecasts, show that stream biota of med-regions facing climate change tend to be displaced towards higher elevations and latitudes, and that communities tend to change their composition and homogenize, while some life-history traits (being short-lived, small, and resistant to low streamflow and desiccation) seem to provide biota with the resilience and resistance needed to adapt to the new conditions. Nevertheless, such responses may be insufficient to cope with current and future environmental changes. Accurate forecasts of biotic changes and possible adaptations are difficult to obtain in med-regions, mainly because of the difficulty of distinguishing disturbances due to natural variability from the effects of climate change, particularly regarding hydrology. Long-term studies are needed to disentangle such variability and to improve knowledge of ecological responses and the detection of early warning signals of climate change. Investments should focus on taxa beyond fish and macroinvertebrates, and on covering the less studied regions of Chile and South Africa. Scientists, policy makers, and water managers must be involved in the climate change dialogue because the freshwater conservation stakes are high.

Relevance: 30.00%

Abstract:

Because symptoms overlap heavily, pathogen-specific diagnosis and treatment of infectious diseases is difficult on the basis of clinical symptoms alone, and patients are therefore often treated empirically. More efficient treatment and management of infectious diseases would require rapid, point-of-care compatible in vitro diagnostic methods. However, current point-of-care methods are unsatisfactory in both performance and cost structure. The lack of point-of-care methods results in unnecessary use of antibiotics, suboptimal use of virus-specific drugs, and compromised patient care. In this thesis, the applicability of two-photon excitation fluorometry is evaluated as a tool for rapid detection of infectious diseases. New separation-free immunoassay methodologies were developed and validated for the following application areas: general inflammation markers, pathogen-specific antibodies, pathogen-specific antigens, and antimicrobial susceptibility testing. In addition, dry-reagent methodology and nanoparticulate tracers are introduced in the context of the technique. The results show that the new assay technique is a versatile tool for rapid detection of infectious diseases across many application areas. One particularly attractive area is rapid multianalyte testing of respiratory infections, where the technique was shown to allow simple assay protocols with performance comparable to state-of-the-art laboratory methods. If implemented in clinical diagnostic use, the new methods could improve diagnostic testing routines, especially in rapid testing of respiratory tract infections.

Relevance: 30.00%

Abstract:

We present new optical and infrared photometric observations and high-resolution Hα spectra of the periodic radio star LSI+61°303. The optical photometric data set covers the time interval 1985-1993 and amounts to about a hundred nights. A period of ∼26 days is found in the V band. The infrared data also present evidence for a similar periodicity, but with a higher amplitude of variation (about 0.2 mag). The spectroscopic observations include 16 intermediate- and high-dispersion spectra of LSI+61°303 collected between January 1989 and February 1993. The Hα emission line profile and its variations are analyzed. Several emission-line parameters, among them the Hα equivalent width and the width of the Hα red hump, change strongly at or close to radio maximum and may exhibit periodic variability. We also observe a significant change in the peak separation. The Hα profile of LSI+61°303 does not seem peculiar for a Be star. However, several of the observed variations of the Hα profile can probably be associated with the presence of the compact secondary star.
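
For unevenly sampled photometry like this (about a hundred nights spread over several years), the standard tool for recovering a ~26-day period is a Lomb-Scargle periodogram. A minimal sketch with simulated data, not the paper's measurements:

# Minimal sketch: recover a ~26 d period from unevenly sampled V-band
# photometry with a Lomb-Scargle periodogram. Data simulated for
# illustration; amplitude, noise, and sampling are arbitrary.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 8 * 365, 100))         # ~100 nights over 8 years
v = 10.7 + 0.05*np.sin(2*np.pi*t/26.5) + rng.normal(0, 0.02, t.size)

freq, power = LombScargle(t, v).autopower(minimum_frequency=1/100,
                                          maximum_frequency=1/2)
best_period = 1 / freq[np.argmax(power)]
print(f"best period ~ {best_period:.1f} d")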

Relevance: 30.00%

Abstract:

As a result of recent regulatory amendments and other development trends in the electricity distribution business, the sector is currently witnessing radical restructuring that will eventually reshape its business logic. This report presents the upcoming changes in the electricity distribution industry and concentrates on the factors expected to be the most fundamental. Electricity network companies struggle with legislative and regulatory requirements that focus on both the operational efficiency and the reliability of electricity distribution networks. The forces acting on distribution network companies can be put into three main categories that define the transformation at a general level: (1) a requirement for a more functional marketplace for energy, (2) environmental aspects (combating climate change etc.), and (3) a strongly emphasized requirement for security of energy supply. The first point arises from legislators' attempts to increase competition in electricity retail markets, the second concerns both environmental protection and human safety, and the third reflects society's reduced willingness to accept interruptions in electricity supply. In the future, regulation of the electricity distribution business may lower the threshold for building more weather-resistant networks, which in turn means increased underground cabling. This development pattern is reinforced by tightening safety and environmental regulations that ultimately make overhead lines expensive to build and maintain. The changes will require new approaches, particularly in network planning, construction, and maintenance. A dedicated concept for planning, constructing, and maintaining cable networks is necessary because the interdependencies between network operations are strong; in other words, each operation must be linked to the others.

Relevance: 30.00%

Abstract:

In general terms, energy efficiency can be defined as reducing energy consumption while maintaining the same energy services, without diminishing our comfort and quality of life, protecting the environment, securing the energy supply, and fostering sustainable behavior in its use. The main objective of this work is to reduce the energy consumption and the contracted power term at the Universitat de Vic by applying a savings program with corrective measures in the operation of its facilities and spaces. To reach this objective, a careful study was first carried out to obtain all the information needed to apply corrective measures to the largest pocket of consumption. Once that pocket was identified, a feasibility study of the investment in the most efficient corrective measures was conducted, optimizing the allocated resources.

The study was carried out in building F of the Campus Miramarges, following the indications of Arnau Bardolet (Head of Maintenance at UVic). This building consists of a mezzanine, a ground floor, and four upper floors. The measuring equipment used for the study was a Circutor AR5-L series analyzer; these are programmable devices that measure, calculate, and store in memory the main electrical parameters of three-phase networks. Complementary future projects that could follow this one include: installing sensors, installing TCP/IP converter modules, leveraging the intranet, and building a SCADA application with a control-and-management synoptic accessible from a workstation. Such an application would display on a PC screen the state of all controlled elements through a synoptic (manually switching classroom lighting and outlets on/off, the state of classroom lighting and outlets, instantaneous/accumulated energy consumption, the state of the corridors, among others) and exploit the data collected in the database. Each space would have its own specific automatic operating logic.

Among the most relevant conclusions of this work: the contracted power on the bill can be reduced, since the power actually consumed is below the contracted value; there are no penalties on the bill for reactive power consumption, since the compensator works correctly; the daily start of energy consumption can be delayed, since it does not correspond to teaching activity; voltage and frequency values are within normal ranges; and harmonics are at the maximum threshold. From these conclusions, the most important corrective measures are: switching to LED technology; automatically scheduling the switching of the fluorescent lights and computer equipment in the classrooms according to the teaching calendar; and installing motion sensors with light-level detection in the corridors. All the conclusions drawn from this work can be applied to all the university's buildings, after an individual study of each one following the same criteria, in order to optimize the investment.
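
The contracted-power conclusion reduces to comparing the contracted term against the measured demand profile. A minimal sketch of that comparison follows; the readings and contract value are hypothetical, standing in for logs from an analyzer such as the AR5-L:

# Sketch: compare contracted power against a measured demand profile.
# Readings are simulated quarter-hourly values, for illustration only.
import numpy as np

rng = np.random.default_rng(2)
demand_kw = rng.gamma(shape=9, scale=12, size=96*30)  # 15-min readings, 1 month

contracted_kw = 250.0                                  # hypothetical contract
peak = demand_kw.max()
p99 = np.percentile(demand_kw, 99)

# If even the peak sits well below the contract, the term can be lowered.
print(f"peak={peak:.0f} kW, 99th percentile={p99:.0f} kW, "
      f"contracted={contracted_kw:.0f} kW")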

Relevance: 30.00%

Abstract:

A simple cloud point extraction procedure is presented for the preconcentration of copper in various samples. After complexation with 4-hydroxy-2-mercapto-6-propylpyrimidine (PTU), copper ions are quantitatively extracted into the Triton X-114-rich phase upon centrifugation. Methanol acidified with 0.5 mol L-1 HNO3 was added to the surfactant-rich phase prior to its analysis by flame atomic absorption spectrometry (FAAS). Analytical parameters, including the concentrations of PTU, Triton X-114, and HNO3, bath temperature, and centrifugation rate and time, were optimized. The influence of matrix ions on the recovery of copper was investigated. A detection limit (3SDb/m, n=4) of 1.6 ng mL-1 and an enrichment factor of 30 were achieved for Cu. The proposed procedure was applied to the analysis of environmental samples.
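
The quoted detection limit follows the common 3SDb/m convention: three times the standard deviation of the blank divided by the calibration slope. A short worked example, with hypothetical numbers chosen to reproduce the reported value:

# Worked example of the 3*SD_blank/m detection-limit convention quoted above.
# The blank SD (n=4) and calibration slope are hypothetical, chosen to
# reproduce the reported 1.6 ng/mL.
sd_blank = 0.0008     # absorbance units, SD of n=4 blanks (hypothetical)
slope = 0.0015        # absorbance per (ng/mL), after preconcentration (hypothetical)
lod = 3 * sd_blank / slope
print(f"LOD = {lod:.1f} ng/mL")   # -> 1.6 ng/mL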

Relevance: 30.00%

Abstract:

A simple, sensitive, and selective cloud point extraction procedure is described for the preconcentration and atomic absorption spectrometric determination of Zn2+ and Cd2+ ions in water and biological samples, after complexation with 3,3',3",3'"-tetraindolyl (terephthaloyl) dimethane (TTDM) in basic medium, using Triton X-114 as the nonionic surfactant. Detection limits of 3.0 and 2.0 µg L-1 and quantification limits of 10.0 and 7.0 µg L-1 were obtained for Zn2+ and Cd2+ ions, respectively. Relative standard deviations were 2.9 and 3.3, and enrichment factors 23.9 and 25.6, for Zn2+ and Cd2+, respectively. The method enabled determination of low levels of Zn2+ and Cd2+ ions in urine, blood serum, and water samples.