Abstract:
Purpose: Tumour-free resection margins (RMs) are mandatory in breast-conserving surgery. On-site intraoperative ultrasound (US)-guided tumour resection with extemporaneous histopathological assessment of RMs has been described; remote intraoperative US assessment of RMs is an alternative. The purpose of this study was to evaluate the relationship between lumpectomy RM measurements obtained by remote intraoperative US and by postoperative histopathology. Methods and Materials: In a retrospective, IRB-approved review of 100 consecutive lumpectomies performed between October 2009 and April 2011 for presumed non-palpable breast cancer, 71 women (mean age 63.8 years) were included. Twenty-nine patients were excluded because of absence of cancer at histopathology and/or incomplete data. Measurements of the minimal lumpectomy RM and the maximal tumour diameter obtained on remote intraoperative US and on postoperative histopathology were compared. Results: Minimal RMs were 0.35 ± 0.32 cm (mean ± SD) on remote intraoperative US and 0.35 ± 0.32 cm on postoperative histopathology; no significant difference was found between these measurements (p = 0.37). Maximal tumour diameter was 1.02 ± 0.51 cm on remote intraoperative US and 1.33 ± 0.74 cm on postoperative histopathology; the US measurements were significantly smaller (p < 0.001). The 71 breast carcinomas (CA) consisted of invasive ductal (n = 49), invasive lobular (n = 11), in situ (n = 3), and other types of CA (n = 8). Twenty-nine patients had intraoperative re-excision (24 without residual CA), while 16 patients were re-operated because of insufficient histopathological RMs (12 without residual CA). Conclusion: The good correlation of minimal RMs between remote intraoperative US and postoperative histopathology warrants using both techniques in a complementary manner. Remote intraoperative US helps in making rapid re-excision decisions and in maintaining a low re-operation rate after breast-conserving surgery for non-palpable cancer.
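The margin comparison described above is a paired design: each lumpectomy yields one US and one histopathology measurement. A minimal sketch of such a paired comparison, using illustrative placeholder values rather than study data, could look as follows (Python with SciPy assumed):

```python
# Hypothetical sketch: paired comparison of intraoperative US vs.
# postoperative histopathology margins. Placeholder values, not study data.
import numpy as np
from scipy import stats

us_margin = np.array([0.2, 0.5, 0.1, 0.4, 0.3, 0.6, 0.2, 0.35])      # cm, US
histo_margin = np.array([0.25, 0.45, 0.1, 0.5, 0.3, 0.55, 0.2, 0.4])  # cm, histopathology

# Paired t-test: each lumpectomy contributes one US and one histology value.
t_stat, p_value = stats.ttest_rel(us_margin, histo_margin)
print(f"mean US    = {us_margin.mean():.2f} +/- {us_margin.std(ddof=1):.2f} cm")
print(f"mean histo = {histo_margin.mean():.2f} +/- {histo_margin.std(ddof=1):.2f} cm")
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```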
Abstract:
Field-based soil moisture measurements are cumbersome. Remote sensing techniques are therefore attractive, because they allow field- and landscape-scale mapping of soil moisture, depth-averaged through the root zone of the existing vegetation. The objective of this study was to evaluate the accuracy of an empirical relationship for calculating soil moisture from remote sensing data over irrigated soils of the Apodi Plateau, in the Brazilian semiarid region. The empirical relationship had previously been tested for irrigated soils in Mexico, Egypt, and Pakistan, with promising results. In this study, the relationship was evaluated with experimental data collected from a cotton field. The experiment was carried out in a 5 ha area of irrigated cotton. The energy balance and the evaporative fraction (Λ) were measured by the Bowen ratio method. Soil moisture (θ) data were collected using a PR2 Profile Probe (Delta-T Devices Ltd). The empirical relationship was tested using the experimentally collected Λ and θ values and was then applied using Λ values obtained from the Surface Energy Balance Algorithm for Land (SEBAL) and three TM - Landsat 5 images. There was a close correlation between measured and estimated θ values (p < 0.05, R² = 0.84), and there were no significant differences according to the Student t-test (p < 0.01). The statistical analyses showed that the empirical relationship can be applied to estimate the root-zone soil moisture of irrigated soils, provided the evaporative fraction is greater than 0.45.
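The abstract does not spell out the functional form of the empirical relationship. Purely as an illustration, the sketch below assumes a commonly used exponential dependence of root-zone moisture on the evaporative fraction; the constants theta_sat and a are hypothetical placeholders, not values from the study:

```python
# Illustrative sketch of estimating root-zone soil moisture from the
# evaporative fraction. The exponential form and constants are assumptions
# for illustration; the abstract does not state the equation used.
import numpy as np

def soil_moisture(evaporative_fraction, theta_sat=0.40, a=0.42):
    """Estimate volumetric soil moisture (cm3/cm3) from the evaporative fraction."""
    lam = np.asarray(evaporative_fraction, dtype=float)
    theta = theta_sat * np.exp((lam - 1.0) / a)
    # Per the study, the relationship is considered applicable only
    # when the evaporative fraction exceeds 0.45.
    return np.where(lam > 0.45, theta, np.nan)

# Example: evaporative fractions as might come from a SEBAL run.
print(soil_moisture([0.5, 0.7, 0.9]))
```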
Abstract:
Peatlands are soil environments that store carbon and large amounts of water, owing to their composition (90 % water), low hydraulic conductivity, and sponge-like behavior. It is estimated that peat bogs cover approximately 4.2 % of the Earth's surface and store 28.4 % of the planet's soil carbon. Approximately 612,000 ha of peatlands have been mapped in Brazil, but the peat bogs of the Serra do Espinhaço Meridional (SdEM) were not included. The objective of this study was to map the peat bogs of the northern part of the SdEM and to estimate the organic matter pools and the water volume they store. The peat bogs were pre-identified and mapped by GIS and remote sensing techniques, using ArcGIS 9.3, ENVI 4.5, and GPS Track Maker Pro software, and the maps were validated in the field. Six peat bogs were mapped in detail (1:20,000 and 1:5,000) along transects spaced 100 m apart; every 20 m along each transect, the UTM (Universal Transverse Mercator) coordinates and the depth were recorded, and samples were collected for characterization and determination of organic matter, according to the Brazilian System of Soil Classification. In the northern part of the SdEM, 14,287.55 ha of peatlands were mapped, distributed over 1,180,109 ha, representing 1.2 % of the total area. These peatlands have an average volume of 170,021,845.00 m³ and store 6,120,167 t (428.36 t ha-1) of organic matter and 142,138,262 m³ (9,948 m³ ha-1) of water. In the peat bogs of the Serra do Espinhaço Meridional, organic matter at an advanced stage of decomposition (sapric) predominates, followed by the intermediate stage (hemic). The vertical growth rate of the peatlands ranged between 0.04 and 0.43 mm year-1, while the carbon accumulation rate varied between 6.59 and 37.66 g m-2 year-1. The peat bogs of the SdEM contain the headwaters of important water bodies in the basins of the Jequitinhonha and São Francisco Rivers and store large amounts of organic carbon and water, which is why the protection and preservation of these soil environments is an urgent and growing need.
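As a quick sanity check, the per-hectare stocks quoted above follow directly from the mapped area and the reported totals:

```python
# Arithmetic check of the per-hectare figures reported in the abstract.
area_ha = 14_287.55           # mapped peatland area (ha)
organic_matter_t = 6_120_167  # total organic matter stock (t)
water_m3 = 142_138_262        # total water stored (m3)

print(f"organic matter: {organic_matter_t / area_ha:.2f} t/ha")  # ~428.36 t/ha
print(f"water:          {water_m3 / area_ha:.0f} m3/ha")         # ~9,948 m3/ha
```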
Abstract:
In the search for high efficiency in root studies, computational systems have been developed to analyze digital images. ImageJ and Safira are public-domain systems that may be used for image analysis of washed roots; however, the root properties measured with ImageJ and with Safira are expected to differ. This study compared values of root length and surface area obtained with these public-domain systems with values obtained by a reference method. Root samples were collected in a banana plantation, in an area of a shallower Typic Carbonatic Haplic Cambisol (CXk) and an area of a deeper Typic Haplic Ta Eutrophic Cambisol (CXve), at six depths, with five replications. Root images were digitized, and the ImageJ and Safira systems were used to determine root length and surface area. The line-intersect method modified by Tennant was used as the reference; values of root length and surface area measured with the different systems were analyzed by Pearson's correlation coefficient and compared by confidence interval and t-test. Both ImageJ and Safira correlated positively with the reference method for root length and surface area data in CXk and CXve. The correlation coefficient ranged from 0.54 to 0.80, with the lowest value observed for ImageJ in the measurement of the surface area of roots sampled in CXve. The 95 % confidence interval revealed that root length measurements with Safira did not differ from those of the reference method in CXk (-77.3 to 244.0 mm). Regarding surface area, Safira did not differ from the reference method for samples collected in CXk (-530.6 to 565.8 mm²) or in CXve (-4,231 to 612.1 mm²). Measurements with ImageJ, however, differed from those obtained by the reference method, underestimating length and surface area in samples collected in both CXk and CXve. Both ImageJ and Safira can identify increases or decreases in root length and surface area; however, Safira's results for root length and surface area are closer to those obtained with the reference method.
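A minimal sketch of the comparison protocol used here (correlation with the reference method, a paired t-test, and a confidence interval of the differences), with illustrative placeholder values rather than the study's measurements:

```python
# Hypothetical sketch: compare one system's root-length measurements against
# the reference (line-intersect) method. Placeholder values, not study data.
import numpy as np
from scipy import stats

reference = np.array([1200.0, 850.0, 640.0, 410.0, 300.0, 150.0])  # mm
safira    = np.array([1150.0, 900.0, 600.0, 430.0, 280.0, 170.0])  # mm

r, _ = stats.pearsonr(reference, safira)
diff = safira - reference
t, p = stats.ttest_rel(safira, reference)
ci = stats.t.interval(0.95, len(diff) - 1,
                      loc=diff.mean(), scale=stats.sem(diff))
print(f"Pearson r = {r:.2f}; paired t-test p = {p:.3f}")
print(f"95% CI of the mean difference: {ci[0]:.1f} to {ci[1]:.1f} mm")
```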
Abstract:
Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009. The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images are available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this Thesis: the processing of very high spatial and spectral resolution images is treated with approaches based on data-driven algorithms relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of the image's pixels into a reduced number of classes reflecting spectral and contextual properties, is studied through the different models presented. The accent is put on algorithmic efficiency and on the simplicity of the proposed approaches, to avoid overly complex models that users would not adopt. The major challenge of the Thesis is to remain close to concrete remote sensing problems without losing the methodological interest from the machine learning viewpoint: in this sense, this work aims at building a bridge between the machine learning and remote sensing communities, and all the proposed models have been developed keeping in mind the need for such a synergy. Four models are proposed: first, an adaptive model learning the relevant image features is proposed to solve the problem of high dimensionality and collinearity of the image features. This model automatically provides an accurate classifier and a ranking of the relevance of the single features. The scarcity and unreliability of labeled information are the common root of the second and third models: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine, or use the unlabeled data to increase the robustness and the quality of the data description. Both solutions have been explored, resulting in two methodological contributions based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs is considered in the last model, which, by integrating output similarity into the model (a source of information never before considered in remote sensing), opens new challenges and opportunities for remote sensing image processing.
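As an illustration of the active-learning contribution described above, a minimal uncertainty-sampling loop around a kernel classifier might look as follows; the synthetic data, query size, and SVM configuration are assumptions for the sketch, not the thesis's actual models:

```python
# Minimal active-learning sketch: the model iteratively asks an "oracle"
# (standing in for the user) for the pixels it is least certain about.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_pool = rng.normal(size=(500, 4))                       # spectral features per pixel
y_pool = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)   # oracle labels

# Small initial labeled set containing both classes.
labeled = list(np.concatenate([np.where(y_pool == 0)[0][:5],
                               np.where(y_pool == 1)[0][:5]]))

for _ in range(5):  # five user-interaction rounds
    clf = SVC(kernel="rbf", probability=True).fit(X_pool[labeled], y_pool[labeled])
    proba = clf.predict_proba(X_pool)
    margin = np.abs(proba[:, 0] - proba[:, 1])  # small margin = uncertain pixel
    margin[labeled] = np.inf                    # ignore already-labeled pixels
    labeled.extend(np.argsort(margin)[:10])     # query the 10 most uncertain

print(f"final training set size: {len(labeled)}")
```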
Abstract:
Following a high wind event on January 24, 2006, at least five people claimed to have seen or felt the superstructure of the Saylorville Reservoir Bridge in central Iowa moving both vertically and laterally. Since that time, the Iowa Department of Transportation (DOT) has contracted with the Bridge Engineering Center at Iowa State University to design and install a monitoring system capable of providing notification of subsequent high wind events. In subsequent years, a similar system was installed on the Red Rock Reservoir Bridge to provide the same wind monitoring capabilities and notifications to the Iowa DOT. The objectives of the system development and implementation are to notify personnel when the wind speed reaches a predetermined threshold so that the bridge can be closed for the safety of the public, to correlate the measured structural response with wind conditions, and to gather historical wind data at these structures for future assessments. This report describes the two monitoring systems, including their components, upgrades, functionality, and limitations, as well as results from one year of wind data collection at both bridges.
Abstract:
We study a Kuramoto model in which the oscillators are associated with the nodes of a complex network and the interactions include a phase frustration, thus preventing full synchronization. The system organizes into a regime of remote synchronization where pairs of nodes with the same network symmetry are fully synchronized, despite their distance on the graph. We provide analytical arguments to explain this result, and we show how the frustration parameter affects the distribution of phases. An application to brain networks suggests that anatomical symmetry plays a role in neural synchronization by determining correlated functional modules across distant locations.
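The dynamics described above follow the standard frustrated (Kuramoto-Sakaguchi) form dθi/dt = ωi + λ Σj Aij sin(θj − θi − α). A minimal sketch of its integration on an arbitrary network; the graph, parameters, and Euler scheme are illustrative choices, not those of the paper (networkx assumed available):

```python
# Sketch of the frustrated Kuramoto (Kuramoto-Sakaguchi) model on a network.
import numpy as np
import networkx as nx

G = nx.barabasi_albert_graph(50, 3, seed=1)   # placeholder complex network
A = nx.to_numpy_array(G)
n = A.shape[0]

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, n)   # initial phases
omega = np.ones(n)                     # identical natural frequencies
alpha, coupling, dt = 0.3, 1.0, 0.01   # frustration, coupling strength, time step

for _ in range(20_000):
    phase_diff = theta[None, :] - theta[:, None]   # entry [i, j] = theta_j - theta_i
    dtheta = omega + coupling * (A * np.sin(phase_diff - alpha)).sum(axis=1)
    theta = (theta + dt * dtheta) % (2 * np.pi)

# Global order parameter r in [0, 1]; frustration keeps it below 1.
r = np.abs(np.exp(1j * theta).mean())
print(f"order parameter r = {r:.3f}")
```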
Abstract:
Posttransplant lymphoproliferative disorder (PTLD) is a potentially fatal complication of solid organ transplantation. The majority of PTLDs are of B-cell origin, and 90% are associated with the Epstein-Barr virus (EBV). Lymphomatoid granulomatosis (LG) is a rare, EBV-associated systemic angiodestructive lymphoproliferative disorder that has rarely been described in renal transplant patients. We report the case of a patient who underwent renal transplantation for SLE and who presented, 9 months after transplantation, with an EBV-associated LG limited to the intracranial structures that resolved completely after adjustment of her immunosuppressive treatment. Nine years later, she developed a second PTLD, again with an initial central nervous system manifestation. Workup revealed an EBV-positive PTLD Burkitt lymphoma, widely disseminated in most organs. In summary, the reported patient presented two lymphoproliferative disorders (LG and Burkitt's lymphoma), both with initial neurological manifestations, 9 years apart. With careful reduction of the immunosuppression after the first manifestation, and with chemotherapy combined with radiotherapy after the second, our patient showed complete disappearance of the neurologic symptoms; she is clinically well with good kidney function, and no recurrence has been observed on radiological imaging to date.
Abstract:
The objective of this project was to promote and facilitate the analysis and evaluation of the impacts of road construction activities in Smart Work Zone Deployment Initiative (SWZDI) states. The two primary objectives were to assess urban freeway work-zone impacts through the use of remote monitoring devices, such as radar-based traffic sensors, traffic cameras, and traffic signal loop detectors, and to evaluate the effectiveness of using these devices for such a purpose. Two high-volume suburban freeway work zones, located on Interstate 35/80 (I-35/I-80) through the Des Moines, Iowa metropolitan area, were evaluated at the request of the Iowa Department of Transportation (DOT).
Abstract:
Following high winds on January 24, 2006, at least five people claimed to have seen or felt the superstructure of the Saylorville Reservoir Bridge in central Iowa moving both vertically and laterally. Since that time, the Iowa Department of Transportation (DOT) has contracted with the Bridge Engineering Center at Iowa State University to design and install a monitoring system capable of providing notification of subsequent high winds. Although measures were put into place following the 2006 event at the Saylorville Reservoir Bridge, knowledge of the bridge's performance during high wind events was incomplete. Therefore, the Saylorville Reservoir Bridge was outfitted with an information management system to investigate its structural performance and potential safety risks. In subsequent years, given the similarities between the Saylorville and Red Rock Reservoir bridges, a similar system was added to the Red Rock Reservoir Bridge southeast of Des Moines. The monitoring system developed and installed on these two bridges was designed to monitor the wind speed and direction at the bridge and, via a cellular modem, to send a text message to Iowa DOT staff when wind speeds reach a predetermined threshold. The original intent was that, once the text message is received, the bridge entrances would be closed until wind speeds diminish to safe levels.
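The alerting behavior described above can be sketched as a simple threshold rule; the thresholds, the hysteresis band, and the send_text placeholder below are illustrative assumptions, not details of the installed system:

```python
# Hypothetical sketch of threshold-based wind alerting with hysteresis:
# alert when wind speed meets a threshold, and do not re-alert until it
# drops back below a lower reset level (avoids repeated texts in gusts).
ALERT_MPH = 50.0   # assumed closure threshold
RESET_MPH = 40.0   # assumed reset level

def send_text(msg: str) -> None:
    print(f"SMS to Iowa DOT staff: {msg}")  # placeholder for the cellular modem

alert_active = False
for speed in [35, 42, 51, 55, 48, 44, 38, 52]:  # sample anemometer readings (mph)
    if not alert_active and speed >= ALERT_MPH:
        send_text(f"High wind: {speed} mph at bridge. Consider closure.")
        alert_active = True
    elif alert_active and speed <= RESET_MPH:
        send_text(f"Wind diminished to {speed} mph. Safe levels restored.")
        alert_active = False
```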
Abstract:
The aim of this study was to determine the effect of using video analysis software on the interrater reliability of visual assessments of gait videos in children with cerebral palsy. Two clinicians viewed the same random selection of 20 sagittal and frontal video recordings of 12 children with cerebral palsy, routinely acquired during outpatient rehabilitation clinics. Both observers rated these videos in a random sequence for each lower limb using the Observational Gait Scale, once with standard video software and once with video analysis software (Dartfish®), which can perform angle and timing measurements. The video analysis software improved interrater agreement, measured by weighted Cohen's kappa, for the total score (κ 0.778→0.809) and for all items requiring angle and/or timing measurements (knee position at mid-stance, κ 0.344→0.591; hindfoot position at mid-stance, κ 0.160→0.346; foot contact at mid-stance, κ 0.700→0.854; timing of heel rise, κ 0.769→0.835). The use of video analysis software is an efficient approach to improving the reliability of visual video assessments.
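Weighted Cohen's kappa, the agreement statistic reported above, is available off the shelf; a minimal sketch with placeholder ratings follows (the paper's exact weighting scheme is not stated, so linear weights are an assumption):

```python
# Sketch: weighted Cohen's kappa between two raters' ordinal item scores.
# The ratings below are illustrative placeholders, not study data.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 2, 2, 1, 0, 3, 2, 1, 2, 3]
rater_b = [3, 2, 1, 1, 0, 2, 2, 1, 3, 3]

# 'linear' weights penalize disagreements in proportion to their distance,
# which suits ordinal scores such as Observational Gait Scale items.
kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")
print(f"weighted kappa = {kappa:.3f}")
```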
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurements of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance; over the last decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each of them. Results: 12 software tools were identified, tested, and ranked, yielding a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs allow the user to add custom drug models. 10 programs can compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 can also suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly. Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be weighed against the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and report generation.
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through each of them. Results: 12 software tools were identified, tested, and ranked, yielding a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and some programs integrate different population types. Furthermore, 8 programs allow the addition of new drug models based on population PK data. 10 tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated from population PK models): all of them can compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 can also suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be weighed against the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Although interest in TDM tools is growing and efforts have been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability, and automated report generation.
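For a sense of what the Bayesian a posteriori adjustment performed by these tools involves, here is a deliberately simplified sketch: a one-compartment steady-state model, a lognormal prior on clearance, and a single measured level. The model and all numbers are illustrative assumptions, not taken from any benchmarked program:

```python
# Simplified Bayesian dose individualization: find the MAP clearance given
# a population prior and one measured concentration, then the dose that
# hits a target average level. All values are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

dose, tau = 500.0, 12.0       # mg, dosing interval (h)
c_obs = 12.0                  # measured concentration (mg/L)
cl_pop, omega = 3.0, 0.3      # population clearance (L/h), lognormal SD
sigma = 1.5                   # residual error SD (mg/L)

def c_pred(cl):               # steady-state average concentration
    return dose / (cl * tau)

def neg_log_posterior(log_cl):
    prior = ((log_cl - np.log(cl_pop)) / omega) ** 2       # lognormal prior
    likelihood = ((c_obs - c_pred(np.exp(log_cl))) / sigma) ** 2
    return 0.5 * (prior + likelihood)

res = minimize_scalar(neg_log_posterior, bounds=(-2, 3), method="bounded")
cl_map = np.exp(res.x)        # maximum a posteriori clearance estimate
target = 10.0                 # desired average concentration (mg/L)
new_dose = target * cl_map * tau
print(f"MAP clearance = {cl_map:.2f} L/h; suggested dose = {new_dose:.0f} mg")
```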