964 results for multi-purpose optimisation
Abstract:
For people with disabilities, however, housing options have been limited. Today, state and federal laws are changing this. Who will benefit? All of us. For “accessibility” is an issue that, at one time or another, affects us all. This is true whether, temporarily or permanently, we use wheelchairs, need grab bars, cannot climb stairs, require easy-to-reach shelves, or rely on easy-to-navigate living spaces. The primary purpose of accessible housing law is to prevent discrimination against people with disabilities, but the end result is a living environment that is more usable for everyone. For example, both the very young and the very old will find an accessible dwelling more comfortable. People with temporary limitations due to injury or illness will find it easier to live in. Such a home will be more welcoming to guests with disabilities.
Abstract:
The purpose of this article was to review strategies to control patient dose in adult and pediatric computed tomography (CT), taking into account the change of technology from single-detector row CT to multi-detector row CT. First, the relationships between the computed tomography dose index, the dose-length product, and the effective dose in adult and pediatric CT are reviewed, along with the diagnostic reference level concept. Then the effect of image noise as a function of the volume computed tomography dose index, the reconstructed slice thickness, and the size of the patient is described. Finally, the potential of tube current modulation CT is discussed.
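The dose quantities named in this abstract are linked by two standard relationships: the dose-length product (DLP) is the volume CT dose index times the scan length, and the effective dose is commonly estimated as a region-specific conversion coefficient times the DLP. A minimal sketch follows; the coefficient and the input numbers are illustrative assumptions, not values from the article:

```python
def dose_length_product(ctdi_vol_mgy: float, scan_length_cm: float) -> float:
    """DLP (mGy*cm) = CTDIvol (mGy) x scan length (cm)."""
    return ctdi_vol_mgy * scan_length_cm

def effective_dose(dlp_mgy_cm: float, k_msv_per_mgy_cm: float = 0.014) -> float:
    """Effective dose E (mSv) ~= k x DLP; k depends on body region and age."""
    return k_msv_per_mgy_cm * dlp_mgy_cm

# Illustrative adult chest exam: CTDIvol 10 mGy over a 30 cm scan range.
dlp = dose_length_product(10.0, 30.0)   # 300 mGy*cm
e = effective_dose(dlp)                 # 4.2 mSv with the assumed k
```

The conversion coefficient `k` here is only an order-of-magnitude placeholder; published tables give different values per anatomical region and patient age.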
Abstract:
Introduction: The field of connectomic research is growing rapidly as a result of methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through Connectome Mapping Pipelines (Hagmann et al, 2008), yielding so-called Connectomes (Hagmann 2005, Sporns et al, 2005). These exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need for a special-purpose software tool to support investigations of such connectome data, for both clinical researchers and neuroscientists, has grown. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries. Results: Thanks to a flexible plugin architecture, functionality can easily be enhanced for specific purposes. The following features are already implemented: * Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib); more brain connectivity measures will be implemented in a future release (Rubinov et al, 2009). * 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible. * Picking functionality to select nodes and edges, retrieve more node information (ConnectomeWiki), and toggle surface representations. * Interactive thresholding and modality selection of edge properties using filters. * Storage of arbitrary metadata for networks, thereby allowing e.g. group-based analysis or meta-analysis. * A Python shell for scripting.
Application data is exposed and can be modified or used for further post-processing. * Visualization pipelines using filters and modules can be composed with Mayavi (Ramachandran et al, 2008). * An interface to TrackVis visualizes track data; selected nodes are converted to ROIs for fiber filtering. The Connectome Mapping Pipeline (Hagmann et al, 2008) was used to process 20 healthy subjects into an average Connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
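The edge filtering described above (e.g. keeping only connections present in all 20 subjects) can be sketched in plain Python. The real tool operates on NetworkX graphs loaded from a Connectome File; the region names and counts below are purely hypothetical stand-ins:

```python
# Hypothetical connectome edges: (region_a, region_b) -> number of the 20
# subjects in which the connection was observed. The real data lives in a
# GraphML network inside a Connectome File.
edges = {
    ("lh.precentral", "lh.postcentral"): 20,
    ("lh.precentral", "rh.precentral"): 17,
    ("lh.insula", "lh.superiorfrontal"): 20,
}

def threshold_edges(edge_props, minimum):
    """Keep only edges whose property value reaches the threshold."""
    return {pair: v for pair, v in edge_props.items() if v >= minimum}

# Edges occurring in all 20 subjects, as in the group-average figures.
consistent = threshold_edges(edges, 20)
```

In the actual viewer the same operation is applied interactively through the filter UI or the embedded Python shell.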
Abstract:
Multi-phase postmortem CT angiography (MPMCTA) is recognized as a valuable tool for exploring the vascular system, with higher sensitivity than conventional autopsy. A limitation, however, is the impossibility of diagnosing pulmonary embolism (PE), due to post-mortem blood clots situated in the pulmonary arteries. The purpose of this study was to explore whether it is possible to distinguish between real PE and artefacts mimicking PE. Our study included 416 medico-legal cases. All of them underwent MPMCTA, conventional autopsy and histological examination. We selected the cases presenting luminal filling defects in the pulmonary arteries, and their radiological interpretation was compared with the findings of autopsy and histological examination. We also investigated a possible correlation between artefacts in the pulmonary arteries and those in other parts of the vascular system. In 123 cases, filling defects of the pulmonary arteries were described during MPMCTA. In 57 cases this was interpreted as an artefact and in 4 cases as suspected PE; in 62 cases only a differential diagnosis was made. Autopsy and histology could clearly identify the artefacts as such. Only one case of real PE was radiologically misinterpreted as an artefact. In 6 of the 62 cases with no definite interpretation, a PE was diagnosed, and in 3 of the 4 suspected cases PE was confirmed. We found that filling defects in the pulmonary arteries are nearly always associated with other vascular artefacts. We therefore suggest some rules for radiological interpretation in order to allow a reliable diagnosis of pulmonary embolism after MPMCTA.
Abstract:
PURPOSE: Intraoperative adverse events significantly influence morbidity and mortality of laparoscopic colorectal resections. Over an 11-year period, the changes of occurrence of such intraoperative adverse events were assessed in this study. METHODS: Analysis of 3,928 patients undergoing elective laparoscopic colorectal resection based on the prospective database of the Swiss Association of Laparoscopic and Thoracoscopic Surgery was performed. RESULTS: Overall, 377 intraoperative adverse events occurred in 329 patients (overall incidence of 8.4 %). Of 377 events, 163 (43 %) were surgical complications and 214 (57 %) were nonsurgical adverse events. Surgical complications were iatrogenic injury to solid organs (n = 63; incidence of 1.6 %), bleeding (n = 62; 1.6 %), lesion by puncture (n = 25; 0.6 %), and intraoperative anastomotic leakage (n = 13; 0.3 %). Of note, 11 % of intraoperative organ/puncture lesions requiring re-intervention were missed intraoperatively. Nonsurgical adverse events were problems with equipment (n = 127; 3.2 %), anesthetic problems (n = 30; 0.8 %), and various (n = 57; 1.5 %). Over time, the rate of intraoperative adverse events decreased, but not significantly. Bleeding complications significantly decreased (p = 0.015), and equipment problems increased (p = 0.036). However, the rate of adverse events requiring conversion significantly decreased with time (p < 0.001). Patients with an intraoperative adverse event had a significantly higher rate of postoperative local and general morbidity (41.2 and 32.9 % vs. 18.0 and 17.2 %, p < 0.001 and p < 0.001, respectively). CONCLUSIONS: Intraoperative surgical complications and adverse events in laparoscopic colorectal resections did not change significantly over time and are associated with an increased postoperative morbidity.
Abstract:
Person segmentation is very difficult because of the variability of conditions, such as the pose people adopt, the background colour, etc. Different techniques exist to perform this segmentation; given an image, they return a labelling indicating the different objects present in it. The purpose of this project is to compare recent semi-automatic multi-label segmentation techniques in terms of person segmentation. Starting from an initial labelling identical for all the methods used, they were analysed by evaluating their results on public data, considering two points: the level of interaction and the efficiency.
Abstract:
PURPOSE: To assess the technical feasibility of multi-detector row computed tomographic (CT) angiography in the assessment of peripheral arterial bypass grafts and to evaluate its accuracy and reliability in the detection of graft-related complications, including graft stenosis, aneurysmal changes, and arteriovenous fistulas. MATERIALS AND METHODS: Four-channel multi-detector row CT angiography was performed in 65 consecutive patients with 85 peripheral arterial bypass grafts. Each bypass graft was divided into three segments (proximal anastomosis, course of the graft body, and distal anastomosis), resulting in 255 segments. Two readers evaluated all CT angiograms with regard to image quality and the presence of bypass graft-related abnormalities, including graft stenosis, aneurysmal changes, and arteriovenous fistulas. The results were compared with McNemar test with Bonferroni correction. CT attenuation values were recorded at five different locations from the inflow artery to the outflow artery of the bypass graft. These findings were compared with the findings at duplex ultrasonography (US) in 65 patients and the findings at conventional digital subtraction angiography (DSA) in 27. RESULTS: Image quality was rated as good or excellent in 250 (98%) and in 252 (99%) of 255 bypass segments, respectively. There was excellent agreement both between readers and between CT angiography and duplex US in the detection of graft stenosis, aneurysmal changes, and arteriovenous fistulas (kappa = 0.86-0.99). CT angiography and duplex US were compared with conventional DSA, and there was no statistically significant difference (P >.25) in sensitivity or specificity between CT angiography and duplex US for both readers for detection of hemodynamically significant bypass stenosis or occlusion, aneurysmal changes, or arteriovenous fistulas. Mean CT attenuation values ranged from 232 HU in the inflow artery to 281 HU in the outflow artery of the bypass graft. 
CONCLUSION: Multi-detector row CT angiography may be an accurate and reliable technique after duplex US in the assessment of peripheral arterial bypass grafts and detection of graft-related complications, including stenosis, aneurysmal changes, and arteriovenous fistulas.
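The inter-reader agreement quoted above (kappa = 0.86-0.99) is Cohen's kappa, which corrects the observed agreement between two readers for the agreement expected by chance. A self-contained sketch with made-up ratings (not the study's data):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(ratings_a)
    assert n == len(ratings_b) and n > 0
    labels = set(ratings_a) | set(ratings_b)
    # Observed agreement: fraction of items on which the raters agree.
    p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: sum over labels of both raters' marginal frequencies.
    p_chance = sum(
        (ratings_a.count(label) / n) * (ratings_b.count(label) / n)
        for label in labels
    )
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical stenosis calls (1 = stenosis) by two readers on six segments.
kappa = cohens_kappa([1, 1, 0, 0, 1, 0], [1, 1, 0, 0, 0, 0])
```

Values near 1 indicate near-perfect agreement beyond chance, which is why the reported 0.86-0.99 range is described as excellent.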
Abstract:
PURPOSE: Recurrent head and neck cancer is associated with a poor survival prognosis. A high toxicity rate is demonstrated when surgery and/or radiotherapy and/or chemotherapy are combined. Furthermore, the duration of treatment is often not ethically compatible with the expected survival (median survival <1 year). Normal tissue tolerance limits the use of reirradiation, and stereotactic body radiotherapy (SBRT) could offer precise irradiation while sparing healthy tissues. After completion of a feasibility study, the results of a multicentric study (Lille, Nancy & Nice) using SBRT with cetuximab are reported. The aim of the study was to deliver non-toxic, short-course SBRT (2 weeks) in order to obtain the same local control as demonstrated with longer protocols. METHODS AND MATERIALS: Patients with an inoperable recurrence, or a new primary tumor in a previously irradiated area, were included (WHO <3). The reirradiation (RT) dose was 36 Gy in six fractions of 6 Gy, prescribed to the 85% isodose line covering 95% of the PTV, with 5 injections of concomitant cetuximab (CT). All patients had received previous radiotherapy, 85% previous surgery and 48% previous chemotherapy. RESULTS: Between 11/2007 and 08/2010, 60 patients were included (46 men and 14 women); 56 received CT+RT, 3 were not treated and 1 received only CT. Median age was 60 (42-87), and all 56 treated patients had squamous carcinoma and received concomitant cetuximab. Mean time between previous radiotherapy and the start of SBRT was 38 months. Cutaneous toxicity was observed in 41 patients. There was one toxic death from hemorrhage and denutrition. Median follow-up was 11.4 months. At 3 months, the response rate was 58.4% (95% CI: 43.2-72.4%) and the disease control rate was 91.7% (95% CI: 80.0-97.7%). The one-year OS rate was 47.5% (95% CI: 30.8-62.4%). CONCLUSION: These results suggest that short-course SBRT with cetuximab is an effective salvage treatment with a good response rate in this poor-prognosis population with previously irradiated HNC.
Treatment is feasible and, with appropriate care to limit dose to critical structures, acute toxicities are acceptable. This combination may become the reference treatment in this population.
Abstract:
Nowadays the variety of fuels used in power boilers is widening, and new boiler constructions and operating models have to be developed. This research and development is done in small pilot plants, where a faster analysis of the boiler mass and heat balance is needed in order to find and make the right decisions already during the test run. The barrier to determining the boiler balance during test runs is the long process of chemical analyses of the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig to get the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä in Tampere. The calculation was created in the data management computer of the pilot plant's automation system. The calculation is made in a Microsoft Excel environment, which provides a good basis and functions for handling large databases and calculations without any delicate programming. The automation system in the pilot plant was reconstructed and updated by Metso Automation Oy during 2001, and the new system, MetsoDNA, has good data management properties, which is necessary for big calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found: either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance of the boiler is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were implemented in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system.
A sensitivity analysis showed that the most essential values for accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, and on measurement accuracy and boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the work environment.
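The second balance method described above works backwards from the measured heat output: with the fuel's lower heating value and an assumed combustion efficiency, the fuel mass flow can be estimated and the mass balance closed from it. A minimal sketch; the LHV, efficiency and heat output below are illustrative assumptions, not pilot-plant values:

```python
def fuel_mass_flow(heat_output_kw, lhv_kj_per_kg, efficiency=0.90):
    """Fuel flow (kg/s) = heat output / (LHV x combustion efficiency)."""
    return heat_output_kw / (lhv_kj_per_kg * efficiency)

# Illustrative case: the 4 MW pilot boiler burning a ~20 MJ/kg fuel.
flow = fuel_mass_flow(4000.0, 20000.0)  # ~0.22 kg/s under these assumptions
```

This is only the skeleton of the heat-balance route; the actual Excel calculation also accounts for unburned carbon loss, flue gas composition and the unmeasured solid mass flows entered by the user.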
Abstract:
The purpose of this dissertation is to increase the understanding and knowledge of field sales management control systems (i.e. sales managers' monitoring, directing, evaluating and rewarding activities) and their potential consequences for salespeople. This topic is important because past research has indicated that the choice of control system type can, on the one hand, have desirable consequences, such as high levels of motivation and performance, and, on the other hand, lead to harmful unintended consequences, such as opportunistic or unethical behaviors. Despite the fact that marketing and sales management control systems have been under rigorous research for over two decades, the field is still at a very early stage of development, and several inconsistencies can be found in the research results. This dissertation argues that these inconsistencies are mainly derived from misspecification of the level of analysis in past research. The different levels of analysis (i.e. strategic, tactical, and operational) involve very different decision-making situations regarding the control and motivation of the sales force, which should be taken into consideration when conceptualizing control. Moreover, the study of the salesperson consequences of a field sales management control system is actually a cross-level phenomenon, which means that at least two levels of analysis are simultaneously involved. The results of this dissertation confirm the need to re-conceptualize the field sales management control system concept. It provides empirical evidence for the assertion that control should be conceptualized in more detail at the tactical/operational level of analysis than at the strategic level of analysis. Moreover, the results show that some controls are communicated to field salespeople more efficiently than others.
It is proposed that this difference is due to the different purposes of control: some controls are designed to influence salespersons' behavior (aiming at motivation), whereas others are designed to aid decision-making (aiming at providing information). According to the empirical results of this dissertation, both types of controls have an impact on the sales force, but this impact is not as strong as expected. The results obtained in this dissertation shed some light on the nature of field sales management control systems and their consequences for salespeople.
Abstract:
Data traffic caused by mobile advertising client software when it is communicating with the network server can be a pain point for many application developers who are considering advertising-funded application distribution, since the cost of the data transfer might scare their users away from using the applications. For the thesis project, a simulation environment was built to mimic the real client-server solution for measuring the data transfer over varying types of connections with different usage scenarios. For optimising data transfer, a few general-purpose compressors and XML-specific compressors were tried for compressing the XML data, and a few protocol optimisations were implemented. For optimising the cost, cache usage was improved and pre-loading was enhanced to use free connections to load the data. The data traffic structure and the various optimisations were analysed, and it was found that the cache usage and pre-loading should be enhanced and that the protocol should be changed, with report aggregation and compression using WBXML or gzip.
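The compression step can be illustrated with gzip from the Python standard library (WBXML and the XML-specific compressors evaluated in the thesis are not reproduced here); the report payload below is a hypothetical stand-in for the client-server XML traffic:

```python
import gzip

# Hypothetical ad-impression report: repetitive XML of the kind an
# advertising client might upload to the network server.
xml_report = (
    "<reports>"
    + "".join(f'<impression ad="{i}" shown="true"/>' for i in range(50))
    + "</reports>"
)

raw = xml_report.encode("utf-8")
compressed = gzip.compress(raw)
# Repetitive markup compresses well, which is what makes the
# protocol-level compression worthwhile over costly mobile connections.
ratio = len(compressed) / len(raw)
```

Aggregating several reports before compressing, as the thesis concludes, improves the ratio further because the shared markup is amortized over more payload.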
Abstract:
The purpose of this study was to develop a rapid, simple and sensitive quantitation method for pseudoephedrine (PSE), paracetamol (PAR) and loratadine (LOR) in plasma and pharmaceuticals using liquid chromatography-tandem mass spectrometry with a monolithic column. Separation was achieved using a gradient of methanol-0.1% formic acid at a flow rate of 1.0 mL min-1. Mass spectral transitions were recorded in SRM mode. The method was validated for precision, specificity and linearity. The limits of detection for pseudoephedrine, paracetamol and loratadine were determined to be 3.14, 1.86 and 1.44 ng mL-1, respectively, allowing easy determination in plasma with recoveries of 93.12 to 101.56%.
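For context, limits of detection like those quoted above are commonly estimated from calibration data as 3.3 times the standard deviation of the blank (or low-level) response divided by the calibration slope (the ICH-style estimator); the abstract does not state which estimator the authors used, so the sketch below is illustrative only, with made-up numbers:

```python
def limit_of_detection(sd_response, slope):
    """ICH-style LOD estimate: 3.3 x sd of the response / calibration slope."""
    return 3.3 * sd_response / slope

def recovery_percent(measured, nominal):
    """Recovery (%) = measured concentration / nominal concentration x 100."""
    return measured / nominal * 100.0

# Hypothetical inputs; units of the LOD follow the inputs (e.g. ng/mL).
lod = limit_of_detection(0.95, 1.0)
rec = recovery_percent(46.56, 50.0)
```

Recoveries near 100%, as in the 93.12-101.56% range reported, indicate that the extraction and quantitation introduce little bias.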
Abstract:
Nanotubes are among the most promising materials in modern nanotechnologies, which makes this investigation highly relevant. In this work, the magnetic properties of multi-walled nanotubes on a polystyrene substrate are investigated using a SQUID quantum magnetometer. The main purpose was to obtain the magnetic field and temperature dependences of the magnetization and to compare them with existing theoretical models of magnetism in carbon-based structures. During data analysis, a mathematical algorithm for filtering the obtained data was developed, because measurements with a quantum magnetometer produce large arrays of numerical data that contain random errors. These errors stem from drift of the SQUID signal and from errors in different parts of the measurement station. The nanotube samples on the polystyrene substrate were also studied with an atomic force microscope. On the surface, nanotube contours oriented in the horizontal plane were found, a feature caused by the rolling method used to prepare the samples. A detailed comparison of the obtained dependences with the findings of other researchers on this topic allows some conclusions about the nature of magnetism in the samples, underlining the importance and relevance of this work.
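The kind of filtering described above can be sketched as a moving-median pass, which suppresses isolated spikes (random errors) in a magnetization series; the actual algorithm developed in the thesis is not reproduced here, and the sample values are hypothetical:

```python
from statistics import median

def moving_median(values, window=3):
    """Median-filter a 1-D series; endpoints use a truncated window."""
    half = window // 2
    return [
        median(values[max(0, i - half): i + half + 1])
        for i in range(len(values))
    ]

# Hypothetical magnetization readings with one spurious spike at index 2.
cleaned = moving_median([1.0, 1.1, 9.0, 1.2, 1.3])
```

A median filter is a natural first choice here because, unlike a moving average, it removes a single outlier without smearing it into the neighbouring samples; slow SQUID drift would need a separate detrending step.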
Abstract:
This thesis addresses the coolability of porous debris beds in the context of severe accident management of nuclear power reactors. In a hypothetical severe accident at a Nordic-type boiling water reactor, the lower drywell of the containment is flooded, for the purpose of cooling the core melt discharged from the reactor pressure vessel in a water pool. The melt is fragmented and solidified in the pool, ultimately forming a porous debris bed that generates decay heat. The properties of the bed determine the limiting value for the heat flux that can be removed from the debris to the surrounding water without the risk of re-melting. The coolability of porous debris beds has been investigated experimentally by measuring the dryout power in electrically heated test beds that have different geometries. The geometries represent the debris bed shapes that may form in an accident scenario. The focus is especially on heap-like, realistic geometries which facilitate the multi-dimensional infiltration (flooding) of coolant into the bed. Spherical and irregular particles have been used to simulate the debris. The experiments have been modeled using 2D and 3D simulation codes applicable to fluid flow and heat transfer in porous media. Based on the experimental and simulation results, an interpretation of the dryout behavior in complex debris bed geometries is presented, and the validity of the codes and models for dryout predictions is evaluated. According to the experimental and simulation results, the coolability of the debris bed depends on both the flooding mode and the height of the bed. In the experiments, it was found that multi-dimensional flooding increases the dryout heat flux and coolability in a heap-shaped debris bed by 47–58% compared to the dryout heat flux of a classical, top-flooded bed of the same height. However, heap-like beds are higher than flat, top-flooded beds, which results in the formation of larger steam flux at the top of the bed. 
This counteracts the effect of the multi-dimensional flooding. Based on the measured dryout heat fluxes, the maximum height of a heap-like bed can only be about 1.5 times the height of a top-flooded, cylindrical bed in order to preserve the direct benefit from the multi-dimensional flooding. In addition, studies were conducted to evaluate the hydrodynamically representative effective particle diameter, which is applied in simulation models to describe debris beds that consist of irregular particles with considerable size variation. The results suggest that the effective diameter is small, closest to the mean diameter based on the number or length of particles.
Abstract:
The case company utilizes a multi-branding strategy (or house-of-brands strategy) in its product portfolio. In practice the company has multiple brands – one main brand and four acquired brands – which all utilize one single product platform. The objective of this research is to analyze the case company's multi-branding strategy and its benefits and challenges. Moreover, the purpose is to clarify how a company in B2B markets could utilize a multi-branding strategy more efficiently and profitably. The theoretical part of this thesis covers aspects of branding strategies: different brand name architectures, the benefits and challenges of different strategies, and different ways of utilizing branding strategies in mergers and acquisitions. The empirical part, on the other hand, includes a description of the case company's branding strategy and the employees' perspective on the benefits and challenges of the multi-branding strategy, and on how to utilize it more efficiently and profitably. This study shows that the major benefits of multi-branding are lower production costs, the ability to reach wider market coverage, the possibility to utilize common sales tools, synergies in R&D, and shared resources. On the other hand, the major challenges are a lack of product differentiation, internal competition, branding issues in production and deliveries, pricing issues and conflicts, and compromises in product compatibility and suitability. Based on the results, several ways to utilize the multi-branding strategy more efficiently and profitably were found: by putting more effort into brand image and product differentiation, by having more co-operation among the brands, and by focusing on more precise customer and market segmentation.