832 results for PACS: design and graphics IT applications
Abstract:
Information management is a key aspect of successful construction projects. Inaccurate measurements and conflicting data can lead to costly mistakes, and vague quantities can ruin estimates and schedules. Building information modeling (BIM) augments a 3D model with a wide variety of information, which reduces many sources of error and can detect conflicts before they occur. Because new technology is often more complex, it can be difficult to integrate effectively with existing business practices. In this paper, we answer two questions: How can BIM add value to construction projects? And what lessons can be learned from other companies that use BIM or similar technology? Previous research treated the technology as if it were simply a tool, observing problems that occurred while integrating it into existing practices. Our research instead looks at the flow of information through a company and its network, seeing all the actors as part of an ecosystem. Building upon this idea, we propose the metaphor of an information supply chain to illustrate how BIM can add value to a construction project. The paper concludes with two case studies. The first illustrates a failure in the flow of information that could have been prevented by using BIM. The second profiles a leading design firm that has used BIM products for many years and shows the real benefits of using this technology.
DESIGN AND IMPLEMENT DYNAMIC PROGRAMMING BASED DISCRETE POWER LEVEL SMART HOME SCHEDULING USING FPGA
Abstract:
With the development and capabilities of the Smart Home system, household appliances are no longer controlled only by people but also operated by a smart system, resulting in a more efficient, convenient, comfortable, and environmentally friendly living environment. A critical part of the Smart Home system is home automation, in which a Micro-Controller Unit (MCU) controls all household appliances and schedules their operating times. This reduces electricity bills by shifting power consumption from on-peak hours to off-peak hours according to time-varying hourly prices. In this paper, we propose an algorithm for scheduling multi-user power consumption and implement it on an FPGA board, using the FPGA as the MCU. The algorithm schedules tasks with discrete power levels and is based on dynamic programming, which finds a scheduling solution close to the optimal one. We chose an FPGA as the system's controller because it has low complexity, parallel processing capability, and a large number of I/O interfaces for further development, and it is programmable in both software and hardware. In conclusion, the algorithm runs quickly on the FPGA board and the solutions obtained are good enough for consumers.
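To illustrate the idea, here is a minimal sketch (in Python rather than FPGA hardware, with hypothetical names such as `schedule`, `prices`, and `levels`) of a dynamic program for discrete-power-level scheduling of a single task: each hour the task may draw one of a few discrete power levels, and the program minimizes the total electricity cost of delivering a required amount of energy. This is a simplified single-task analogue, not the authors' multi-user algorithm.

```python
def schedule(prices, levels, demand):
    """Minimize the cost of delivering `demand` energy units over
    len(prices) hours, choosing one discrete power level per hour.
    `levels` should include 0 (idle) so partial plans can be carried
    forward.  Returns (min_cost, per-hour plan), or (None, None) if
    the demand cannot be met."""
    T = len(prices)
    INF = float("inf")
    # cost[e] = cheapest cost so far to have delivered e energy units
    cost = [INF] * (demand + 1)
    cost[0] = 0.0
    # choice[t][e] = power level chosen at hour t to reach state e
    choice = [[None] * (demand + 1) for _ in range(T)]
    for t, price in enumerate(prices):
        new = [INF] * (demand + 1)
        for e in range(demand + 1):
            if cost[e] == INF:
                continue
            for p in levels:
                if e + p <= demand and cost[e] + price * p < new[e + p]:
                    new[e + p] = cost[e] + price * p
                    choice[t][e + p] = p
        cost = new
    if cost[demand] == INF:
        return None, None
    # Backtrack through the recorded choices to recover the plan.
    plan, e = [], demand
    for t in range(T - 1, -1, -1):
        p = choice[t][e]
        plan.append(p)
        e -= p
    return cost[demand], plan[::-1]
```

For example, with hourly prices [3, 1, 2], levels {0, 1, 2}, and a demand of 3 energy units, the cheapest plan concentrates consumption in the cheap hours.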
Abstract:
The widespread availability of low-cost embedded electronics makes it easier to implement smart devices that can understand either the environment or user behavior. The main objective of this project is to design and implement portable smart electronics for home use: a portable monitoring device for home and office security, and a portable 3D mouse. Both devices use the MPU6050, which contains a 3-axis accelerometer and a 3-axis gyroscope, to sense the inertial motion of a door or of the user's hands. In the monitoring device, the MPU6050 senses door movement (either a front door or a cabinet door) through the gyroscope, and a Raspberry Pi processes the data it receives from the MPU6050; if a value exceeds a preset threshold, the Raspberry Pi triggers a USB webcam to take a picture and sends an alert email with the picture to the user. The advantages of this device are that it is a small, portable, stand-alone unit with its own power source, easy to implement, inexpensive for residential use, and energy efficient with instantaneous alerts. For the 3D mouse, the MPU6050 uses both the accelerometer and the gyroscope to sense hand movement; the data are processed by an MSP430G2553 through a digital smoothing filter and a complementary filter, and the filtered data are then passed to a personal computer through a serial COM port. By applying a cursor-movement equation in the PC driver, the device works well as a mouse with acceptable accuracy. Compared to a typical optical mouse, this mouse needs no working surface; with the smoothing and complementary filters it has adequate accuracy for normal use, and it could be extended to a portable mouse as small as a finger ring.
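The complementary filter mentioned above is a standard sensor-fusion technique; a minimal sketch follows, in which the weighting constant `ALPHA` and the function names are assumptions for illustration, not values from the project. The gyroscope term integrates angular rate (accurate short-term but drifting), while the accelerometer supplies an absolute but noisy tilt angle that slowly pulls the estimate back.

```python
import math

ALPHA = 0.98  # assumed weighting: trust gyro short-term, accel long-term

def complementary_filter(angle, gyro_rate, accel_angle, dt):
    """One filter update: integrate the gyro rate over dt, then blend in
    the accelerometer's absolute angle estimate to correct drift."""
    return ALPHA * (angle + gyro_rate * dt) + (1.0 - ALPHA) * accel_angle

def accel_tilt(ax, az):
    """Tilt angle (degrees) about one axis from raw accelerometer axes."""
    return math.degrees(math.atan2(ax, az))
```

Calling the update repeatedly with a stationary sensor (`gyro_rate = 0`) makes the estimate converge toward the accelerometer's tilt angle, which is exactly the drift-correction behavior the device relies on.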
Abstract:
Gene-directed enzyme prodrug therapy is a form of cancer therapy in which delivery of a gene encoding an enzyme converts a prodrug, a pharmacologically inactive molecule, into a potent cytotoxin. Currently, delivery of gene and prodrug is a two-step process. Here, we propose a one-step method using polymer nanocarriers to deliver prodrug, gene, and cytotoxic drug simultaneously to malignant cells. The prodrugs acyclovir, ganciclovir, and 5-doxifluridine were used directly to initiate ring-opening polymerization of epsilon-caprolactone, forming a hydrophobic prodrug-tagged poly(epsilon-caprolactone) that was further grafted with hydrophilic polymers (methoxy poly(ethylene glycol), chitosan, or polyethylenimine) to form amphiphilic copolymers for micelle formation. Successful synthesis of the copolymers and micelle formation was confirmed by standard analytical means. Conversion of prodrugs to their cytotoxic forms was analyzed by both two-step and one-step means: first by delivering the gene plasmid into the HT29 cell line and then challenging the cells with the prodrug-tagged micelle carriers, and second by complexing the gene plasmid onto the micelle nanocarriers and delivering gene and prodrug simultaneously to parental HT29 cells. The anticancer effectiveness of the prodrug-tagged micelles was further enhanced by encapsulating the chemotherapy drugs doxorubicin or SN-38; viability of the colon cancer cell line HT29 was significantly reduced. Furthermore, in an effort to develop a stealth and targeted carrier, a CD47-streptavidin fusion protein was attached to the micelle surface using biotin-streptavidin affinity. CD47, a marker of self on the red blood cell surface, was chosen for its antiphagocytic efficacy; micelles bearing CD47 showed antiphagocytic behavior when exposed to J774A.1 macrophages.
Since CD47 is not only an antiphagocytic ligand but also an integrin-associated protein, it was used to target integrin alpha(v)beta(3), which is overexpressed on tumor-activated neovascular endothelial cells. Results showed that CD47-tagged micelles had enhanced uptake when applied to PC3 cells, which highly express alpha(v)beta(3). The multifunctional polymeric micelle carriers developed here could offer a new platform for an innovative cancer therapy regimen.
Abstract:
Projects in the area of architectural design and urban planning typically engage several architects as well as experts from other professions. While design and review meetings thus often involve a large number of cooperating participants, the actual design is still done by individuals between those meetings using desktop PCs and CAD applications. A truly collaborative approach to architectural design and urban planning is often limited to early paper-based sketches. In order to overcome these limitations, we designed and realized the ARTHUR system, an Augmented Reality (AR) enhanced round table to support complex design and planning decisions for architects. While AR has been applied to this area before, our approach does not try to replace the use of CAD systems but rather integrates them seamlessly into the collaborative AR environment. The approach is enhanced by intuitive interaction mechanisms that can be easily configured for different application scenarios.
Abstract:
Using stress and coping as a unifying theoretical concept, a series of five models was developed to synthesize the survey questions and to classify information. These models identified the question, listed the research study, described measurements, listed workplace data, and listed industry and national reference data. A set of 38 instrument questions was developed within the five coping-correlate categories, along with a set of 22 stress symptoms. The study was conducted within two groups, police and professors, on a large university campus. The groups were selected because their occupations were diverse, but they were part of the same macroenvironment. The premise was that police officers would be more highly stressed than professors. Of a total study group of 80, there were 37 respondents. A difference in mean stress responses was observable between the two groups, while both the responses and the stress levels were similar within each group. While the response to the survey instrument was good, only 3 respondents answered the stress-symptom survey properly. It was determined that none of the 37 respondents believed that they were ill. This perception of being well was also evidenced by the grand mean of the stress scores of 2.76 (3.0 = moderate stress). This also caused fewer independent variables to be entered into the multiple regression model. The survey instrument was carefully designed to be universal. Universality is the ability to transcend occupational or regional definitions as applied to stress. It is the ability to measure responses within broad categories such as physiological, emotional, behavioral, social, and cognitive functions without losing the ability to measure the detail within individual questions, or the relationships between questions and categories. Replication is much easier to achieve with standardized categories, questions, and measurement procedures such as those developed for the universal survey instrument. Because the survey instrument is universal, it can be used as an analytical device, an assessment device, a basic tool for planning, and a follow-up instrument to measure individual response to planned reductions in occupational stress. (Abstract shortened with permission of author.)
Abstract:
Most medical implants run on batteries, which require costly and tedious replacement or recharging. It is believed that micro-generators utilizing intracorporeal energy could solve these problems; however, such generators do not, at this time, meet the energy requirements of medical implants. This paper highlights some essential aspects of designing and implementing a power source that scavenges energy from arterial expansion and contraction to operate an implanted medical device. After evaluating various potentially viable transduction mechanisms, the fabricated prototype employs an electromagnetic transduction mechanism. The artery is inserted into a laboratory-fabricated flexible coil that is permitted to deform freely in a magnetic field. This work also investigates the effects of the arterial wall's material properties on energy-harvesting potential. For that purpose, two types of arteries were tested: a Penrose X-ray tube, which behaves elastically, and an artery of a Göttinger minipig, which behaves viscoelastically. No noticeable difference could be observed between the two cases. For the pig artery, the average harvestable power was 42 nW and the peak power was 2.38 μW. Both values are higher than those of the current state of the art (6 nW / 16 nW). A theoretical model of the prototype was developed and compared to the experimental results.
Abstract:
INTRODUCTION Even though arthroplasty of the ankle joint is considered to be an established procedure, only about 1,300 endoprostheses are implanted in Germany annually. Arthrodeses of the ankle joint are performed almost three times more often. This may be due to the availability of the procedure - more than twice as many providers perform arthrodesis - as well as the postulated high frequency of revision procedures of arthroplasties in the literature. In those publications, however, there is often no clear differentiation between revision surgery with exchange of components, subsequent interventions due to complications, and subsequent surgery not associated with complications. The German Orthopaedic Foot and Ankle Association's (D. A. F.) registry for total ankle replacement collects data pertaining to perioperative complications as well as the cause, nature, and extent of subsequent interventions, and postoperative patient satisfaction. MATERIAL AND METHODS The D. A. F.'s total ankle replacement register is a nationwide, voluntary registry. After giving written informed consent, patients can be added to the database by participating providers. Data are collected during the hospital stay for surgical treatment, during routine follow-up inspections, and in the context of revision surgery. The information can be submitted in paper-based or online formats. The survey instruments are available as minimum data sets or scientific questionnaires which include patient-reported outcome measures (PROMs). The pseudonymous clinical data are collected and evaluated at the Institute for Evaluative Research in Medicine, University of Bern, Switzerland (IEFM). The patient-related data remain on the register's module server in North Rhine-Westphalia, Germany. The registry's methodology as well as the results on revisions and patient satisfaction for 115 patients with a two-year follow-up period are presented.
Statistical analyses are performed with SAS™ (Version 9.4, SAS Institute, Inc., Cary, NC, USA). RESULTS About 2½ years after the register was launched, 621 datasets on primary implantations, 1,427 on follow-up examinations, and 121 records on re-operations are available. 49 % of the patients received their implants due to post-traumatic osteoarthritis, 27 % because of primary osteoarthritis, and 15 % suffered from a rheumatic disease. More than 90 % of the primary interventions proceeded without complications. Subsequent interventions were recorded for 84 patients, which corresponds to a rate of 13.5 % with respect to the primary implantations. It should be noted that these secondary procedures also include two-stage procedures not due to a complication. "True revisions" are interventions with exchange of components due to mechanical complications and/or infection and were present in 7.6 % of patients. 415 of the patients commented on their satisfaction with the operative result during the last follow-up: 89.9 % evaluated their outcome as excellent or good, 9.4 % as moderate, and only 0.7 % (3 patients) as poor. In these three cases, component loosening or symptomatic subtalar (USG) osteoarthritis was present. Two-year follow-up data using the American Orthopedic Foot and Ankle Society Ankle and Hindfoot Scale (AOFAS-AHS) are already available for 115 patients. The median AOFAS-AHS score increased from 33 points preoperatively to more than 80 points three to six months postoperatively. This increase remained nearly constant over the entire two-year follow-up period. CONCLUSION Covering less than 10 % of the approximately 240 providers in Germany and approximately 12 % of the annually implanted total ankle replacements, the D. A. F. register is still far from being a national registry.
Nevertheless, geographical coverage and the inclusion of "high-volume" surgeons (more than 100 total ankle replacements a year) and "low-volume" surgeons (fewer than 5 total ankle replacements a year) make the register representative of Germany. The registry data show that the number of subsequent interventions, and in particular the "true revision" procedures, is markedly lower than the 20 % often postulated in the literature. In addition, a high level of patient satisfaction over the short and medium term is recorded. From the perspective of the authors, these results indicate that total ankle arthroplasty, given a correct indication and appropriate selection of patients, is not inferior to ankle arthrodesis with respect to patient satisfaction and function. The first valid survival rates can be expected about 10 years after the register's start.
Abstract:
Our research goals are focused on the preparation of novel molecule-based materials that possess specifically designed properties in solution or in the solid state, e.g. self-assembly, magnetism, conductivity, and spin-crossover phenomena. Most of our systems incorporate paramagnetic transition-metal ions, and the search for new molecule-based magnetic materials is a prominent theme. Specific areas of research include the preparation and study of oxalate-based 2D and 3D magnets, probing the versatility of octacyanometalate building blocks as precursors for new molecular magnets, and the preparation of new tetrathiafulvalene (TTF) derivatives for applications in molecular and supramolecular chemistry.
Abstract:
XENON is a dark matter direct-detection project, consisting of a time projection chamber (TPC) filled with liquid xenon as the detection medium. The construction of the next-generation detector, XENON1T, is presently taking place at the Laboratori Nazionali del Gran Sasso (LNGS) in Italy. It aims at a sensitivity to spin-independent cross sections of 2 × 10⁻⁴⁷ cm² for WIMP masses around 50 GeV/c², which requires a background reduction by two orders of magnitude compared to XENON100, the current-generation detector. An active system that is able to tag muons and muon-induced backgrounds is critical for this goal. A water Cherenkov detector of ~10 m height and diameter has therefore been developed, equipped with 8-inch photomultipliers and clad with a reflective foil. We present the design and optimization study for this detector, which has been carried out with a series of Monte Carlo simulations. The muon veto will reach very high detection efficiencies for muons (> 99.5 %) and for showers of secondary particles from muon interactions in the rock (> 70 %). Similar efficiencies will be obtained for XENONnT, the upgrade of XENON1T, which will later improve the WIMP sensitivity by another order of magnitude. With the Cherenkov water shield studied here, the background from muon-induced neutrons in XENON1T is negligible.
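As a toy illustration of how such efficiencies are estimated from simulated events, the sketch below counts tagged muons over Monte Carlo trials and attaches a binomial uncertainty. The function name and the flat per-muon detection probability are assumptions for illustration only, not the detector simulation used in the study.

```python
import random

def veto_efficiency(n_events, detect_prob, seed=0):
    """Toy Monte Carlo: simulate n_events muons, each tagged with
    probability detect_prob, and return the estimated efficiency
    together with its binomial standard error."""
    rng = random.Random(seed)
    tagged = sum(rng.random() < detect_prob for _ in range(n_events))
    eff = tagged / n_events
    err = (eff * (1 - eff) / n_events) ** 0.5
    return eff, err
```

In a real study the per-event tagging outcome comes from a full detector simulation rather than a single probability, but the efficiency estimate and its statistical uncertainty are computed in the same way.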
Abstract:
Cleverly designed molecular building blocks provide chemists with the tools of a powerful molecular-scale construction set. They enable them to engineer materials having a predictable order and useful solid-state properties. Hence, it is in the realm of supramolecular chemistry to follow a strategy for synthesizing materials which combine a selected set of properties, for instance from the areas of magnetism, photophysics, and electronics. As a successful approach, host/guest solids based on extended anionic, homo- and bimetallic oxalato-bridged transition-metal compounds with two- and three-dimensional connectivities have been investigated. In this report, a brief review is given of the structural aspects of this class of compounds, followed by a thermal and magnetic study of two distinct heterometallic oxalato-bridged layer compounds.
Abstract:
The influence of respiratory motion on patient anatomy poses a challenge to accurate radiation therapy, especially in lung cancer treatment. Modern radiation therapy planning uses models of tumor respiratory motion to account for target motion in targeting. The tumor motion model can be verified on a per-treatment-session basis with four-dimensional cone-beam computed tomography (4D-CBCT), which acquires an image set of the dynamic target throughout the respiratory cycle during the therapy session. 4D-CBCT is undersampled if the scan time is too short; however, a short scan time is desirable in clinical practice to reduce patient setup time. This dissertation presents the design and optimization of 4D-CBCT to reduce the impact of undersampling artifacts at short scan times. This work measures the impact of undersampling artifacts on the accuracy of target motion measurement under different sampling conditions and for various object sizes and motions. The results provide a minimum scan time such that the target tracking error is less than a specified tolerance. This work also presents new image reconstruction algorithms that reduce undersampling artifacts in undersampled datasets by exploiting the assumption that the relevant motion of interest is contained within a volume of interest (VOI). It is shown that the VOI-based reconstruction provides more accurate image intensity than standard reconstruction: in a study designed to simulate target motion, it produced 43% lower least-squares error inside the VOI and 84% lower error throughout the image. The VOI-based reconstruction approach can reduce acquisition time and improve image quality in 4D-CBCT.
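The VOI error comparison can be sketched in a few lines; the function and array names are hypothetical, and the metric shown (sum of squared intensity differences, optionally restricted to a boolean VOI mask) is only an assumed form of the least-squares error reported above.

```python
import numpy as np

def lsq_error(recon, truth, mask=None):
    """Sum of squared intensity differences between a reconstruction and
    a reference volume, optionally restricted to a boolean VOI mask."""
    diff = recon - truth
    if mask is not None:
        diff = diff[mask]  # keep only voxels inside the volume of interest
    return float(np.sum(diff ** 2))
```

A relative error reduction between two reconstructions then follows as 100 * (1 - e_new / e_old), the form in which the percentages above are quoted.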
Abstract:
Nowadays, computing platforms consist of a very large number of components that must be supplied with different voltage levels and power requirements. Even a very small platform, like a handheld computer, may contain more than twenty different loads and voltage regulators. The power delivery designers of these systems are required to provide, in a very short time, the right power architecture that optimizes performance and meets electrical specifications plus cost and size targets. The appropriate selection of the architecture and converters directly defines the performance of a given solution. Therefore, the designer needs to be able to evaluate a significant number of options in order to know with good certainty whether the selected solutions meet the size, energy efficiency, and cost targets. The difficulty of selecting the right solution arises from the wide range of power conversion products provided by different manufacturers. These products range from discrete components (to build converters) to complete power conversion modules that employ different manufacturing technologies. Consequently, in most cases it is not possible to analyze all the alternatives (combinations of power architectures and converters) that can be built, and the designer has to select a limited number of converters in order to simplify the analysis. In this thesis, to overcome these difficulties, a new design methodology for power supply systems is proposed. The methodology integrates evolutionary computation techniques to make it possible to analyze a large number of possibilities. This exhaustive analysis helps the designer to quickly define a set of feasible solutions and select the best performance trade-off for each application. The proposed approach consists of two key steps: one for the automatic generation of architectures and another for the optimized selection of components. This thesis details the implementation of both steps. The usefulness of the methodology is corroborated using real problems and experiments designed to test the limits of the algorithms.
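A minimal evolutionary-selection sketch in the spirit of the methodology is given below: a genetic algorithm picks one converter per load to minimize a weighted sum of component cost and dissipated power. The converter catalog, load powers, fitness weights, and algorithm parameters are all invented for illustration; the thesis's actual architecture-generation and component-selection steps are far richer.

```python
import random

# Hypothetical converter catalog: (cost, efficiency) options per load.
CATALOG = [
    [(2.0, 0.90), (3.5, 0.95)],               # load 0
    [(1.0, 0.85), (1.8, 0.92), (4.0, 0.97)],  # load 1
    [(2.5, 0.88), (5.0, 0.96)],               # load 2
]
LOAD_POWER = [10.0, 5.0, 20.0]  # watts drawn by each load (assumed)

def fitness(genome, w_cost=1.0, w_loss=0.5):
    """Lower is better: weighted component cost plus conversion losses."""
    cost = sum(CATALOG[i][g][0] for i, g in enumerate(genome))
    loss = sum(LOAD_POWER[i] * (1 / CATALOG[i][g][1] - 1)
               for i, g in enumerate(genome))
    return w_cost * cost + w_loss * loss

def evolve(pop_size=20, generations=30, seed=1):
    """Genetic algorithm over converter choices (one gene per load)."""
    rng = random.Random(seed)
    pop = [[rng.randrange(len(opts)) for opts in CATALOG]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]       # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = rng.sample(survivors, 2)
            child = [rng.choice(pair) for pair in zip(a, b)]  # crossover
            if rng.random() < 0.2:                            # mutation
                i = rng.randrange(len(child))
                child[i] = rng.randrange(len(CATALOG[i]))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)
```

In the thesis the search space is combinations of full power architectures and converters rather than a flat per-load choice, but the evolutionary loop (evaluate, select, recombine, mutate) has this same shape.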