7 results for Efficient process
in Digital Commons at Florida International University
Abstract:
Three-Dimensional (3-D) imaging is vital in computer-assisted surgical planning, including minimally invasive surgery, targeted drug delivery, and tumor resection. Selective Internal Radiation Therapy (SIRT) is a liver-directed radiation therapy for the treatment of liver cancer. Accurate calculation of anatomical liver and tumor volumes is essential for determining the tumor-to-normal-liver ratio and for calculating the dose of Y-90 microspheres that will concentrate the radiation in the tumor region rather than in nearby healthy tissue. Present manual techniques for segmenting the liver from Computed Tomography (CT) tend to be tedious and depend greatly on the skill of the technician or doctor performing the task.

This dissertation presents the development and implementation of a fully integrated algorithm for 3-D liver and tumor segmentation from tri-phase CT that yields highly accurate estimates of the respective volumes of the liver and tumor(s). The algorithm requires minimal human intervention without compromising the accuracy of the segmentation results. Embedded within it is an effective method for extracting the blood vessels that feed the tumor(s), in order to plan the appropriate treatment effectively.

Liver segmentation achieved an accuracy in excess of 95% in estimating liver volumes across 20 datasets when compared with manual gold-standard volumes. In a similar comparison, tumor segmentation exhibited an accuracy of 86% in estimating tumor volume(s). Qualitative results of the blood vessel segmentation algorithm demonstrated its effectiveness in extracting and rendering the vasculature of the liver. The parallel computing process, using a single workstation, showed a 78% performance gain. A statistical analysis of whether manual initialization affects accuracy showed that the results are independent of user initialization.

The dissertation thus provides a complete 3-D solution for liver cancer treatment planning, with the ability to extract, visualize, and quantify the statistics needed for treatment. Since SIRT requires highly accurate calculation of the liver and tumor volumes, this new method provides the effective and computationally efficient process that such challenging clinical requirements demand.
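As a concrete illustration of the volume arithmetic this abstract relies on, the following is a minimal Python sketch of computing liver and tumor volumes from binary segmentation masks and deriving the tumor-to-normal ratio used in Y-90 dose planning. The mask representation, voxel spacing, and function names are illustrative assumptions, not the dissertation's actual implementation.

```python
# Minimal sketch: volumes and tumor-to-normal (T/N) ratio from binary
# segmentation masks. Masks are assumed to be boolean numpy arrays on the
# same CT grid; the voxel spacing (mm) comes from the CT header.
import numpy as np

def volume_ml(mask: np.ndarray, spacing_mm: tuple) -> float:
    """Volume of a binary mask in millilitres (1 ml = 1000 mm^3)."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return mask.sum() * voxel_mm3 / 1000.0

def tumor_to_normal_ratio(liver_mask, tumor_mask, spacing_mm):
    """T/N ratio: tumor volume over non-tumorous liver volume."""
    tumor_ml = volume_ml(tumor_mask, spacing_mm)
    normal_ml = volume_ml(liver_mask & ~tumor_mask, spacing_mm)
    return tumor_ml / normal_ml

def volume_accuracy(auto_ml: float, manual_ml: float) -> float:
    """Accuracy (%) against a manual gold-standard volume, as in the
    >95% liver / 86% tumor figures the abstract reports."""
    return 100.0 * (1.0 - abs(auto_ml - manual_ml) / manual_ml)
```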
Abstract:
Hazardous radioactive liquid waste is the legacy of more than 50 years of plutonium production associated with the United States' nuclear weapons program. It is estimated that more than 245,000 tons of nitrate wastes are stored at facilities such as the single-shell tanks (SST) at the Hanford Site in the state of Washington and the Melton Valley storage tanks at Oak Ridge National Laboratory (ORNL) in Tennessee. To develop an innovative new technology for the destruction and immobilization of nitrate-based radioactive liquid waste, the United States Department of Energy (DOE) initiated the research project that resulted in the technology known as the Nitrate to Ammonia and Ceramic (NAC) process. The nitrate anion is highly mobile and difficult to immobilize, especially in the relatively porous cement-based grout that has been used to date to immobilize liquid waste, and it therefore presents a major obstacle to environmental clean-up initiatives. Thus, in an effort to contribute to the existing body of knowledge and enhance the efficacy of the NAC process, this research involved the experimental measurement of the rheological and heat transfer behaviors of the NAC product slurry and the determination of the optimal operating parameters for the continuous NAC chemical reaction process. Test results indicate that the NAC product slurry exhibits typical non-Newtonian flow behavior. Correlation equations for the slurry's rheological properties and heat transfer rate in pipe flow have been developed; these should prove valuable in the design of a full-scale NAC processing plant. The 20-percent slurry exhibited typical dilatant (shear-thickening) behavior and was in the turbulent flow regime due to its lower viscosity. The 40-percent slurry exhibited typical pseudoplastic (shear-thinning) behavior and remained in the laminar flow regime throughout its experimental range. The reactions were found to be more efficient in the lower temperature range investigated. With respect to leachability, the experimental final NAC ceramic waste form is comparable to the final product of vitrification, the technology chosen by DOE to treat these wastes. As the NAC process has the potential to reduce the volume of nitrate-based radioactive liquid waste by as much as 70 percent, it promises not only to enhance environmental remediation efforts but also to effect substantial cost savings.
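The dilatant and pseudoplastic behaviors described above are conventionally captured by the Ostwald-de Waele power-law model, tau = K * gamma_dot^n. The sketch below fits that model to shear data and classifies the flow behavior; the model choice and interface are assumptions for illustration, since the abstract does not give the dissertation's actual correlation equations.

```python
# Sketch: fit the power-law model tau = K * gamma_dot**n in log space and
# classify the flow-behavior index n. Input data would come from a
# rheometer; nothing here is the dissertation's measured data.
import numpy as np

def fit_power_law(shear_rate, shear_stress):
    """Least-squares fit of log(tau) = log(K) + n*log(gamma_dot)."""
    n, log_K = np.polyfit(np.log(shear_rate), np.log(shear_stress), 1)
    return np.exp(log_K), n  # consistency index K, flow-behavior index n

def classify(n: float) -> str:
    if n > 1.0:
        return "dilatant (shear-thickening), as reported for the 20% slurry"
    if n < 1.0:
        return "pseudoplastic (shear-thinning), as reported for the 40% slurry"
    return "Newtonian"
```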
Abstract:
The increasing amount of available semistructured data demands efficient mechanisms to store, process, and search an enormous corpus of data in order to encourage its global adoption. Current techniques for storing semistructured documents either map them to relational databases or use a combination of flat files and indexes. Both approaches result in a mismatch between the tree structure of semistructured data and the access characteristics of the underlying storage devices. Furthermore, the inefficiency of XML parsing methods has slowed the large-scale adoption of XML in actual system implementations. The recent development of lazy parsing techniques is a major step towards improving this situation, but lazy parsers still have significant drawbacks that undermine the massive adoption of XML.

Once the processing (storage and parsing) issues for semistructured data have been addressed, another key challenge in leveraging semistructured data is to perform effective information discovery on it. Previous work has addressed this problem in a generic (i.e., domain-independent) way, but the process can be improved if knowledge about the specific domain is taken into consideration.

This dissertation had two general goals. The first was to devise novel techniques to efficiently store and process semistructured documents, with two specific aims: we proposed a method for storing semistructured documents that maps the physical characteristics of the documents to the geometrical layout of hard drives, and we developed a Double-Lazy Parser for semistructured documents that introduces lazy behavior in both the pre-parsing and progressive parsing phases of the standard Document Object Model's parsing mechanism.

The second goal was to construct a user-friendly and efficient engine for performing Information Discovery over domain-specific semistructured documents, also with two aims: we presented a framework that exploits domain-specific knowledge, in the form of domain ontologies, to improve the quality of the information discovery process, and we proposed meaningful evaluation metrics for comparing the results of search systems over semistructured documents.
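The lazy-parsing idea behind the Double-Lazy Parser can be illustrated with a toy DOM-like node that defers building its subtree until first access. This is only a sketch of the general technique under assumed names and structures; the dissertation's parser operates on real XML and has distinct pre-parsing and progressive phases.

```python
# Toy model of lazy parsing: a node keeps its children as unparsed
# fragments and materializes them only when the subtree is visited.
class LazyNode:
    def __init__(self, tag: str, raw_children: list):
        self.tag = tag
        self._raw = raw_children   # unparsed (tag, raw_children) fragments
        self._children = None      # subtree is built only on first access

    @property
    def children(self):
        # Progressive parsing: the cost of building a subtree is paid
        # only when that subtree is actually needed.
        if self._children is None:
            self._children = [LazyNode(tag, raw) for tag, raw in self._raw]
        return self._children

# The document skeleton is pre-parsed cheaply; accessing book.children
# is what triggers parsing of the chapter fragments.
book = LazyNode("book", [("chapter", []), ("chapter", [])])
```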
Abstract:
Buffered crossbar switches have recently attracted considerable attention as the next generation of high-speed interconnects. They are a special type of crossbar switch with an exclusive buffer at each crosspoint of the crossbar. They demonstrate unique advantages over traditional unbuffered crossbar switches, such as high throughput, low latency, and asynchronous packet scheduling. However, since crosspoint buffers are expensive on-chip memories, it is desirable that each crosspoint have only a small buffer. This dissertation proposes a series of practical algorithms and techniques for efficient packet scheduling in buffered crossbar switches. To reduce the hardware cost of such switches and make them scalable, we considered partially buffered crossbars, whose crosspoint buffers can be arbitrarily small. First, we introduced a hybrid scheme called the Packet-mode Asynchronous Scheduling Algorithm (PASA) to schedule best-effort traffic. PASA combines features of both distributed and centralized scheduling algorithms and can directly handle variable-length packets without Segmentation And Reassembly (SAR). We showed by theoretical analysis that it achieves 100% throughput for any admissible traffic in a crossbar with a speedup of two. Moreover, outputs in PASA have a high probability of avoiding the more time-consuming centralized scheduling process and can thus make fast scheduling decisions. Second, we proposed the Fair Asynchronous Segment Scheduling (FASS) algorithm to handle guaranteed-performance traffic with explicit flow rates. FASS reduces the crosspoint buffer size by dividing packets into shorter segments before transmission. It also provides tight constant performance guarantees by emulating the ideal Generalized Processor Sharing (GPS) model. Furthermore, FASS requires no speedup for the crossbar, lowering the hardware cost and improving the switch capacity. Third, we presented a bandwidth allocation scheme called Queue Length Proportional (QLP) to apply FASS to best-effort traffic. QLP dynamically obtains a feasible bandwidth allocation matrix based on queue length information, and thus helps the crossbar switch be more work-conserving. The feasibility and stability of QLP were proved for both uniform and non-uniform traffic distributions. Hence, based on the bandwidth allocation of QLP, FASS can also achieve 100% throughput for best-effort traffic in a crossbar without speedup.
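The feasibility idea behind a queue-length-proportional allocation can be sketched as follows: scale each flow's queue length so that no input or output of the crossbar is over-subscribed, i.e. every row and column of the rate matrix sums to at most 1. This is a simplified rule assumed for illustration; the dissertation's exact QLP scheme may differ.

```python
# Sketch: rates proportional to queue lengths, scaled per input/output
# so the resulting matrix is feasible (doubly substochastic).
import numpy as np

def qlp_allocate(q: np.ndarray) -> np.ndarray:
    """q[i][j]: queue length from input i to output j; returns rate matrix."""
    row = q.sum(axis=1, keepdims=True)   # load offered at each input
    col = q.sum(axis=0, keepdims=True)   # load destined to each output
    scale = np.maximum(row, col)         # dividing by the larger sum keeps
                                         # every row and column sum <= 1
    return np.divide(q, scale, out=np.zeros_like(q, dtype=float),
                     where=scale > 0)
```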
Abstract:
A commonly held view is that the creation of excessive domestic credit may lead to inflation; however, many economists uphold the possibility that generous domestic credit, under appropriate conditions, will result in increases in output. This hypothesis is examined for Japan and Colombia for the period 1950-1993.

Domestic credit theories are reviewed from the times of Thornton and Smith to the recent times of Lewis, McKinnon, Stiglitz, and Japanese economists such as K. Emi, R. Tachi, and others. It is found that in post-war Japan, efficient financial markets and the decisive role of the government in orienting investment decisions seem to have positively influenced the effectiveness of domestic credit as an output-stimulating variable. In Colombia, by contrast, the absence of these features seems to explain why domestic credit is not very effective as an output-stimulating variable.

Multiple regression analyses show that domestic credit is a strong explanatory variable for output increases in Japan and a weak one in Colombia's case over the studied period. For Japan, the correlation depicts a positive relationship between the two variables with a decreasing rate, very similar to a typical production function. Moreover, the positive decreasing rate is confirmed if net domestic credit is used in the correlations. For Colombia, a positive relationship is also found when accumulated domestic credit is used, but if net domestic credit is the source of the correlations, the positive decreasing rate is not obtained.

Granger causality tests determined causality from domestic credit to output for Japan and no causality for Colombia at the 1% significance level. The differences are explained by: (1) the low level of development of the financial system in Colombia; (2) the absence of a consistent domestic credit policy to foster economic development; and (3) the lack of authoritative orientation in the allocation of financial resources and the absence of long-range industrialization programs in Colombia that could productively channel credit resources. For the system of equations relating domestic credit and exports, the Granger causality tests determined no causality between domestic credit and exports for both Japan and Colombia, also at the 1% significance level.
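A Granger causality test of the kind described can be run with statsmodels; the sketch below uses synthetic stand-in series, not the 1950-1993 Japanese or Colombian data.

```python
# Sketch: does domestic credit Granger-cause output? Synthetic data only.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
credit = rng.normal(size=100).cumsum()               # placeholder credit series
output = np.roll(credit, 2) + rng.normal(size=100)   # output lagging credit

# Column order matters: the test asks whether the 2nd column (credit)
# helps predict the 1st (output). Compare each lag's p-value against
# 0.01, the 1% significance level used in the study.
data = np.column_stack([output, credit])
results = grangercausalitytests(data, maxlag=4)
```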
Abstract:
The private management of public housing is an important topic for critical analysis as governments search for ways to increase efficiency in providing housing for the poor. Public housing authorities must address the cost of repairing or replacing the deteriorating housing stock, the increasing need for affordable housing, and the lack of supply. Growing pressure for the efficient use of public funds has heightened the need for profound structural reform. An important strategy for carrying out such reform is privatization. Although privatization does not work in every case, the majority position in the traditional privatization literature is that reliance on private organizations normally, but not always, results in cost savings.

The primary purpose of this dissertation is to determine whether a consensus exists among decision-makers on the efficiency of privatizing the management of public housing. A secondary purpose is to review the techniques (best practices) used by the private sector that result in cost-efficiencies in the management of public housing. The study employs a triangulated research design using cross-sectional survey methodology, including a survey instrument to solicit responses from private managers. It also uses qualitative methods: interviews with key informants from private-sector management firms and public housing agencies, case studies, focus groups, archival records, and housing authority documents.

Results indicated that important decision-makers perceive that private managers made a positive contribution to cost-efficiencies in the management of public housing. The performance of private contractors served as a yardstick for comparing the efficiency of services produced in-house. The study concluded that private managers made the benefits of their management techniques well known, creating a sense of competition between public and private managers. Competition from private contractors spurred improvements in municipal worker and management productivity, yielding better management results for the public housing authorities. These results are in concert with recent research and studies, which also concluded that private managers have some distinct advantages in controlling costs in the management of public housing.
Abstract:
With developments in computing and communication technologies, wireless sensor networks have become popular in a wide range of application areas such as health, military, environmental, and habitat monitoring. In particular, wireless acoustic sensor networks have been widely used for target tracking applications due to their passive nature, reliability, and low cost. Traditionally, acoustic sensor arrays built in linear, circular, or other regular shapes are used for tracking acoustic sources. Maintaining the relative geometry of the acoustic sensors in the array is vital for accurate target tracking, which greatly reduces the flexibility of the sensor network. To overcome this limitation, we propose using only a single acoustic sensor at each sensor node. This design greatly improves the flexibility of the sensor network and makes it possible to deploy the network in remote or hostile regions through air-drop or other stealth approaches. Acoustic arrays are capable of performing target localization or generating bearing estimates on their own; with only a single acoustic sensor, however, sensor nodes cannot generate such measurements. Thus, self-organization of sensor nodes into virtual arrays to perform target localization is essential.

We developed an energy-efficient and distributed self-organization algorithm for target tracking using wireless acoustic sensor networks. The major error sources of the localization process were studied, and an energy-aware node selection criterion was developed to minimize target localization errors. Using this criterion, the self-organization algorithm selects a near-optimal localization sensor group to minimize target tracking errors. In addition, a message passing protocol was developed to implement the self-organization algorithm in a distributed manner. To achieve extended sensor network lifetime, energy conservation was built into the self-organization algorithm through a sleep-wakeup management mechanism with a novel cross-layer adaptive wakeup-probability adjustment scheme.

The simulation results confirm that the developed self-organization algorithm provides satisfactory target tracking performance. Moreover, the energy-saving analysis confirms the effectiveness of the cross-layer power management scheme in extending sensor network lifetime without degrading target tracking performance.
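An energy-aware node selection criterion of this kind might look like the simple score below, which trades expected localization contribution (nodes nearer the estimated target hear it better) against remaining battery energy. The score, the weighting alpha, and the node representation are invented for illustration; the dissertation's actual criterion is derived from its localization error analysis.

```python
# Sketch: pick k nodes for a virtual localization array by a weighted
# trade-off between proximity to the estimated target and residual energy.
import math

def select_nodes(nodes, target_xy, k=4, alpha=0.7):
    """nodes: list of (x, y, residual_energy); returns the k best-scoring nodes."""
    def score(n):
        x, y, energy = n
        dist = math.hypot(x - target_xy[0], y - target_xy[1])
        # alpha weights accuracy against energy conservation (assumed form)
        return alpha / (1.0 + dist) + (1.0 - alpha) * energy
    return sorted(nodes, key=score, reverse=True)[:k]
```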