6 results for pacs: equipment and software evaluation methods
in Digital Commons at Florida International University
Abstract:
This research examines evolving issues in applied computer science and applies economic and business analyses as well. There are two main areas. The first is internetwork communications as embodied by the Internet. The goal of the research is to devise an efficient pricing, prioritization, and incentivization plan that could realistically be implemented on the existing infrastructure. Criteria include practical and economic efficiency and proper incentives for both users and providers. Background information on the evolution and functional operation of the Internet is given, and relevant literature is surveyed and analyzed. Economic analysis is performed on the incentive implications of the current pricing structure and organization. The problems are identified, and minimally disruptive solutions are proposed at all levels of implementation, down to the lowest-level protocol. Practical issues are considered and performance analyses are done. The second area of research is mass-market software engineering and how it differs from classical software engineering. Software life-cycle revenues are analyzed, and software pricing and timing implications are derived. A profit-maximizing methodology is developed to select or defer the development of software features for inclusion in a given release. An iterative model of the stages of the software development process is developed, taking into account new communications capabilities as well as profitability.
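The profit-maximizing feature selection and deferral step can be sketched as a knapsack-style optimization: pick the subset of candidate features whose net contribution is highest within a development-capacity budget. This is a minimal illustration; the feature names, costs, and revenues are hypothetical, and the dissertation's actual methodology may differ.

```python
# Sketch: choose release features to maximize net profit under a
# development-capacity budget (0/1 knapsack; all numbers hypothetical).

def select_features(features, capacity):
    """features: list of (name, dev_cost, expected_revenue) tuples."""
    # best[c] = (net_profit, chosen_names) achievable within capacity c
    best = [(0, [])] * (capacity + 1)
    for name, cost, revenue in features:
        # Iterate capacities downward so each feature is used at most once.
        for c in range(capacity, cost - 1, -1):
            candidate = best[c - cost][0] + revenue - cost
            if candidate > best[c][0]:
                best[c] = (candidate, best[c - cost][1] + [name])
    return best[capacity]

features = [("search", 3, 10), ("sync", 5, 9), ("themes", 2, 4)]
profit, chosen = select_features(features, 7)  # -> 9, ["search", "themes"]
```

Features that do not make the cut for this release are the ones deferred to a later one.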
Abstract:
Accurate knowledge of the time since death, or postmortem interval (PMI), has enormous legal, criminological, and psychological impact. In this study, an investigation was made to determine whether the relationship between the degradation of the human cardiac structural protein Cardiac Troponin T and PMI could be used as an indicator of time since death, thus providing a rapid, high-resolution, sensitive, and automated methodology for the determination of PMI. The use of Cardiac Troponin T (cTnT), a protein found in heart tissue, as a selective marker for cardiac muscle damage has shown great promise in the determination of PMI. An optimized conventional immunoassay method was developed to quantify intact and fragmented cTnT. A small sample of cardiac tissue, which is less affected than other tissues by external factors, was taken, homogenized, extracted with magnetic microparticles, separated by SDS-PAGE, and visualized with Western blot by probing with a monoclonal antibody against cTnT, followed by labeling and detection with available scanners. This conventional immunoassay provides proper detection and quantitation of the cTnT protein in cardiac tissue as a complex matrix; however, it does not provide the analyst with immediate results. Therefore, a competitive separation method using capillary electrophoresis with laser-induced fluorescence (CE-LIF) was developed to study the interaction between the human cTnT protein and a monoclonal anti-Troponin T antibody. Analysis of the results revealed a linear relationship between the percent of degraded cTnT and the log of the PMI, indicating that intact cTnT could be detected in human heart tissue up to 10 days postmortem at room temperature and beyond two weeks at 4°C. The data presented demonstrate that this technique can provide an extended time range during which PMI can be more accurately estimated compared to currently used methods.
The data demonstrate that this technique represents a major advance in time-of-death determination through a fast and reliable, semi-quantitative measurement of a biochemical marker from an organ protected from outside factors.
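The reported log-linear relationship suggests how a PMI estimate could be backed out of a measured degradation percentage by inverting the calibration line. A minimal sketch, with hypothetical slope and intercept (the dissertation's fitted values are not given here):

```python
# Hypothetical calibration: percent_degraded ≈ A * log10(PMI_hours) + B.
# A real analysis would fit A and B to calibration samples.
A, B = 35.0, 5.0

def estimate_pmi_hours(percent_degraded):
    """Invert the calibration line to estimate PMI in hours."""
    return 10 ** ((percent_degraded - B) / A)

# e.g. 75% degraded -> 10 ** ((75 - 5) / 35) = 10 ** 2 = 100 hours
```

In practice the estimate would carry the regression's confidence interval, which widens as degradation approaches completion.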
Abstract:
Drug targeting is an active area of research, and nano-scaled drug delivery systems hold tremendous potential for the treatment of neoplasms. In this study, a novel cyclodextrin (CD)-based nanoparticle drug delivery system was assembled and characterized for the therapy of folate receptor-positive [FR(+)] cancer. Water-soluble folic acid (FA)-conjugated CD carriers (FACDs) were successfully synthesized, and their structures were confirmed by 1D/2D nuclear magnetic resonance (NMR), matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF-MS), high-performance liquid chromatography (HPLC), Fourier transform infrared spectroscopy (FTIR), and circular dichroism. Drug complexes of adamantane (Ada) and cytotoxic doxorubicin (Dox) with FACD were readily obtained by mixed-solvent precipitation. The average size of FACD-Ada-Dox was 1.5–2.5 nm. The host-guest association constant Ka was 1,639 M−1 as determined by induced circular dichroism, and the hydrophilicity of the FACDs was greatly enhanced compared to unmodified CD. Cellular uptake and FR-binding competition experiments demonstrated efficient and preferentially targeted delivery of Dox into FR-positive tumor cells, and a sustained drug-release profile was seen in vitro. The delivery of Dox into FR(+) cancer cells via endocytosis was observed by confocal microscopy, and drug uptake of the targeted nanoparticles was 8-fold greater than that of non-targeted drug complexes. Our docking results suggest that FA, FACD, and FACD-Ada-Dox could bind human hedgehog-interacting protein, which contains an FR domain. Mouse cardiomyocytes as well as fibroblasts treated with FACD-Ada-Dox had significantly lower levels of reactive oxygen species, with increased glutathione content and glutathione peroxidase activity, indicating a reduced potential for Dox-induced cardiotoxicity.
These results indicate that the targeted drug complex possesses high drug association and sustained drug-release properties with good biocompatibility and physiological stability. The novel FA-conjugated β-CD-based drug complex may be promising as an anti-tumor treatment for FR(+) cancer.
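The reported association constant fixes how much guest stays complexed at given concentrations. A minimal sketch of the 1:1 host-guest mass balance, using the reported Ka of 1,639 M−1 but hypothetical concentrations:

```python
import math

# 1:1 binding: Ka = x / ((h0 - x) * (g0 - x)), where x is the complex
# concentration, h0/g0 the total host/guest concentrations (hypothetical).

def bound_guest_fraction(ka, h0, g0):
    """Solve the binding quadratic for x and return the bound fraction x/g0."""
    # Ka*x^2 - (Ka*(h0 + g0) + 1)*x + Ka*h0*g0 = 0; take the physical root.
    a, b, c = ka, -(ka * (h0 + g0) + 1.0), ka * h0 * g0
    x = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)
    return x / g0

frac = bound_guest_fraction(1639.0, 1e-3, 1e-3)  # 1 mM host and guest
```

At equimolar millimolar concentrations roughly half the guest is bound, which is consistent with a moderate-affinity, readily reversible (sustained-release) complex.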
Abstract:
As research into the dynamic characteristics of job performance across time has continued to accumulate, associated implications for performance appraisal have become evident. At present, several studies have demonstrated that systematic trends in job performance across time influence how performance is ultimately judged. However, little research has considered the processes by which the performance trend-performance rating relationship occurs. In the present study, I addressed this gap. Specifically, drawing on attribution theory, I proposed and tested a model whereby the performance trend-performance rating relationship occurs through attributions to ability and effort. The results of this study indicated that attributions to ability, but not effort, mediate the relationship between performance trend and performance ratings and that this relationship depends on attribution-related cues. Implications for performance appraisal research and theory are discussed.
Abstract:
Software architecture is the abstract design of a software system. It plays a key role as a bridge between requirements and implementation and serves as a blueprint for development. The architecture represents a set of early design decisions that are crucial to a system; mistakes in those decisions are very costly if they remain undetected until the system is implemented and deployed. This is where formal specification and analysis fit in. Formal specification ensures that an architecture design is represented in a rigorous and unambiguous way. Furthermore, a formally specified model allows the use of different analysis techniques for verifying the correctness of those crucial design decisions. This dissertation presented a framework, called SAM, for the formal specification and analysis of software architectures. In terms of specification, formalisms and mechanisms were identified and chosen to specify software architecture based on different analysis needs. Formalisms for specifying properties were also explored, especially non-functional properties. In terms of analysis, the dissertation explored both the verification of functional properties and the evaluation of non-functional properties of software architecture. For the verification of functional properties, methodologies were presented on how to apply existing model checking techniques to a SAM model. For the evaluation of non-functional properties, the dissertation first showed how to incorporate stochastic information into a SAM model, and then explained how to translate the model to existing tools and conduct the analysis using those tools. To alleviate the analysis work, we also provided a tool to automatically translate a SAM model for model checking. All the techniques and methods described in the dissertation were illustrated by examples or case studies, which also served the purpose of advocating the use of formal methods in practice.
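The functional-property verification step can be illustrated with the core loop of an explicit-state model checker: enumerate all reachable states of a model and check a safety property in each. This is a generic sketch, not the SAM toolchain itself (which translates models to existing model-checking tools); the toy counter model is hypothetical.

```python
from collections import deque

# Minimal explicit-state safety check: breadth-first search over the
# reachable states of a transition system, testing an invariant in each.

def check_invariant(initial, successors, invariant):
    """Return a reachable state violating the invariant, or None."""
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state  # counterexample state found
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None  # invariant holds in every reachable state

# Toy model: a counter component that must never exceed 3.
succ = lambda s: [s + 1] if s < 5 else []
violation = check_invariant(0, succ, lambda s: s <= 3)  # -> 4
```

Real model checkers add counterexample traces and state-compression techniques, but the reachability core is the same.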
Abstract:
Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built using Petri nets from user requirements and is formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial-order models are automatically extracted from instrumented concurrent program executions, and potential atomicity-violation bugs are automatically verified based on the partial-order models using model checking. Our formal specification and verification of Mondex have contributed to the worldwide effort to develop a verified software repository. Our method to automatically mine Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable, as it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the tradeoffs between precision and coverage. Based on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method to increase coverage while ensuring precision; and 2) a follow-up replaying method to further increase coverage. Both methods are implemented in a completely automatic tool.
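The pair-of-threads, single-variable focus can be illustrated with the classic unserializable-interleaving check: when a remote thread's access falls between two local accesses to the same shared variable, only certain read/write patterns can be reordered into a serial execution. This is a generic sketch of that check, not McPatom's partial-order model-checking approach:

```python
# The four unserializable (local, remote, local) access patterns on one
# shared variable: no serial order reproduces the values each access sees.
UNSERIALIZABLE = {
    ("r", "w", "r"),  # remote write splits two local reads
    ("w", "w", "r"),  # local read sees remote write, not the local one
    ("w", "r", "w"),  # remote read observes an intermediate local value
    ("r", "w", "w"),  # local read-then-write lost to a remote write
}

def is_atomicity_violation(local_first, remote, local_second):
    """Each argument is 'r' or 'w' on the same shared variable."""
    return (local_first, remote, local_second) in UNSERIALIZABLE

# A remote write between a local read and local write (a lost update):
# is_atomicity_violation("r", "w", "w") -> True
```

A predictive tool then searches the execution model for feasible interleavings that realize one of these patterns, rather than waiting for one to occur in a test run.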