965 results for PC-TARE (Computer program)
Abstract:
This report is the product of a first-year research project in the University Transportation Centers Program. The project was carried out by an interdisciplinary research team at The University of Iowa's Public Policy Center. It developed a computerized system to support decisions on locating facilities that serve rural areas while minimizing transportation costs. The system integrates transportation databases with algorithms that identify efficient locations and allocate demand to service regions; decision makers use the results of these algorithms interactively. The authors developed documentation for the system so that others could apply it to estimate the transportation and route requirements of alternative locations and identify locations that meet certain criteria at the least cost. The system was developed and tested on two transportation-related problems in Iowa, and this report uses these applications to illustrate how the system can be used.
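The report's own location-allocation algorithms are not reproduced in the abstract, so the following is only a minimal sketch, assuming a simple greedy p-median heuristic: open facility sites one at a time to minimize total demand-weighted transportation cost, then allocate each demand point to its cheapest open facility. All names and data here are hypothetical.

```python
# Illustrative sketch only: a greedy p-median heuristic for locating facilities
# and allocating demand to them. Not the report's actual system; all inputs are hypothetical.
def greedy_p_median(cost, demand, p):
    """cost[i][j]: transport cost from demand point i to candidate site j;
    demand[i]: demand weight at point i; p: number of facilities to open."""
    n_points, n_sites = len(cost), len(cost[0])
    open_sites = []
    for _ in range(p):
        best_site, best_total = None, float("inf")
        for j in range(n_sites):
            if j in open_sites:
                continue
            trial = open_sites + [j]
            total = sum(demand[i] * min(cost[i][k] for k in trial)
                        for i in range(n_points))
            if total < best_total:
                best_site, best_total = j, total
        open_sites.append(best_site)
    # Allocate each demand point to its cheapest open facility.
    allocation = [min(open_sites, key=lambda k: cost[i][k]) for i in range(n_points)]
    return open_sites, allocation, best_total

# Toy usage: 3 demand points, 2 candidate sites, open 1 facility.
cost = [[4, 9], [7, 3], [5, 5]]
sites, alloc, total = greedy_p_median(cost, demand=[10, 20, 5], p=1)
```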
Abstract:
The Property and Equipment Department has a central supply of automotive parts, tools, and maintenance supplies. This central supply serves the repair shop and also supplies parts to the various field garages and all departments of the Commission. The old procedure involved manually tracking all of the parts, some 22,000 items. All records, billings, and re-order points were kept manually. Many times the re-order point was discovered only by reaching into a bin and finding nothing there. To improve this situation, an inventory control system was established for use on the computer. A complete record of the supplies stored in the central warehouse was prepared, and this information was used to make a catalog. Each time an item is issued or received, it is processed through the inventory program. When the re-order point is reached, a notice to reorder is given. The procedure for taking inventory has been improved. A voucher invoice is now prepared by the computer for all issues to departments. These are some of the many benefits that have been derived from this system.
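As a rough illustration of the re-order logic described above (not the Commission's actual program), a minimal sketch might process each issue or receipt against an item record and print a re-order notice once the on-hand quantity falls to the re-order point; the item numbers, quantities, and field names below are hypothetical.

```python
# Minimal sketch of the re-order notice logic described above; all records are invented.
inventory = {
    "PART-1001": {"on_hand": 40, "reorder_point": 25, "reorder_qty": 100},
}

def process_transaction(item_no, qty, kind):
    """Apply an issue or receipt and report when the re-order point is reached."""
    rec = inventory[item_no]
    rec["on_hand"] += qty if kind == "receipt" else -qty
    if rec["on_hand"] <= rec["reorder_point"]:
        print(f"RE-ORDER NOTICE: {item_no} on hand {rec['on_hand']}, "
              f"suggest ordering {rec['reorder_qty']}")

process_transaction("PART-1001", 20, "issue")   # drops stock to 20 -> notice printed
```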
Abstract:
Since at least the 1980s, a growing number of companies have set up an ethics or compliance program within their organization. However, in the field of business management research, there is a paucity of studies concerning these management systems. This observation warranted the present investigation of one company's compliance program. Compliance programs are set up so that individuals working within an organization observe the laws and regulations that pertain to their work. This study used a constructivist grounded theory methodology to examine the process by which a specific compliance program, that of Siemens Canada Limited, was implemented throughout its organization. In keeping with this methodology, instead of proceeding with the investigation according to a particular theoretical framework, the study established a number of theoretical constructs used strictly as reference points. The study's research question was: what are the characteristics of the process by which Siemens' compliance program integrated itself into the existing organizational structure and gained employee acceptance? Data consisted of documents produced by the company and of interviews conducted with twenty-four managers working for Siemens Canada Limited. The researcher used QSR NVivo software to code transcripts and to help analyze interviews and documents. Triangulation was achieved by using a number of analysis techniques and by constantly comparing findings with extant theory. A descriptive model of the implementation process, grounded in the experience of participants and in the contents of the documents, emerged from the data. The process was called "Remolding," remolding being the core category that emerged. This main process consisted of two sub-processes identified as "embedding" and "appraising." The investigation was able to provide a detailed account of the appraising process. It identified that employees appraised the compliance program according to three facets: the impact of the program on the employee's daily activities, the relationship employees have with the local compliance organization, and the relationship employees have with the corporate ethics identity. The study suggests that a company considering the implementation of a compliance program should take all three facets into account. In particular, it suggests that any company interested in designing and implementing a compliance program should pay particular attention to its corporate ethics identity, because employees' acceptance of the program is influenced by their comparison of the company's ethics identity to their local ethics identity. The findings suggest that personnel responsible for the development and organizational support of a compliance program should understand the appraisal process by which employees build their relationship with the program. The originality of this study is that it emphasizes that companies must pay special attention to developing a corporate ethics identity that is coherent, well documented, and well explained.
Abstract:
Many-core systems are emerging from the need for more computational power and greater power efficiency. However, many issues still surround many-core systems. These systems need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computational systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than the cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip based processors the network might become congested and the cores might work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to 45X speedup compared to a serial fault simulation approach. Many-core systems can draw enormous amounts of power, and if this power is not controlled properly, the system might be damaged. One way to manage power is to set a power budget for the system. But if this power is drawn by just a few of the many cores, those few cores become extremely hot and might be damaged. Due to the increase in power density, multiple thermal sensors are deployed on the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors lead to a situation where thermal sensor values drift from the nominal values. This necessitates efficient calibration techniques to be applied before the sensor values are used. In addition, in modern many-core systems cores have support for dynamic voltage and frequency scaling. Thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. This thesis therefore also proposes a general-purpose, software-based auto-calibration approach for thermal sensors across different voltage levels.
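The thesis's SCC-specific distribution scheme is not described in the abstract; the sketch below only illustrates the general idea of dynamic load balancing, using a shared work pool that hands out small chunks of faults on demand so that faster workers keep pulling new work instead of idling. The fault list and simulate_fault() are hypothetical placeholders.

```python
# Illustrative sketch only: dynamic load balancing via a shared work queue,
# standing in for the thesis's SCC-specific fault-distribution scheme.
from multiprocessing import Pool

def simulate_fault(fault_id):
    # Placeholder for simulating one fault; real per-fault work varies widely,
    # which is why a static partitioning would leave some cores idle.
    return fault_id, fault_id % 7 == 0   # pretend some faults are detected

if __name__ == "__main__":
    faults = range(10_000)
    with Pool(processes=48) as pool:      # 48 workers, mirroring the 48-core SCC
        # imap_unordered hands out small chunks on demand, so fast workers
        # keep pulling new faults instead of waiting for slow ones.
        results = list(pool.imap_unordered(simulate_fault, faults, chunksize=16))
    detected = sum(hit for _, hit in results)
    print(f"simulated {len(results)} faults, {detected} detected")
```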
Abstract:
The protein folding problem has been one of the most challenging subjects in biological physics due to its complexity. Energy landscape theory based on statistical mechanics provides a thermodynamic interpretation of the protein folding process. We have been working to answer fundamental questions about protein-protein and protein-water interactions, which are very important for describing the energy landscape surface of proteins correctly. First, we present a new method for computing protein-protein interaction potentials of solvated proteins directly from SAXS data. An ensemble of proteins was modeled by Metropolis Monte Carlo and Molecular Dynamics simulations, and the global X-ray scattering of the whole model ensemble was computed at each snapshot of the simulation. The interaction potential model was optimized iteratively by a Levenberg-Marquardt algorithm. Secondly, we report that terahertz spectroscopy directly probes hydration dynamics around proteins and determines the size of the dynamical hydration shell. We also present the sequence and pH dependence of the hydration shell and the effect of hydrophobicity. In addition, kinetic terahertz absorption (KITA) spectroscopy is introduced to study the refolding kinetics of ubiquitin and its mutants. KITA results are compared to small-angle X-ray scattering, tryptophan fluorescence, and circular dichroism results. We propose that KITA monitors the rearrangement of hydrogen bonding during secondary structure formation. Finally, we present the development of the automated single molecule operating system (ASMOS) for a high-throughput single molecule detector, which levitates a single protein molecule in a 10 µm diameter droplet by laser guidance. I have also performed supporting calculations and simulations with my own program code.
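As a hedged illustration of the fitting step only (not the authors' ensemble-averaged SAXS computation), a Levenberg-Marquardt optimization of a toy intensity model against synthetic data might look like the following; the model form, parameters, and data are invented for the example.

```python
# Minimal sketch of fitting a parameterized intensity model to scattering data
# with a Levenberg-Marquardt solver; everything here is a synthetic stand-in.
import numpy as np
from scipy.optimize import least_squares

q = np.linspace(0.01, 0.5, 200)                      # scattering vector grid
def model_intensity(params, q):
    a, b = params                                    # hypothetical potential parameters
    return a * np.exp(-(b * q) ** 2)                 # toy intensity model

observed = model_intensity([1.0, 4.0], q) + 0.01 * np.random.default_rng(0).normal(size=q.size)

def residuals(params):
    return model_intensity(params, q) - observed

fit = least_squares(residuals, x0=[0.5, 1.0], method="lm")   # Levenberg-Marquardt
print("fitted parameters:", fit.x)
```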
Abstract:
This article introduces a theoretical framework for the analysis of the player character (PC) in offline computer role-playing games (cRPGs). It derives from the assumption that the character constitutes the focal point of the game, around which all the other elements revolve. This underlying observation became the foundation of the Player Character Grid and its constituent Pivot Player Character Model, a conceptual framework illustrating the experience of gameplay as perceived through the PC's eyes. Although video game characters have been scrutinised from many different perspectives, a systematic framework has not yet been introduced. This study aims to fill that void by proposing a model replicable across the cRPG genre. It has been largely inspired by Anne Ubersfeld's semiological research on the dramatic character, presented in Reading Theatre I (1999), and is demonstrated with reference to The Witcher (CD Projekt RED 2007).
Abstract:
The exponential increase in clinical research has profoundly changed medical sciences. Evidence that has accumulated in the past three decades from clinical trials has led to the proposal that clinical care should not be based solely on clinical expertise and patient values, and should integrate robust data from systematic research. As a consequence, clinical research has become more complex, methods have become more rigorous, and evidence is usually not easily translated into clinical practice. Therefore, the instruction of clinical research methods for scientists and clinicians must adapt to this new reality. To address this challenge, a global distance-learning clinical research training program was developed, based on collaborative learning, the pedagogical goal of which was to develop critical thinking skills in clinical research. We describe and analyze the challenges of this course and possible solutions after 5 years of experience (2008-2012) with the program. Through evaluation by students and faculty, we identified and reviewed the following challenges of our program: 1) student engagement and motivation, 2) impact of a heterogeneous audience on learning, 3) learning in large groups, 4) enhancing group learning, 5) enhancing social presence, 6) dropouts, 7) quality control, and 8) course management. We discuss these issues and potential alternatives with regard to our research and background.
Abstract:
The big data era has dramatically transformed our lives; however, security incidents such as data breaches can put sensitive data (e.g. photos, identities, genomes) at risk. To protect users' data privacy, there is a growing interest in building secure cloud computing systems, which keep sensitive data inputs hidden, even from computation providers. Conceptually, secure cloud computing systems leverage cryptographic techniques (e.g. secure multiparty computation) and trusted hardware (e.g. secure processors) to instantiate a "secure" abstract machine consisting of a CPU and encrypted memory, so that an adversary cannot learn information through either the computation within the CPU or the data in the memory. Unfortunately, evidence has shown that side channels (e.g. memory accesses, timing, and termination) in such a "secure" abstract machine may leak highly sensitive information, including the cryptographic keys that form the root of trust for these secure systems. This thesis broadly expands the investigation of a research direction called trace oblivious computation, in which programming language techniques are employed to prevent side-channel information leakage. We demonstrate the feasibility of trace oblivious computation by formalizing and building several systems: GhostRider, a hardware-software co-design that provides a hardware-based trace oblivious computing solution; SCVM, an automatic RAM-model secure computation system; and ObliVM, a programming framework that helps programmers develop applications. All of these systems enjoy formal security guarantees while demonstrating better performance than prior systems, by one to several orders of magnitude.
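None of these systems is reproduced here; the toy sketch below only illustrates the underlying data-oblivious idea, namely that the sequence of memory accesses should not depend on secret values, by contrasting a leaky array read with one that touches every element regardless of the secret index.

```python
# Toy illustration of the data-oblivious idea behind these systems
# (not their actual mechanisms): make the access pattern independent of the secret.
def leaky_read(arr, secret_index):
    return arr[secret_index]            # access pattern reveals secret_index

def oblivious_read(arr, secret_index):
    # Touch every element and select the wanted one arithmetically,
    # so the sequence of accesses is the same for every secret_index.
    result = 0
    for i, value in enumerate(arr):
        match = int(i == secret_index)  # real designs also avoid secret-dependent branches
        result = match * value + (1 - match) * result
    return result

data = [10, 20, 30, 40]
assert oblivious_read(data, 2) == leaky_read(data, 2) == 30
```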
Abstract:
Part 1: Introduction
Mining and Verification of Temporal Events with Applications in Computer Micro-Architecture Research
Abstract:
Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is First-Order Logic Constraint Specification Language (FOLCSL) that enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves the computer architecture research and verification processes as shown by the case studies and experiments that have been conducted.
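The FOLCSL translator itself is not shown in the abstract; the following hand-written sketch merely illustrates the kind of checker it is described as synthesizing, reading an event trace and reporting violations of a single invariant. The trace format and the invariant (every issue is eventually followed by a matching commit) are hypothetical.

```python
# Hand-written sketch of the kind of trace checker a specification translator
# could synthesize; the event format and invariant below are invented.
def check_trace(events):
    pending = set()
    violations = []
    for time, kind, tag in events:
        if kind == "issue":
            pending.add(tag)
        elif kind == "commit":
            if tag not in pending:
                violations.append((time, f"commit of {tag} without prior issue"))
            pending.discard(tag)
    violations.extend((None, f"{tag} issued but never committed") for tag in pending)
    return violations

trace = [(1, "issue", "op7"), (2, "issue", "op8"), (5, "commit", "op7")]
print(check_trace(trace))    # reports op8 as never committed
```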
Abstract:
A purpose of this research study was to demonstrate the practical linguistic study and evaluation of dissertations by using two examples of the latest technology, the microcomputer and the optical scanner. This involved developing efficient methods for data entry and creating computer algorithms appropriate for personal linguistic studies. The goal was to develop a prototype investigation demonstrating practical solutions for maximizing the linguistic potential of the dissertation database. The mode of text entry was a Dest PC Scan 1000 Optical Scanner. The function of the optical scanner was to copy the complete stack of educational dissertations from the Florida Atlantic University Library into an IBM XT microcomputer. The optical scanner demonstrated its practical value by copying 15,900 pages of dissertation text directly into the microcomputer. A total of 199 dissertations, or 72% of the entire stack of education dissertations (277), were successfully copied into the microcomputer's word processor, where each dissertation was analyzed for a variety of syntax frequencies. The results of the study demonstrated the practical use of the optical scanner for data entry, the microcomputer for data and statistical analysis, and the availability of the college library as a natural setting for text studies. A supplemental benefit was the establishment of a computerized dissertation corpus which could be used for future research and study. The final step was to build a linguistic model of the differences in dissertation writing styles by creating 7 factors from 55 dependent variables through principal components factor analysis. The 7 factors (textual components) were then named and described on a hypothetical construct defined as a continuum from a conversational, interactional style to a formal, academic writing style. The 7 factors were then grouped through discriminant analysis to create discriminant functions for each of the 7 independent variables. The results indicated that a conversational, interactional writing style was associated with more recent dissertations (1972-1987), an increase in author's age, females, and the department of Curriculum and Instruction. A formal, academic writing style was associated with older dissertations (1972-1987), younger authors, males, and the department of Administration and Supervision. It was concluded that there were no significant differences in writing style due to subject matter (community college studies) compared to other subject matter. It was also concluded that there were no significant differences in writing style due to the location of dissertation origin (Florida Atlantic University, University of Central Florida, Florida International University).
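As an illustration of the two-stage analysis described, and not the study's actual data or software, a sketch with synthetic data might reduce the 55 syntax-frequency variables to 7 factors and then fit discriminant functions on the factor scores; sklearn's FactorAnalysis stands in here for the principal components factor analysis the study used.

```python
# Illustrative sketch with synthetic data: factor reduction followed by
# discriminant analysis on the factor scores. All numbers and labels are made up.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
X = rng.normal(size=(199, 55))             # 199 dissertations x 55 syntax-frequency variables
department = rng.integers(0, 2, size=199)  # e.g. 0 = Curriculum & Instruction, 1 = Administration

factors = FactorAnalysis(n_components=7, random_state=1).fit_transform(X)  # 7 textual components
lda = LinearDiscriminantAnalysis().fit(factors, department)
print("classification accuracy on factor scores:", lda.score(factors, department))
```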
Abstract:
Currently, at the SC Commission for the Blind, there is no opportunity for computer training for adults in the Older Blind Program. The Older Blind Program has to look for outside partners to make this service viable again. This project proposes that the OB Program partner with regional senior and recreation centers to establish a community-based training program that is both effective and of minimal cost to the agency and to the partnering centers.
Abstract:
To investigate the effects of a specific protocol of undulatory physical resistance training on maximal strength gains in elderly type 2 diabetics. The study included 48 subjects, aged between 60 and 85 years, of both genders. They were divided into two groups: Untrained Diabetic Elderly (n=19), who were not subjected to physical training, and Trained Diabetic Elderly (n=29), who underwent undulatory physical resistance training. The participants were evaluated on several types of resistance training equipment before and after the training protocol, using the one-repetition maximum test. The subjects trained with undulatory resistance three times per week for a period of 16 weeks. The overload used in undulatory resistance training was equivalent to 50% and 70% of the one-repetition maximum, alternating weekly. Statistical analysis revealed significant differences (p<0.05) between pre-test and post-test over the 16-week period. The average gains in strength were 43.20% (knee extension), 65.00% (knee flexion), 27.80% (supine sitting machine), 31.00% (rowing sitting), 43.90% (biceps pulley), and 21.10% (triceps pulley). Undulatory resistance training used with different weekly overloads was effective in providing significant gains in maximum strength in elderly type 2 diabetic individuals.
Abstract:
Ropivacaine (RVC) is an aminoamide local anesthetic widely used in surgical procedures. Studies with RVC encapsulated in liposomes and complexed in cyclodextrins have shown good results, but in order to use RVC for lengthy procedures and during the postoperative period, a still more prolonged anesthetic effect is required. This study therefore aimed to provide extended RVC release and increased loading using modified liposomes. Three types of vesicles were studied: (i) large multilamellar vesicles (LMV), (ii) large multivesicular vesicles (LMVV), and (iii) large unilamellar vesicles (LUV), prepared with egg phosphatidylcholine/cholesterol/α-tocopherol (4:3:0.07 mol%) at pH 7.4. Ionic gradient liposomes (inside: pH 5.5, pH 5.5 + (NH4)2SO4, and pH 7.4 + (NH4)2SO4) were prepared and showed improved RVC loading compared to conventional liposomes (inside: pH 7.4). A high-performance liquid chromatography analytical method was validated for RVC quantification. The liposomes were characterized in terms of their size, zeta potential, polydispersity, morphology, RVC encapsulation efficiency (EE(%)), and in vitro RVC release. LMVV liposomes provided better performance than LMV or LUV. The best formulations were prepared using pH 5.5 (LMVV 5.5in) or pH 7.4 with 250 mM (NH4)2SO4 in the inner aqueous core (LMVV 7.4in + ammonium sulfate), enabling encapsulation of as much as 2% RVC, with high uptake (EE(%) ∼70%) and sustained release (∼25 h). The encapsulation of RVC in ionic gradient liposomes significantly extended the duration of release of the anesthetic, showing that this strategy could be a viable means of promoting longer-term anesthesia during surgical procedures and during the postoperative period.