28 results for Design time


Relevance: 30.00%

Publisher:

Abstract:

Catering to society's demand for high-performance computing, billions of transistors are now integrated on IC chips to deliver unprecedented performance. With increasing transistor density, power consumption and power density are growing exponentially. The increasing power consumption directly translates into high chip temperature, which not only raises packaging/cooling costs but also degrades the performance, reliability, and life span of computing systems. Moreover, high chip temperature greatly increases leakage power consumption, which is becoming more and more significant with the continuous scaling of transistor size. As the semiconductor industry continues to evolve, power and thermal challenges have become the most critical challenges in the design of new generations of computing systems.

In this dissertation, we addressed the power/thermal issues from the system-level perspective. Specifically, we sought to employ real-time scheduling methods to optimize the power/thermal efficiency of real-time computing systems, with the leakage/temperature dependency taken into consideration. In our research, we first explored the fundamental principles of how to employ dynamic voltage scaling (DVS) techniques to reduce the peak operating temperature when running a real-time application on a single-core platform. We then proposed a novel real-time scheduling method, “M-Oscillations,” to reduce the peak temperature when scheduling a hard real-time periodic task set. We also developed three checking methods to guarantee the feasibility of a periodic real-time schedule under a peak temperature constraint. We further extended our research from single-core to multi-core platforms: we investigated the energy estimation problem on multi-core platforms and developed a lightweight and accurate method to calculate the energy consumption for a given voltage schedule on a multi-core platform. Finally, we concluded the dissertation with an elaborated discussion of future extensions of our research.
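The single-core peak-temperature intuition can be sketched with a toy lumped-RC thermal model. This is my own illustration, not the dissertation's model: the cubic power law, the constants, and the two schedules below are all assumptions.

```python
# Toy lumped-RC thermal model (an illustration, not the dissertation's exact
# model): dT/dt = c*v**3 - b*(T - T_amb), where dynamic power scales roughly
# cubically with supply voltage under DVS. All constants are arbitrary.

def peak_temp(schedule, dt=0.001, t_amb=25.0, b=0.5, c=40.0):
    """Euler-integrate the thermal model over a (voltage, duration) schedule
    and return the peak temperature reached."""
    temp, peak = t_amb, t_amb
    for v, dur in schedule:
        for _ in range(int(dur / dt)):
            temp += dt * (c * v**3 - b * (temp - t_amb))
            peak = max(peak, temp)
    return peak

# Two schedules doing the same total work (4 s of execution at v = 1.0):
burst = [(1.0, 4.0), (0.0, 4.0)]            # one long burst, then idle
oscillating = [(1.0, 0.5), (0.0, 0.5)] * 4  # alternate short bursts and idles
```

Integrating both schedules shows the oscillating one reaching a lower peak temperature: splitting the same workload into short active/idle slices keeps the die near its time-average temperature, which is the intuition behind oscillating speed schedules.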

Abstract:

This dissertation introduces the design of a multimodal, adaptive real-time assistive system as an alternative human-computer interface for individuals with severe motor disabilities. The proposed design is based on the integration of a remote eye-gaze tracking system, voice recognition software, and a virtual keyboard. The methodology relies on a user profile that customizes eye-gaze tracking using neural networks. The user-profiling feature facilitates universal access to computing resources for a wide range of applications such as web browsing, email, word processing, and editing.

The study is significant in terms of the integration of key algorithms to yield an adaptable and multimodal interface. The contributions of this dissertation stem from the following accomplishments: (a) establishment of the data transport mechanism between the eye-gaze system and the host computer, yielding a failure rate as low as 0.9%; (b) accurate translation of eye data into cursor movement through successive steps that conclude with calibrated cursor coordinates using an improved conversion function, resulting in an average 70% reduction of the disparity between the point of gaze and the actual position of the mouse cursor compared with initial findings; (c) use of both a moving average and a trained neural network to minimize the jitter of the mouse cursor, yielding an average jitter reduction of 35%; (d) introduction of a new mathematical methodology to measure the degree of jitter in the mouse trajectory; and (e) embedding of an on-screen keyboard to facilitate text entry, along with a graphical interface used to generate user profiles for system adaptability.

The adaptive nature of the interface is achieved through user profiles, which may contain the jitter and voice characteristics of a particular user as well as a customized list of the most commonly used words, ordered according to the user's preference in alphabetical or statistical order. This allows the system to successfully provide the capability of interacting with a computer. Each time any of the subsystems is retrained, the accuracy of the interface response improves further.
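The moving-average stage of the jitter reduction described in (c) can be sketched as follows. The `GazeSmoother` class, the window size, the jitter metric, and the synthetic gaze data are all illustrative assumptions; the trained-neural-network stage is omitted.

```python
# Hypothetical sketch of a moving-average jitter filter for gaze-driven
# cursor control. All names and data are illustrative, not the dissertation's.
import random
from collections import deque

class GazeSmoother:
    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # sliding window of raw samples

    def update(self, x, y):
        """Add a raw gaze sample and return the smoothed cursor position."""
        self.samples.append((x, y))
        n = len(self.samples)
        return (sum(p[0] for p in self.samples) / n,
                sum(p[1] for p in self.samples) / n)

def jitter(points):
    """Mean absolute frame-to-frame displacement: a crude jitter measure."""
    return sum(abs(b[0] - a[0]) + abs(b[1] - a[1])
               for a, b in zip(points, points[1:])) / (len(points) - 1)

# Synthetic noisy gaze track around a fixation at (100, 100):
random.seed(0)
raw = [(100 + random.uniform(-5, 5), 100 + random.uniform(-5, 5))
       for _ in range(100)]
smoother = GazeSmoother(window=5)
smoothed = [smoother.update(x, y) for (x, y) in raw]
```

Averaging the last few samples trades a small amount of cursor lag for a substantially steadier trajectory, which is why it pairs naturally with a learned correction stage.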

Abstract:

College personnel are required to provide accommodations for students who are deaf and hard of hearing (D/HoH), but few empirical studies have been conducted on D/HoH students as they learn under the various accommodation conditions: sign language interpreting (SLI), real-time captioning (RTC), and both. Guided by the experiences of students who are D/HoH at Miami-Dade College (MDC) who requested RTC in addition to SLI as accommodations, the researcher adopted Mertens's transformative-emancipatory theoretical framework, which values the perceptions and voice of students who are D/HoH. A mixed-methods design addressed two research questions: Did student learning differ for each accommodation? What did students experience while learning through accommodations? Participants included 30 students who were D/HoH (60% women). They represented MDC's majority-minority population: 10% White (non-Hispanic), 20% Black (non-Hispanic, including Haitian/Caribbean), 67% Hispanic, and 3% other. Hearing loss ranged from severe-profound (70%) to mild-moderate (30%). All were able to communicate in American Sign Language. Learning was measured while students who were D/HoH viewed three lectures under the three accommodation conditions (SLI, RTC, SLI+RTC). The learning measure was defined as the difference in pre- and post-test scores on tests of the content presented in the lectures. In repeated-measures ANOVA and ANCOVA, the confounding variables of fluency in American Sign Language and literacy skills were treated as covariates. Perceptions were obtained through interviews and verbal protocol analysis that were signed, videotaped, transcribed, coded, and examined for common themes and metacognitive strategies. No statistically significant differences were found among the three accommodations on the learning measure.

Students who were D/HoH expressed thoughts about five different aspects of their learning while they viewed lectures: (a) comprehending the information, (b) feeling a part of the classroom environment, (c) past experiences with an accommodation, (d) individual preferences for an accommodation, and (e) suggestions for improving an accommodation. They exhibited three metacognitive strategies: (a) constructing knowledge, (b) monitoring comprehension, and (c) evaluating information. No patterns were found in the types of metacognitive strategies used for any particular accommodation. The researcher offers recommendations for flexible applications of the standard accommodations used with students who are D/HoH.

Abstract:

Background: Biologists often need to assess whether unfamiliar datasets warrant the time investment required for more detailed exploration. Basing such assessments on brief descriptions provided by data publishers is unwieldy for large datasets that contain insights dependent on specific scientific questions. Alternatively, using complex software systems for a preliminary analysis may be deemed too time-consuming in itself, especially for unfamiliar data types and formats. This may lead to wasted analysis time and the discarding of potentially useful data. Results: We present an exploration of the design opportunities that the Google Maps interface offers to biomedical data visualization. In particular, we focus on synergies between visualization techniques and Google Maps that facilitate the development of biological visualizations that have both low overhead and sufficient expressivity to support the exploration of data at multiple scales. The methods we explore rely on displaying pre-rendered visualizations of biological data in browsers, with sparse yet powerful interactions, using the Google Maps API. We structure our discussion around five visualizations: a gene co-regulation visualization, a heatmap viewer, a genome browser, a protein interaction network, and a planar visualization of white matter in the brain. Feedback from collaborative work with domain experts suggests that our Google Maps visualizations offer multiple, scale-dependent perspectives and can be particularly helpful for unfamiliar datasets due to their accessibility. We also find that users, particularly those less experienced with computer use, are attracted by the familiarity of the Google Maps interface. Our five implementations introduce design elements that can benefit visualization developers. Conclusions: We describe a low-overhead approach that lets biologists access readily analyzed views of unfamiliar scientific datasets.
We rely on pre-computed visualizations prepared by data experts, accompanied by sparse and intuitive interactions, and distributed via the familiar Google Maps framework. Our contributions are an evaluation demonstrating the validity and opportunities of this approach, a set of design guidelines benefiting those wanting to create such visualizations, and five concrete example visualizations.
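The multi-scale idea behind serving a pre-rendered visualization through a map interface can be sketched as follows: the image is cut into a pyramid of fixed-size tiles, one set per zoom level, and the viewer fetches only the tiles in view. The tile size and helper names below are illustrative, not the paper's code.

```python
# Sketch of map-style tiling for a large pre-rendered visualization image.
# The 256-px tile size matches common web-map conventions; the functions and
# the zoom/x/y addressing here are illustrative assumptions.
TILE = 256

def tiles_for_zoom(image_px, zoom, max_zoom):
    """Number of tiles along one axis at `zoom`.

    At max_zoom the image is shown at full resolution; each lower zoom
    level halves the displayed size.
    """
    scale = 2 ** (max_zoom - zoom)
    size = image_px // scale
    return max(1, -(-size // TILE))  # ceiling division, at least one tile

def tile_for_pixel(px, py, zoom, max_zoom):
    """Which tile (x, y) contains full-resolution pixel (px, py) at `zoom`."""
    scale = 2 ** (max_zoom - zoom)
    return (px // scale // TILE, py // scale // TILE)
```

For a hypothetical 65,536-pixel-wide heatmap rendered at `max_zoom = 8`, the whole image fits in one tile at zoom 0 and spans 256 tiles per axis at full zoom, which is what makes pan-and-zoom exploration cheap in the browser.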

Abstract:

For the past several decades, we have seen tremendous growth, in both scale and scope, of real-time embedded systems, thanks largely to advances in IC technology. However, the traditional approach of boosting performance by increasing CPU frequency has run its course, and researchers in both industry and academia are turning their focus to multi-core architectures for continued improvement of computing performance. In our research, we seek to develop efficient scheduling algorithms and analysis methods for the design of real-time embedded systems on multi-core platforms. Real-time systems are those in which response time is as critical as the logical correctness of the computational results. In addition, a variety of stringent constraints, such as power/energy consumption, peak temperature, and reliability, are imposed on these systems. Real-time scheduling therefore plays a critical role in the system-level design of such computing systems. We started our research by addressing timing constraints for real-time applications on multi-core platforms, and developed both partitioned and semi-partitioned scheduling algorithms to schedule fixed-priority, periodic, hard real-time tasks on multi-core platforms. We then extended our research by taking temperature constraints into consideration: we developed a closed-form solution to capture the temperature dynamics for a given periodic voltage schedule on multi-core platforms, and also developed three methods to check the feasibility of a periodic real-time schedule under a peak temperature constraint. We further extended our research by incorporating the power/energy constraint, with thermal awareness, into our research problem: we investigated the energy estimation problem on multi-core platforms and developed a computationally efficient method to calculate the energy consumption for a given voltage schedule on a multi-core platform. In this dissertation, we present our research in detail and demonstrate the effectiveness and efficiency of our approaches with extensive experimental results.
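A minimal sketch of the partitioned scheduling idea is first-fit decreasing assignment with the classic Liu and Layland rate-monotonic utilization bound as the per-core admission test. This pairing is my assumption for illustration; the dissertation's algorithms and feasibility analyses are more elaborate.

```python
# Sketch: partition periodic, fixed-priority, hard real-time tasks onto
# cores first-fit by decreasing utilization, admitting a task onto a core
# only if the rate-monotonic bound n*(2**(1/n) - 1) still holds there.

def rm_bound(n):
    """Liu & Layland utilization bound for n rate-monotonic tasks."""
    return n * (2 ** (1.0 / n) - 1)

def partition(tasks, num_cores):
    """tasks: list of (wcet, period). Returns per-core task lists, or None
    if some task cannot be placed on any core."""
    cores = [[] for _ in range(num_cores)]
    for wcet, period in sorted(tasks, key=lambda t: t[0] / t[1], reverse=True):
        for core in cores:
            util = sum(c / p for c, p in core) + wcet / period
            if util <= rm_bound(len(core) + 1):
                core.append((wcet, period))
                break
        else:
            return None  # no core can admit the task under the bound
    return cores
```

The bound is sufficient but not necessary, so a `None` result here only means this simple test failed; exact response-time analysis could still admit the task set.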

Abstract:

Toll plazas offer several payment types: manual, automatic coin machine, electronic, and mixed lanes. In places with high traffic flow, the presence of a toll plaza can cause significant congestion, creating a bottleneck for traffic flow unless the correct mix of payment types is in operation. The objective of this research is to determine the optimal lane configuration for the mix of payment methods so that waiting time in the queue at the toll plaza is minimized. A queuing model representing the toll plaza system and a nonlinear integer program were developed to determine the optimal mix. The numerical results show that waiting time at the toll plaza can be decreased by changing the lane configuration. For the case study developed, an improvement in waiting time as high as 96.37 percent was observed during the morning peak hour.
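The abstract does not specify the queuing model used, but a standard M/M/c queue (Erlang C) illustrates how mean waiting time at a plaza depends on the number of open lanes of a payment type. The arrival and service rates below are invented for illustration.

```python
# Standard M/M/c mean-queue-wait (Erlang C); an assumed stand-in for the
# paper's queuing model. Rates are per hour and purely illustrative.
from math import factorial

def mmc_wait(arrival_rate, service_rate, c):
    """Mean time spent waiting in queue (excluding service) in an M/M/c queue
    with c identical servers (lanes)."""
    rho = arrival_rate / (c * service_rate)
    if rho >= 1:
        return float("inf")              # unstable: the queue grows forever
    a = arrival_rate / service_rate      # offered load in Erlangs
    p0 = 1 / (sum(a**k / factorial(k) for k in range(c))
              + a**c / (factorial(c) * (1 - rho)))
    erlang_c = (a**c / (factorial(c) * (1 - rho))) * p0  # P(arrival waits)
    return erlang_c / (c * service_rate - arrival_rate)
```

With, say, 900 vehicles/hour arriving and a manual lane serving 350 vehicles/hour, two lanes are unstable, while adding a fourth lane to a three-lane configuration sharply cuts the mean wait; an integer program would search over such configurations across payment types.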

Abstract:

The purpose of this study was to evaluate the incidence of corrosion and fretting in 48 retrieved titanium-6aluminum-4vanadium (Ti-6Al-4V) and/or cobalt-chromium-molybdenum modular total hip prostheses with respect to alloy microstructure and design parameters. The results revealed vastly different performance for the wide array of microstructures examined. Severe corrosion/fretting was seen in 100% of as-cast, 24% of low-carbon wrought, 9% of high-carbon wrought, and 5% of solution heat-treated cobalt-chrome components. Severe corrosion/fretting was observed in 60% of Ti-6Al-4V components. Design features that allow for fluid entry and stagnation, amplification of contact pressure, and/or increased micromotion were also shown to play a role: 75% of prostheses with a high femoral head-trunnion offset exhibited poor performance, compared with 15% of those with a low offset. Large femoral heads (>32 mm) did not exhibit poor corrosion or fretting. Implantation time alone was not sufficient to cause poor performance; 54% of prostheses with more than 10 years in vivo demonstrated no or only mild corrosion/fretting.

Abstract:

The present research is carried out primarily from the viewpoint of space applications, where human lives may be endangered if people must work under such conditions. This work proposes to develop a one-degree-of-freedom (1-DOF) force-reflecting manual controller (FRMC) prototype for teleoperation, and to address the effects of the time delays commonly found in space applications, where control is accomplished via Earth-based control stations. To test the FRMC, a mobile robot (PPRK) and a slider-bar were developed and integrated with the 1-DOF FRMC. The software, developed in Visual Basic, is able to telecontrol any platform that uses an SV203 controller through the Internet, and it allows the remote system to send feedback information in the form of visual or force signals. Time-delay experiments were conducted on the platform, and the effects of time delay on the operation of the FRMC system have been studied and delineated.
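A toy discrete-time simulation shows why round-trip delay matters in such a telecontrol loop: the same proportional controller that converges smoothly with immediate feedback overshoots and oscillates when the measurement arrives late. The gains, delays, and slider model here are arbitrary illustrations, not the experimental values.

```python
# Illustrative proportional control of a slider position toward a target,
# with feedback delayed by a fixed number of steps (round-trip link delay).
# All parameters are invented for illustration.

def simulate(delay, gain=0.2, steps=200, target=1.0):
    """Return the position trajectory when the measurement used at step k
    is the position from `delay` steps earlier (0.0 before data arrives)."""
    xs = [0.0]
    for k in range(steps):
        measured = xs[k - delay] if k - delay >= 0 else 0.0
        xs.append(xs[-1] + gain * (target - measured))
    return xs

def overshoot(xs, target=1.0):
    """How far the trajectory exceeds the target, if at all."""
    return max(max(xs) - target, 0.0)
```

With `delay=0` the slider approaches the target monotonically; with even a modest delay it overshoots, because the controller keeps pushing on stale information. This is the behavior the time-delay experiments probe.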

Abstract:

With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. There are many types of attacks, and they fall into four main categories: Denial of Service (DoS) attacks, Probe attacks, User to Root (U2R) attacks, and Remote to Local (R2L) attacks. Within these categories, DoS and Probe attacks appear at high frequency over short periods of time when they target a system; they differ from normal traffic and can easily be separated from normal activities. On the contrary, U2R and R2L attacks are embedded in the data portions of packets and normally involve only a single connection, which makes it difficult to achieve satisfactory detection accuracy for these two attack types. Therefore, we focus on studying the ambiguity problem between normal activities and U2R/R2L attacks. The goal is to build a detection system that can accurately and quickly detect these two attacks. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to speed up detection. Features with poor ability to predict the signatures of attacks, and features inter-correlated with one or more other features, are considered redundant; such features are removed, and only indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method includes multiple feature-selecting intrusion detectors and a data mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to solve the ambiguity problem. The latter applies a data mining technique to automatically extract computer users' normal behavior from training network traffic data. The final decision is a combination of the outputs of the feature-selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces the detection time but also effectively detects U2R and R2L attacks that contain ambiguous information.
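The first-phase idea of dropping features that predict the label poorly or duplicate an already-kept feature can be sketched with plain Pearson correlations. The thresholds, feature names, and toy data below are mine, not the dissertation's.

```python
# Sketch of correlation-based feature selection in the spirit described:
# keep features well correlated with the class label and discard those
# strongly inter-correlated with a feature already kept.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def select_features(columns, label, min_rel=0.3, max_inter=0.9):
    """columns: {name: [values]}. Keep predictive, non-redundant features."""
    ranked = sorted(columns, key=lambda f: -abs(pearson(columns[f], label)))
    kept = []
    for f in ranked:
        if abs(pearson(columns[f], label)) < min_rel:
            continue  # poor predictor of the label
        if any(abs(pearson(columns[f], columns[g])) > max_inter for g in kept):
            continue  # redundant with a feature already kept
        kept.append(f)
    return kept

# Toy connection records: one predictive feature, one scaled duplicate,
# and one uncorrelated noise column.
label = [0, 0, 1, 1, 0, 1, 1, 0]
cols = {
    "duration": [0, 0, 1, 1, 0, 1, 1, 0],     # perfectly predictive
    "duration_ms": [0, 0, 2, 2, 0, 2, 2, 0],  # redundant scaled copy
    "noise": [1, 0, 1, 0, 1, 0, 1, 0],        # uncorrelated with the label
}
```

Running `select_features(cols, label)` keeps only the single informative feature, shrinking the space the second-phase detectors must examine.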

Abstract:

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our national highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. 
In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights on the design of AID for arterial street applications.
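The wavelet preprocessing can be illustrated with a one-level Haar DWT, the simplest case: the approximation coefficients smooth the raw detector series while the detail coefficients isolate abrupt changes such as an incident-induced speed drop. The dissertation does not state which wavelet was used; Haar and the toy speed series are assumptions here.

```python
# One-level Haar discrete wavelet transform, an assumed minimal stand-in
# for the DWT preprocessing described. Input length must be even.

def haar_dwt(signal):
    """Return (approximation, detail) coefficients for an even-length series."""
    s = 2 ** -0.5
    approx = [(a + b) * s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

# Toy 30-second speed readings (mph); an incident drops speed mid-series:
speeds = [55, 54, 56, 30, 29, 28, 30, 29]
approx, detail = haar_dwt(speeds)
```

The sharp speed drop produces one large detail coefficient while the approximation series stays smooth, giving the ANN a cleaner, change-highlighting representation of the raw detector data. (The original pair is exactly recoverable from each approximation/detail pair, so no information is lost.)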

Abstract:

The effective control of production activities in a dynamic job shop with predetermined resource allocation for all jobs entering the system is a unique environment found in the manufacturing industry. This thesis introduces a framework for an Internet-based, real-time shop floor control system for such a dynamic job shop environment. The system aims to maintain the schedule feasibility of all jobs entering the manufacturing system under any circumstances. It is capable of deciding how often manufacturing activities should be monitored to check for control decisions that need to be taken on the shop floor, and it provides the decision maker with real-time notifications, enabling the generation of feasible alternative solutions when a disturbance occurs on the shop floor. The control system is also capable of providing the customer with real-time access to the status of jobs on the shop floor. Communication between the controller, the user, and the customer takes place through a user-friendly, web-based GUI. The proposed control system architecture and the interface for the communication system have been designed, developed, and implemented.
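One way such a controller could decide how often to monitor the shop floor is a slack-based check: a job stays feasible while its remaining processing time fits before its due date, and the tightest slack bounds how long the floor can go unmonitored. The sketch below is illustrative; the field names, floor value, and policy are not from the thesis.

```python
# Hypothetical slack-based feasibility check and monitoring-interval rule
# for a shop floor controller. All names and values are illustrative.
import dataclasses

@dataclasses.dataclass
class Job:
    name: str
    remaining_proc: float  # processing time still required (hours)
    due: float             # due date (hours from now)

def slack(job):
    return job.due - job.remaining_proc

def infeasible_jobs(jobs):
    """Jobs that can no longer meet their due dates without intervention."""
    return [j.name for j in jobs if slack(j) < 0]

def next_monitor_interval(jobs, floor=0.25):
    """Re-check the shop floor no later than the smallest job slack,
    but never less often than a fixed minimum interval."""
    return max(floor, min((slack(j) for j in jobs), default=float("inf")))

jobs = [Job("A", 3.0, 8.0), Job("B", 5.0, 6.0), Job("C", 4.0, 3.0)]
```

Here job C has negative slack and would trigger a real-time notification, while the one-hour slack of job B sets the next monitoring point.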

Abstract:

This project will create a blueprint for an eco-settlement in the South Florida area that will house a total of ten people on a five-acre plot of land. Using a combination of permaculture design principles and simple technological solutions, the settlement will be capable of supporting its inhabitants “off the grid,” with no reliance on outside public utilities. More specifically, the settlement will produce a minimum of 80% of its resources on site using only all-natural farming techniques, proper water management, companion planting, and various other methods. Instead of relying on modern agricultural practices that damage the land in the long term, all of the principles used in this design will allow the land to heal over time while still producing most of what its inhabitants need. This project is significant because all of the principles used are time-tested and capable of adapting to any type of environment, and they do not have a steep learning curve: anyone from a kindergartener to a 90-year-old can learn and teach the required skill sets in as little as one week. By contrast, to fully utilize GMO crops, a scientist must first spend a minimum of eight years earning degrees in various fields of study just to reach a textbook understanding of the science, and several more years must be spent in sterile lab environments to see the results of complex experiments that may never be used, wasting time, money, and resources. I am certain that this will be successful because similar goals have already been reached by people around the world without access to modern utilities. The only major difference is that my approach will be documented scientifically, showing that this way of life is not only healthy but also easy and practical.

Because of this, my project will bring permaculture design into the spotlight and will hopefully see widespread adoption. This will result in cities designed with the intent to live with nature instead of conquering it, and will hopefully aid the earth in healing for future generations.

Abstract:

Research into the dynamicity of job performance criteria has found evidence suggesting the presence of rank-order changes to job performance scores across time as well as intraindividual trajectories in job performance scores across time. These findings have influenced a large body of research into (a) the dynamicity of validities of individual differences predictors of job performance and (b) the relationship between individual differences predictors of job performance and intraindividual trajectories of job performance. In the present dissertation, I addressed these issues within the context of the Five Factor Model of personality. The Five Factor Model is arranged hierarchically, with five broad higher-order factors subsuming a number of more narrowly tailored personality facets. Researchers have debated the relative merits of broad versus narrow traits for predicting job performance, but the entire body of research has addressed the issue from a static perspective, by examining the relative magnitude of validities of global factors versus their facets. While research along these lines has been enlightening, theoretical perspectives suggest that the validities of global factors versus their facets may differ in their stability across time. Thus, research is needed to not only compare the relative magnitude of validities of global factors versus their facets at a single point in time, but also to compare the relative stability of validities of global factors versus their facets across time. Research into broad versus narrow traits for predicting intraindividual performance trajectories is also necessary to advance cumulative knowledge concerning such trajectories. In the present dissertation, I addressed these issues using a four-year longitudinal design. The results indicated that the validities of global conscientiousness were stable across time, while the validities of conscientiousness facets were more likely to fluctuate.
However, the validities of emotional stability and extraversion facets were no more likely to fluctuate across time than those of the factors. Finally, while some personality factors and facets predicted performance intercepts (i.e., performance at the first measurement occasion), my results failed to indicate a significant effect of any personality variable on performance growth. Implications for research and practice are discussed.