957 results for Computer Prediction Program


Relevance:

30.00%

Publisher:

Abstract:

The Mini-Numerical Electromagnetic Code (MININEC) program, a PC-compatible version of the powerful NEC program, is used to design a new type of reduced-size antenna. The program's validity is first demonstrated by modelling simple, well-known antennas such as dipoles and monopoles. More complex geometries, such as folded dipoles and meander dipole antennas, are also analysed with the program. The final design geometry of a meander folded dipole is characterized with MININEC, yielding results that serve as the basis for the practical construction of the antenna. Finally, the laboratory work with a prototype antenna is described, and practical results are presented.
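As a rough illustration of the kind of first-pass sizing that typically precedes a wire-antenna simulation (a sketch, not part of the paper), the snippet below estimates the physical length of a half-wave dipole from a design frequency; the ~5% end-effect shortening factor and the example frequency are assumptions.

```python
# Half-wave dipole length estimate: a minimal sketch, not MININEC itself.
# The 0.95 end-effect shortening factor is a common rule of thumb (assumption).

C = 299_792_458.0  # speed of light, m/s

def half_wave_dipole_length(freq_hz: float, shortening: float = 0.95) -> float:
    """Approximate physical length (m) of a resonant half-wave dipole."""
    free_space_half_wave = C / (2.0 * freq_hz)
    return shortening * free_space_half_wave

if __name__ == "__main__":
    f = 146e6  # illustrative design frequency (assumption)
    print(f"Estimated dipole length at {f/1e6:.1f} MHz: "
          f"{half_wave_dipole_length(f):.3f} m")
```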

Relevance:

30.00%

Publisher:

Abstract:

Minimum Student Performance Standards in Computer Literacy and Science were passed by the Florida Legislature through the Educational Reform Act of 1983. This act mandated that all Florida high school graduates receive training in computer literacy, and schools and school systems were charged with determining the best methods to deliver this instruction to their students. The scope of this study is to evaluate one school's response to the state of Florida's computer literacy mandate. The study was conducted at Miami Palmetto Senior High School, located in Dade County, Florida. The school's administration chose to develop and implement a new program to comply with the state mandate: integrating computer literacy into the existing biology curriculum. The study evaluated the curriculum to determine whether computer literacy could be integrated successfully while meeting both the biology and computer literacy objectives. The findings showed no significant differences between the biology scores of students taking the integrated curriculum and those taking the traditional biology curriculum. Students in the integrated curriculum not only met the biology objectives as well as those in the traditional curriculum did, they also successfully completed the intended objectives for computer literacy. Two sets of objectives were completed in the integrated classes in the same amount of time used to complete one set of objectives in the traditional biology classes; the integrated curriculum was therefore the more efficient means of meeting the intended objectives of both biology and computer literacy.

Relevance:

30.00%

Publisher:

Abstract:

Brain-computer interfaces (BCI) have the potential to restore communication or control abilities in individuals with severe neuromuscular limitations, such as those with amyotrophic lateral sclerosis (ALS). The role of a BCI is to extract and decode relevant information that conveys a user's intent directly from brain electro-physiological signals and translate this information into executable commands to control external devices. However, the BCI decision-making process is error-prone due to noisy electro-physiological data, representing the classic problem of efficiently transmitting and receiving information via a noisy communication channel.

This research focuses on P300-based BCIs, which rely predominantly on event-related potentials (ERP) that are elicited as a function of a user's uncertainty regarding stimulus events, in either an acoustic or a visual oddball recognition task. The P300-based BCI system enables users to communicate messages from a set of choices by selecting a target character or icon that conveys a desired intent or action. P300-based BCIs have been widely researched as a communication alternative, especially for individuals with ALS, who represent a target BCI user population. For the P300-based BCI, repeated data measurements are required to enhance the low signal-to-noise ratio of the elicited ERPs embedded in electroencephalography (EEG) data, in order to improve the accuracy of the target character estimation process. As a result, BCIs are slower than other commercial assistive communication devices, which limits BCI adoption by their target user population. The goal of this research is to develop algorithms that take into account the physical limitations of the target BCI population to improve the efficiency of ERP-based spellers for real-world communication.

In this work, it is hypothesised that building adaptive capabilities into the BCI framework can potentially give the BCI system the flexibility to improve performance by adjusting system parameters in response to changing user inputs. The research in this work addresses three potential areas for improvement within the P300 speller framework: information optimisation, target character estimation and error correction. The visual interface and its operation control the method by which the ERPs are elicited through the presentation of stimulus events. The parameters of the stimulus presentation paradigm can be modified to modulate and enhance the elicited ERPs. A new stimulus presentation paradigm is developed in order to maximise the information content that is presented to the user by tuning stimulus paradigm parameters to positively affect performance. Internally, the BCI system determines the amount of data to collect and the method by which these data are processed to estimate the user's target character. Algorithms that exploit language information are developed to enhance the target character estimation process and to correct erroneous BCI selections. In addition, a new model-based method to predict BCI performance is developed, an approach which is independent of stimulus presentation paradigm and accounts for dynamic data collection. The studies presented in this work provide evidence that the proposed methods for incorporating adaptive strategies in the three areas have the potential to significantly improve BCI communication rates, and the proposed method for predicting BCI performance provides a reliable means to pre-assess BCI performance without extensive online testing.
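To make the target character estimation and dynamic data collection ideas concrete, here is a minimal sketch (not the thesis implementation) of Bayesian evidence accumulation for a P300 speller: classifier scores from successive stimulus flashes update a posterior over characters, and data collection stops once the posterior clears a confidence threshold. The Gaussian score model, the threshold, and all parameters are illustrative assumptions.

```python
# Bayesian stopping for a P300 speller: a minimal sketch under assumed
# Gaussian classifier-score distributions; not the thesis implementation.
import numpy as np

def update_posterior(log_post, flashed, score, mu1=1.0, mu0=0.0, sigma=1.0):
    """Update the log-posterior over characters after one flash.

    flashed: boolean array, True for characters in the flashed group.
    score:   classifier output for this flash.
    """
    # Log-likelihood of the score under "target flashed" vs "target not flashed".
    ll_target = -0.5 * ((score - mu1) / sigma) ** 2
    ll_nontarget = -0.5 * ((score - mu0) / sigma) ** 2
    log_post = log_post + np.where(flashed, ll_target, ll_nontarget)
    return log_post - np.logaddexp.reduce(log_post)  # renormalise

def spell_one_character(flash_stream, n_chars=36, threshold=0.95, max_flashes=120):
    """Accumulate evidence until one character's posterior exceeds threshold."""
    log_post = np.full(n_chars, -np.log(n_chars))  # uniform prior
    for i, (flashed, score) in enumerate(flash_stream):
        log_post = update_posterior(log_post, flashed, score)
        if np.exp(log_post.max()) >= threshold or i + 1 >= max_flashes:
            break
    return int(np.argmax(log_post)), float(np.exp(log_post.max()))

# Tiny demo with synthetic flashes where character index 7 is the target.
rng = np.random.default_rng(0)
def synthetic_flashes(target=7, n_chars=36):
    while True:
        flashed = rng.random(n_chars) < (6 / n_chars)  # random flash group
        score = rng.normal(1.0 if flashed[target] else 0.0, 1.0)
        yield flashed, score

char, conf = spell_one_character(synthetic_flashes())
print(f"selected character index {char} with posterior {conf:.2f}")
```

A language-model prior over characters could replace the uniform prior here, which is the flavour of the language-information idea described in the abstract.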

Relevance:

30.00%

Publisher:

Abstract:

Safeguarding organizations against opportunism and severe deception in computer-mediated communication (CMC) presents a major challenge to CIOs and IT managers. New insights into linguistic cues of deception derive from the speech acts innate to CMC. Applying automated text analysis to archival email exchanges in a CMC system as part of a reward program, we assess the ability of word use (micro-level), message development (macro-level), and intertextual exchange cues (meta-level) to detect severe deception by business partners. We empirically assess the predictive ability of our framework using an ordinal multilevel regression model. Results indicate that deceivers minimize the use of referencing and self-deprecation but include more superfluous descriptions and flattery. Deceitful channel partners also over-structure their arguments and rapidly mimic the linguistic style of the account manager across dyadic email exchanges. Thanks to its diagnostic value, the proposed framework can support firms' decision-making and guide the development of compliance monitoring systems.
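As an illustration of what micro-level word-use cues might look like in code (a sketch with hypothetical cue lexica, not the authors' instrument), the snippet below computes per-100-word rates for two illustrative cues and normalises by message length.

```python
# Micro-level linguistic cue extraction: a minimal sketch with hypothetical
# cue lexica; the paper's actual dictionaries and models are not reproduced.
import re

SELF_REFERENCES = {"i", "me", "my", "mine", "myself"}            # assumption
FLATTERY_TERMS = {"great", "excellent", "wonderful", "amazing"}  # assumption

def cue_features(message: str) -> dict:
    """Return per-100-word rates for two illustrative deception cues."""
    tokens = re.findall(r"[a-z']+", message.lower())
    n = max(len(tokens), 1)
    return {
        "self_reference_rate": 100.0 * sum(t in SELF_REFERENCES for t in tokens) / n,
        "flattery_rate": 100.0 * sum(t in FLATTERY_TERMS for t in tokens) / n,
        "word_count": len(tokens),
    }

print(cue_features("Your program is excellent and I am sure we exceeded targets."))
```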

Relevance:

30.00%

Publisher:

Abstract:

Background: Interventions to increase cooking skills (CS) and food skills (FS) as a route to improving overall diet are popular within public health. This study tested a comprehensive model of diet quality by assessing the influence of socio-demographic, knowledge- and psychological-related variables alongside perceived CS and FS abilities. The correspondence of two measures of diet quality further validated the Eating Choices Index (ECI) for use in quantitative research.
Methods: A cross-sectional survey was conducted in a quota-controlled, nationally representative sample of 1049 adults aged 20–60 years drawn from the Island of Ireland. Surveys were administered in participants' homes via computer-assisted personal interviewing (CAPI), assessing a range of socio-demographic, knowledge- and psychological-related variables alongside perceived CS and FS abilities. Regression models were used to model factors influencing diet quality. Correspondence between two measures of diet quality was assessed using chi-square tests and Pearson correlations.
Results: ECI score was significantly negatively correlated with DINE fat intake (r = -0.24, p < 0.001) and significantly positively correlated with DINE fibre intake (r = 0.38, p < 0.001), demonstrating good agreement between the measures. Males, younger respondents and those with no or few educational qualifications scored significantly lower on both CS and FS abilities. The relative influence of socio-demographic, knowledge and psychological variables and of CS and FS abilities on dietary outcomes varied, with regression models explaining 10–20 % of the variance in diet quality. CS ability showed the strongest relationship with saturated fat intake (β = -0.296, p < 0.001) and was a significant predictor of fibre intake (β = -0.113, p < 0.05), although not of healthy food choices (ECI) (β = 0.04, p > 0.05).
Conclusion: Greater CS and FS abilities may not lead directly to healthier dietary choices, given the myriad other factors implicated; however, CS appears to have differential influences on aspects of the diet, most notably in lowering saturated fat intake. Findings suggest that CS and FS should not be singular targets of interventions designed to improve diet; rather, targeting specific sub-groups of the population (e.g. males, younger adults, those with limited education) might be more fruitful. A greater understanding of the interaction of factors influencing cooking and food practices within the home is needed.
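For readers unfamiliar with the analysis style, here is a minimal sketch (on synthetic data, not the study's dataset) of two of the reported steps: a Pearson correlation between two diet-quality-style measures and an ordinary least squares regression of a dietary outcome on a predictor.

```python
# Correlation + regression workflow: a minimal sketch on synthetic data,
# not the survey data analysed in the study.
import numpy as np

rng = np.random.default_rng(0)
n = 200
cooking_skills = rng.normal(50, 10, n)                     # synthetic CS scores
sat_fat = 60 - 0.3 * cooking_skills + rng.normal(0, 8, n)  # synthetic outcome
eci = 0.2 * cooking_skills + rng.normal(0, 5, n)           # synthetic ECI scores

# Pearson correlation between two diet-quality-style measures.
r = np.corrcoef(eci, sat_fat)[0, 1]
print(f"Pearson r = {r:.2f}")

# OLS regression of saturated fat intake on CS ability (with intercept).
X = np.column_stack([np.ones(n), cooking_skills])
beta, *_ = np.linalg.lstsq(X, sat_fat, rcond=None)
print(f"intercept = {beta[0]:.2f}, CS slope = {beta[1]:.3f}")
```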

Relevance:

30.00%

Publisher:

Abstract:

The development of new learning models has been of great importance in recent years, with a focus on advances in the area of deep learning. Deep learning first rose to prominence in 2006 and has since become a major area of research in a number of disciplines. This paper examines the area of deep learning, presents its current limitations, and proposes a new idea for a fully integrated deep and dynamic probabilistic system. The new model will be applicable to a wide range of areas, focusing initially on medical image analysis, with the overall goal of using this approach for prediction in computer-based medical systems.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Master's)--University of Washington, 2016-08

Relevance:

30.00%

Publisher:

Abstract:

This report is the product of a first-year research project in the University Transportation Centers Program. This project was carried out by an interdisciplinary research team at The University of Iowa's Public Policy Center. The project developed a computerized system to support decisions on locating facilities that serve rural areas while minimizing transportation costs. The system integrates transportation databases with algorithms that specify efficient locations and allocate demand efficiently to service regions; the results of these algorithms are used interactively by decision makers. The authors developed documentation for the system so that others could apply it to estimate the transportation and route requirements of alternative locations and identify locations that meet certain criteria with the least cost. The system was developed and tested on two transportation-related problems in Iowa, and this report uses these applications to illustrate how the system can be used.
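The location-allocation step described above is commonly formulated as a p-median problem. The sketch below shows a simple greedy heuristic, an illustrative stand-in rather than the system's documented algorithms: it picks facility sites to minimise total demand-weighted travel distance and then allocates each demand point to its nearest chosen site. The distance matrix and demand weights are synthetic.

```python
# Greedy p-median heuristic: an illustrative sketch of location-allocation,
# not the documented system from the report.
import numpy as np

def greedy_p_median(dist, demand, p):
    """Pick p facility sites minimising total demand-weighted distance.

    dist:   (n_sites, n_demand) travel-distance matrix
    demand: (n_demand,) demand weights
    """
    chosen = []
    best = np.full(dist.shape[1], np.inf)  # distance to nearest chosen site
    for _ in range(p):
        # Cost of adding each candidate site, given sites already chosen.
        costs = (np.minimum(dist, best) * demand).sum(axis=1)
        costs[chosen] = np.inf  # do not re-pick an already chosen site
        site = int(np.argmin(costs))
        chosen.append(site)
        best = np.minimum(best, dist[site])
    allocation = dist[chosen].argmin(axis=0)  # nearest chosen site per demand point
    return chosen, allocation

rng = np.random.default_rng(1)
d = rng.uniform(1, 100, size=(10, 40))  # synthetic site-to-demand distances
w = rng.uniform(1, 5, size=40)          # synthetic demand weights
sites, alloc = greedy_p_median(d, w, p=3)
print("chosen sites:", sites)
```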

Relevance:

30.00%

Publisher:

Abstract:

The Property and Equipment Department has a central supply of automotive parts, tools, and maintenance supplies. This central supply is used to supply the repair shop and also to supply parts to the various field garages and all departments of the Commission. The old procedure involved keeping track manually of all of the parts, some 22,000 items. All records, billings, and re-order points were kept manually. Many times the re-order points were located by reaching into a bin and finding nothing there. Desiring to improve this situation, an inventory control system was established for use on the computer. A complete record of the supplies stored in the central warehouse was prepared, and this information was used to make a catalog. Each time an item is issued or received, it is processed through the inventory program. When the re-order point is reached, a notice is given to reorder. The procedure for taking inventory has been improved, and a voucher invoice is now prepared by the computer for all issues to departments. These are some of the many benefits that have been derived from this system.
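The reorder logic described here is straightforward to express in code. The sketch below is an illustrative reconstruction, not the original program; the item fields, part numbers, and quantities are assumptions.

```python
# Inventory reorder-point check: a minimal sketch of the described procedure;
# item fields and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Item:
    number: str
    description: str
    on_hand: int
    reorder_point: int
    reorder_quantity: int

def process_transaction(item: Item, qty: int, kind: str) -> None:
    """Apply an 'issue' or 'receipt' and print a reorder notice if needed."""
    item.on_hand += qty if kind == "receipt" else -qty
    if kind == "issue" and item.on_hand <= item.reorder_point:
        print(f"REORDER {item.number} ({item.description}): "
              f"on hand {item.on_hand}, order {item.reorder_quantity}")

part = Item("A-1042", "oil filter", on_hand=12, reorder_point=10, reorder_quantity=48)
process_transaction(part, 3, "issue")  # drops to 9 -> prints a reorder notice
```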

Relevance:

30.00%

Publisher:

Abstract:

The Data Processing Department of the ISHC has developed coding forms for the data to be entered into the program. The Highway Planning and Programming and Design Departments are responsible for coding and submitting the necessary data forms to Data Processing for noise prediction on highway sections.

Relevance:

30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-07

Relevance:

30.00%

Publisher:

Abstract:

Since at least the 1980s, a growing number of companies have set up an ethics or compliance program within their organization. However, in the field of business management, there is a paucity of research concerning these management systems, an observation that warranted the present investigation of one company's compliance program. Compliance programs are set up so that individuals working within an organization observe the laws and regulations that pertain to their work. This study used a constructivist grounded theory methodology to examine the process by which a specific compliance program, that of Siemens Canada Limited, was implemented throughout its organization. In keeping with this methodology, instead of proceeding in accordance with a particular theoretical framework, the study established a number of theoretical constructs used strictly as reference points. The study's research question was: what are the characteristics of the process by which Siemens' compliance program integrated itself into the existing organizational structure and gained employee acceptance? Data consisted of documents produced by the company and of interviews with twenty-four managers working for Siemens Canada Limited. The researcher used QSR-Nvivo computer-assisted software to code transcripts and to help analyze the interviews and documents. Triangulation was done by using a number of analysis techniques and by constantly comparing findings with extant theory. A descriptive model of the implementation process, grounded in the experience of participants and in the contents of the documents, emerged from the data. The process was called "Remolding," remolding being the core category that emerged. This main process consisted of two sub-processes identified as "embedding" and "appraising." The investigation provided a detailed account of the appraising process, identifying that employees appraised the compliance program according to three facets: the impact of the program on the employee's daily activities, the relationship employees have with the local compliance organization, and the relationship employees have with the corporate ethics identity. The study suggests that a company entertaining the idea of implementing a compliance program should consider all three facets. In particular, any company interested in designing and implementing a compliance program should pay close attention to its corporate ethics identity, because employees' acceptance of the program is influenced by their comparison of the company's ethics identity to their local ethics identity. The implications of the study suggest that personnel responsible for the development and organizational support of a compliance program should understand the appraisal process by which employees build their relationship with the program. The originality of this study is that it points emphatically to the need for companies to develop a corporate ethics identity that is coherent, well documented and well explained.

Relevance:

30.00%

Publisher:

Abstract:

We explore the relationships between the construction of a work of art and the crafting of a computer program in Java and suggest that the structure of paintings and drawings may be used to teach the fundamental concepts of computer programming. This movement "from Art to Science", using art to drive computing, complements the common use of computing to inform art. We report on initial experiences using this approach with undergraduate and postgraduate students. An embryonic theory of the correspondence between art and computing is presented and a methodology proposed to develop this project further.

Relevance:

30.00%

Publisher:

Abstract:

Through Law 12.715/2012, the Brazilian government instituted guidelines for a program named Inovar-Auto. In this context, energy efficiency became a survival requirement for the Brazilian automotive industry from September 2016. As set out in the law, energy efficiency is not calculated per model only; it is calculated over the whole universe of new vehicles registered. In this scenario, the composition of vehicles sold in the market becomes a key factor in each automaker's profits, and energy efficiency and its consequences should be taken into consideration in all their aspects. This raises the following question: what long-term efficiency curve allows an automaker to comply with the rules and balance its investment in technologies, increasing energy efficiency without hurting the competitiveness of its product lineup? Among the several variables to be considered, the analysis of manufacturing costs, customer value perception and market share stand out, characterizing this as a multi-criteria decision-making problem. To tackle the energy efficiency problem imposed by the legislation, this paper proposes a multi-criteria decision-making framework that combines a Delphi group and the Analytic Hierarchy Process to identify suitable alternatives for automakers to incorporate in the main Brazilian vehicle segments. A forecast model based on artificial neural networks was used to estimate vehicle sales demand and validate the expected results. The approach is demonstrated with a real case study using public vehicle sales data from Brazilian automakers and public energy efficiency data.
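To illustrate the Analytic Hierarchy Process step named above (a generic AHP sketch, not the paper's actual criteria or judgments), the code below derives priority weights from a pairwise comparison matrix via its principal eigenvector and reports a consistency ratio; the 3x3 judgment matrix is hypothetical.

```python
# AHP priority weights from a pairwise comparison matrix: a generic sketch;
# the 3x3 judgment matrix below is illustrative, not from the paper.
import numpy as np

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # random consistency indices

def ahp_weights(A):
    """Return (weights, consistency_ratio) for pairwise comparison matrix A."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w = w / w.sum()                    # principal eigenvector, normalised
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)  # consistency index
    return w, ci / RI[n]

# Criteria (hypothetical): manufacturing cost, customer value, market share.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```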

Relevance:

30.00%

Publisher:

Abstract:

Many-core systems are emerging from the need for more computational power and power efficiency. However, many issues still revolve around many-core systems: they need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computational systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than the cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip based processors the network may become congested and the cores may work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to 45X speedup compared to a serial fault simulation approach.

Many-core systems can draw enormous amounts of power, and if this power is not controlled properly, the system might get damaged. One way to manage power is to set a power budget for the system; but if this power is drawn by just a few of the many cores, those few cores get extremely hot and might get damaged. Due to the increase in power density, multiple thermal sensors are deployed on the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena. These factors lead to a situation where thermal sensor values drift from the nominal values, which necessitates efficient calibration techniques to be applied before the sensor values are used. In addition, cores in modern many-core systems support dynamic voltage and frequency scaling, and thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. In this thesis, a general-purpose, software-based auto-calibration approach is also proposed to calibrate thermal sensors across different voltage levels.
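As a generic illustration of the dynamic load balancing idea used in the fault-simulation part of this thesis (a sketch, not the Single-Chip Cloud Computer code), the snippet below uses a shared work queue so that faster workers naturally pull more tasks, which is how a runtime approach copes with cores running at different speeds.

```python
# Dynamic load balancing via a shared work queue: a generic sketch, not the
# Single-Chip Cloud Computer implementation from the thesis.
import queue
import random
import threading
import time

tasks: "queue.Queue[int]" = queue.Queue()
for t in range(40):
    tasks.put(t)  # e.g. batches of faults to simulate (illustrative)

done = {}

def worker(core_id: int, slowdown: float) -> None:
    """Pull tasks until the queue is empty; faster cores finish more tasks."""
    count = 0
    while True:
        try:
            tasks.get_nowait()
        except queue.Empty:
            break
        time.sleep(slowdown * random.uniform(0.001, 0.003))  # simulated work
        count += 1
    done[core_id] = count

threads = [threading.Thread(target=worker, args=(i, 1.0 + i)) for i in range(4)]
for th in threads:
    th.start()
for th in threads:
    th.join()
print("tasks completed per core:", done)
```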