832 results for video as a research tool
Abstract:
Tool life is an important factor to consider when optimising a machining process, since cutting parameters can be adjusted to optimise tool changing, reducing production cost and time. The performance of a tool is also directly linked to the generated surface roughness, which matters in cases with strict surface quality requirements. The prediction of tool life and the resulting surface roughness in milling operations has attracted considerable research effort. The research reported herein focuses on defining the influence of milling cutting parameters such as cutting speed, feed rate and axial depth of cut on three major tool performance parameters, namely tool life, material removal and surface roughness. The research seeks to define methods that will allow the selection of optimal parameters for best tool performance when face milling 416 stainless steel bars. For this study the Taguchi method was applied in a special design of an orthogonal array that allows the entire parameter space to be studied with only a small number of experiments, saving experimental cost and time. The findings were that cutting speed has the most influence on tool life and surface roughness and very limited influence on material removal. Finally, tool performance can be judged either from tool life or from the volume of material removed.
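The Taguchi analysis described above rates each parameter setting with a signal-to-noise ratio; for a response such as tool life, where more is better, the larger-the-better form applies. A minimal sketch, using hypothetical tool-life replicates (the abstract gives no raw data):

```python
import math

# Hypothetical tool-life results (minutes) for three replicates at one
# setting of an L9 orthogonal array; the study's actual data are not given.
tool_life = [42.0, 39.5, 44.1]

def sn_larger_is_better(values):
    """Taguchi signal-to-noise ratio for a larger-the-better response."""
    return -10.0 * math.log10(sum(1.0 / v ** 2 for v in values) / len(values))

ratio = sn_larger_is_better(tool_life)  # higher means more robust performance
```

In a full Taguchi study this ratio is computed for each row of the orthogonal array, and the parameter levels associated with the highest mean ratio are selected.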
Abstract:
2010 Mathematics Subject Classification: 65D18.
Abstract:
The Joint Research Centre (JRC) of the European Commission has developed, in consultation with many partners, the DOPA as a global reference information system to support decision making on protected areas (PAs) and biodiversity conservation. The DOPA brings together the World Database on Protected Areas with other reference datasets on species, habitats, ecoregions, threats and pressures, to deliver critical indicators at country level and PA level that can inform gap analyses, PA planning and reporting. These indicators are especially relevant to Aichi Targets 11 and 12, and have recently contributed to CBD country dossiers and capacity building on these targets. DOPA also includes eConservation, a new module that provides a means to share and search information on conservation projects, and thus allows users to see “who is doing what where”. So far over 5000 projects from the World Bank, GEF, CEPF, EU LIFE Programme, CBD LifeWeb Initiative and others have been included, and these projects can be searched in an interactive mapping interface based on criteria such as location, objectives, timeframe, budget, the organizations involved, target species etc. This seminar will provide an introduction to DOPA and eConservation, highlight how these services are used by the CBD and others, and include ample time for discussion.
Abstract:
Organizations can use the valuable tool of data envelopment analysis (DEA) to make informed decisions on developing successful strategies, setting specific goals, and identifying underperforming activities to improve the output or outcome of performance measurement. The Handbook of Research on Strategic Performance Management and Measurement Using Data Envelopment Analysis highlights the advantages of using DEA as a tool to improve business performance and identify sources of inefficiency in public and private organizations. These recently developed theories and applications of DEA will be useful for policymakers, managers, and practitioners in areas of sustainable development including the environment, agriculture, finance, and higher education.
Abstract:
Mexico's double transition—democratisation and internationalisation—offers a good case study to analyse the interaction between internationalisation processes and domestic developments during transitions to democracy. This article explains how the specific way in which Mexico linked with North America worked as a causal mechanism during the country's democratisation. In the end, an inadequate project of internationalisation—spearheaded by the North American Free Trade Agreement (NAFTA)—failed to fulfill its democratising potential.
Abstract:
The traditional way of understanding television content consumption and viewer reactions may be simply summarised: information about the program, viewing at airing time, and interpersonal discussion after the program. In today's digital media environment, owing to cross-media consumption and platform shifts, the pattern of audiovisual (traditionally television) content consumption is changing, and the viewer's journey differs across contents and platforms. Content is becoming independent of the platform, and the medium is increasingly in the hands of technologically empowered viewers. Our objective is to uncover how content expressly manufactured for television (series, reality shows, sports) can now be consumed via other platforms, and how and to what extent audiovisual content consumption is complemented or replaced by other forms (text, audio). In our exploratory research we identify the typical patterns of interaction and synergies of consumption across classical media content. We used a multimethodology qualitative research design with three phases: focus groups, online content analysis, and viewers' narratives. Overall, the Video Star stays alive, but has to deal with immediate reactions and has to change according to his or her audience's wishes.
Abstract:
This exploratory study of a classroom with mentoring and neutral e-mail was conducted in a public commuter state university in South Florida between January 1996 and April 1996. Sixteen males and 83 females from four graduate-level educational research classes participated in the study.

Two main hypotheses were tested. Hypothesis One was that students receiving mentoring e-mail messages would score significantly higher on an instrument measuring attitude toward educational research (ATERS) than those not receiving mentoring e-mail messages. Hypothesis Two was that students receiving mentoring e-mail would score significantly higher on objective exams covering the educational research material than those not receiving mentoring e-mail.

Results of factorial analyses of variance showed no significant differences between the treatment groups in achievement or in attitudes toward educational research. Introverts had lower attitudes and lower final exam grades in both groups, although introverts in the mentored group scored higher than introverts in the neutral group.

A t-test of the means of total response to e-mail from the researcher showed a significant difference between the mentored and neutral e-mail groups. Introverts responded more often than extraverts in both groups. Teacher effect was significant in determining class response to e-mail messages; responses were most frequent in the researcher's classes.

Qualitative analyses of the e-mail and course evaluation survey and of the content of e-mail messages received by the researcher were grouped into basic themes and discussed. A qualitative analysis of the e-mail and course evaluation survey revealed that students from both the neutral and mentoring e-mail groups appreciated teacher feedback. A qualitative analysis of the mentoring and neutral e-mail replies divided the responses into those pertaining to the class, such as test and research paper questions, and more personal items, such as problems in the class and personal happenings.

At this point in time, e-mail is not a standard way of communicating in classes in the college of education at this university. As this communication tool becomes more popular, it is anticipated that replications of this study will be warranted.
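The group comparison reported above rests on a t-test of mean e-mail responses between the mentored and neutral groups. A minimal sketch of such a comparison using Welch's t statistic, with hypothetical per-student response counts (the study's data are not reproduced here):

```python
import statistics

# Hypothetical counts of e-mail responses per student in each group;
# the abstract reports only that the difference was significant.
mentored = [5, 7, 6, 8, 4, 6]
neutral = [2, 3, 1, 4, 2, 3]

def t_statistic(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

t = t_statistic(mentored, neutral)  # compare against a t distribution for significance
```

A full analysis would also compute Welch-adjusted degrees of freedom and a p-value, e.g. via `scipy.stats.ttest_ind(mentored, neutral, equal_var=False)`.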
Abstract:
This study examines the congruency of planning between organizational structure and process through an evaluation and planning model known as the Micro/Macro Dynamic Planning Grid. The model compares day-to-day planning within an organization to planning imposed by organizational administration and accrediting agencies. A survey instrument was developed to assess the micro and macro sociological analysis elements utilized by an organization.

The Micro/Macro Dynamic Planning Grid consists of four quadrants. Each quadrant contains characteristics that reflect the interaction between the micro and macro elements of planning, objectives and goals within an organization. Over Macro/Over Micro, Quadrant 1, contains attributes that reflect a tremendous amount of action and ongoing adjustment, typical of an organization undergoing significant changes in leadership, program and/or structure. Over Macro/Under Micro, Quadrant 2, reflects planning characteristics found in large, bureaucratic systems with little regard given to the workings of their component parts. Under Macro/Over Micro, Quadrant 3, reflects the uncooperative, uncoordinated organization, one that contains a multiplicity of viewpoints, languages, objectives and goals. Under Macro/Under Micro, Quadrant 4, represents the worst case for any organization; the attributes of this quadrant are reactive, chaotic, non-productive and redundant.

There were three phases to the study: development of the initial instrument, pilot testing and item revision, and administration and assessment of the refined instrument. The survey instrument was found to be valid and reliable for the purposes and audiences described herein. To expand the applicability of the instrument to other organizational settings, the survey was administered to three professional colleges within a university.

The first three specific research questions collectively answered, in the affirmative, the basic research question: can the Micro/Macro Dynamic Planning Grid be applied to an organization through an organizational development tool? The first specific question asked whether an instrument can be constructed that applies the Micro/Macro Dynamic Planning Grid. The second asked whether the constructed instrument is valid and reliable. The third asked whether an instrument that applies the Micro/Macro Dynamic Planning Grid assesses congruency of micro and macro planning, goals and objectives within an organization. The fourth specific research question, concerning differences in responses based on roles and responsibilities within an organization, involved statistical analysis of the response data and comparisons with the demographic data.
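Reliability claims for a survey instrument like the one above are commonly supported by an internal-consistency estimate such as Cronbach's alpha. A minimal sketch with hypothetical Likert responses (the study's actual data and reliability statistic are not given):

```python
import statistics

# Hypothetical Likert responses (rows = respondents, columns = survey items).
responses = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
]

def cronbach_alpha(rows):
    """Cronbach's alpha: internal consistency of a multi-item scale."""
    k = len(rows[0])
    items = list(zip(*rows))  # transpose to per-item columns
    item_var = sum(statistics.variance(col) for col in items)
    total_var = statistics.variance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_var / total_var)

alpha = cronbach_alpha(responses)  # values above ~0.7 are conventionally acceptable
```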
Abstract:
This dissertation examines whether there exist financial constraints and, if so, their implications for investment in research and development expenditures. It develops a theoretical model of credit rationing and research and development in which both are determined simultaneously and endogenously. The model provides a useful tool to examine different policies that may help alleviate the negative effect of financial constraints faced by firms. The empirical evidence presented deals with two different cases, namely the motor vehicle industry in Germany (1970-1990) and the electrical machinery industry in Spain (1975-1990). The innovation in the empirical analysis is that it follows a novel approach to identifying events that allow the effect of financial constraints on the determination of research and development to be isolated. Further, empirical evidence is presented to show that in these two cases financial constraints affect investment in physical capital as well. The empirical evidence supports the results of the theoretical model developed in this dissertation, showing that financial constraints negatively affect the rate of growth of innovation by reducing the intensity of research and development activity.
Abstract:
The integration of automation (specifically Global Positioning Systems (GPS)) and Information and Communications Technology (ICT) through the creation of a Total Jobsite Management Tool (TJMT) in construction contractor companies can revolutionize the way contractors do business. The key to this integration is the collection and processing of real-time GPS data produced on the jobsite for use in project management applications. This research study established the need for an effective planning and implementation framework to assist construction contractor companies in navigating the terrain of GPS and ICT use. An Implementation Framework was developed using the Action Research approach. The framework consists of three components: (i) an ICT Infrastructure Model, (ii) an Organizational Restructuring Model, and (iii) a Cost/Benefit Analysis. The conceptual ICT infrastructure model was developed to show decision makers within highway construction companies how to collect, process, and use GPS data for project management applications. The organizational restructuring model was developed to assist companies in the analysis and redesign of business processes, data flows, core job responsibilities, and their organizational structure in order to obtain the maximum benefit at the least cost in implementing GPS as a TJMT. A cost-benefit analysis that identifies and quantifies the costs and benefits (both direct and indirect) was performed in the study to clearly demonstrate the advantages of using GPS as a TJMT. Finally, the study revealed that in order to successfully implement a program to utilize GPS data as a TJMT, it is important for construction companies to understand the various implementation and transitioning issues that arise when adopting this new technology and business strategy.
In the study, Factors for Success were identified and ranked to allow a construction company to understand the factors that may contribute to or detract from the prospect for success during implementation. The Implementation Framework developed as a result of this study will serve to guide highway construction companies in the successful integration of GPS and ICT technologies for use as a TJMT.
Abstract:
This dissertation established a state-of-the-art programming tool for designing and training artificial neural networks (ANNs) and showed its applicability to brain research. The developed tool, called NeuralStudio, allows users without programming skills to conduct studies based on ANNs in a powerful and very user-friendly interface. A series of unique features has been implemented in NeuralStudio, such as ROC analysis, cross-validation, network averaging, topology optimization, and optimization of the activation functions' slopes. It also includes a Support Vector Machines module for comparison purposes. Once the tool was fully developed, it was applied to two studies in brain research. In the first study, the goal was to create and train an ANN to detect epileptic seizures from subdural EEG. This analysis involved extracting features from the spectral power in the gamma frequencies. In the second application, a unique method was devised to link EEG recordings to epileptic and non-epileptic subjects. The contribution of this method consisted of developing a descriptor matrix that can represent any EEG file regardless of its duration and the number of electrodes. The first study showed that the inter-electrode mean of the spectral power in the gamma frequencies, and its duration above a specific threshold, performs better than the other frequencies in seizure detection, exhibiting an accuracy of 95.90%, a sensitivity of 92.59%, and a specificity of 96.84%. The second study showed that Hjorth's activity parameter is sufficient to accurately relate EEG to epileptic and non-epileptic subjects: after testing, the accuracy, sensitivity and specificity of the classifier were all above 0.9667, and statistical tests established the superiority of activity at over 99.99% certainty.
It was demonstrated that (1) the spectral power in the gamma frequencies is highly effective in locating seizures from EEG and (2) activity can be used to link EEG recordings to epileptic and non-epileptic subjects. These two studies required a high computational load and could be addressed thanks to NeuralStudio. From a medical perspective, both methods proved the merits of NeuralStudio in brain research applications. For its outstanding features, NeuralStudio has recently been awarded a patent (US Patent No. 7502763).
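The seizure-detection figures quoted above (accuracy 95.90%, sensitivity 92.59%, specificity 96.84%) are the standard confusion-matrix metrics. A sketch of how they are computed, with counts chosen purely for illustration so that the ratios reproduce the reported percentages (the study's actual counts are not given):

```python
def detection_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # fraction of seizure epochs correctly flagged
    specificity = tn / (tn + fp)   # fraction of non-seizure epochs correctly passed
    return accuracy, sensitivity, specificity

# Illustrative counts only: 25 of 27 seizures detected, 92 of 95 non-seizures passed.
acc, sens, spec = detection_metrics(tp=25, fp=3, tn=92, fn=2)
```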
Abstract:
This dissertation introduces a novel automated book reader as an assistive technology tool for persons with blindness. The literature shows extensive work in the area of optical character recognition, but the methodologies currently available for the automated reading of books or bound volumes remain inadequate and are severely constrained during document scanning or image acquisition. The goal of the book reader design is to automate and simplify the task of reading a book while providing a user-friendly environment with a realistic but affordable system design. The design responds to the main concerns of (a) providing a method of image acquisition that maintains the integrity of the source, (b) overcoming optical character recognition errors created by inherent imaging issues such as curvature effects and barrel distortion, and (c) determining a suitable method for accurate recognition of characters that yields an interface able to read from any open book with a high reading accuracy nearing 98%. This research endeavor focuses initially on the development of an assistive technology tool to help persons with blindness in the reading of books and other bound volumes. Its secondary and broader aim is to provide a platform for the digitization of bound documentation, in line with the mission of the Open Content Alliance (OCA), a nonprofit alliance aimed at making reading materials available in digital form. The theoretical contribution of this research lies in the mathematical developments made to resolve both the inherent distortions due to the properties of the camera lens and the anticipated distortions of the changing page curvature as one leafs through the book. This is evidenced by the significant increase in the character recognition rate and a high-accuracy read-out through text-to-speech processing. This reasonably priced interface, with its high performance and its compatibility with any computer or laptop through universal serial bus connectors, greatly extends the prospects for universal accessibility to documentation.
Abstract:
The safety of workers in nighttime roadway work zones has become a major concern for state transportation agencies due to the increase in the number of work zone fatalities. During the last decade, several studies have focused on improving safety in nighttime roadway work zones, but the element still missing is a set of tools for translating the research results into practice. This paper discusses: 1) the importance of translating research results related to worker safety and the safety planning of nighttime work zones into practice, and 2) examples of tools that can be used for translating the results of such studies into practice. A tool that can propose safety recommendations for nighttime work zones and a web-based safety training tool for workers are presented in this paper. The tools were created as a component of a five-year research study on the assessment of the safety of nighttime roadway construction. The objectives of both tools are explained, as well as their functionalities (i.e., what the tools can do for the users), their components (e.g., knowledge base, database, and interfaces), and their structures (i.e., how the components of the tools are organized to meet the objectives). Evaluations by the proposed users of each tool are also presented.
Abstract:
Expertise in physics has traditionally been studied in cognitive science, where physics expertise is understood through the difference between novice and expert problem-solving skills. The cognitive perspective creates only a partial model of physics expertise and does not take into account the development of physics experts in the natural context of research. This dissertation takes a social and cultural perspective of learning through apprenticeship to model the development of physics expertise among physics graduate students in a research group. I use a qualitative methodological approach, an ethnographic case study, to observe and video-record the common practices of graduate students in their weekly biophysics research group meetings. I recorded observation notes and conducted interviews with all participants of the biophysics research group over a period of eight months. I apply the theoretical framework of Communities of Practice to distinguish the cultural norms of the group that cultivate expert physics practices. Results indicate that physics expertise is specific to a topic or subfield and is established through effectively publishing research in the larger biophysics research community. The participating biophysics research group follows a learning trajectory in which its students contribute to research and learn to communicate it in the larger biophysics community. Along this trajectory, students develop expert member competencies, learning to communicate their research and to follow the standards and trends of research in the larger research community. Findings from this dissertation expand the model of physics expertise beyond the cognitive realm and add the social and cultural nature of physics expertise development. This research also addresses ways to increase physics graduate students' progress toward the Ph.D. and decrease the 48% attrition rate of physics graduate students. Cultivating effective research experiences that give graduate students agency and autonomy beyond their research groups gives students the motivation to finish graduate school and establish their physics expertise.
Abstract:
The purpose of this study is to identify research trends in merger and acquisition (M&A) waves in the restaurant industry and to propose future research directions by thoroughly reviewing the existing M&A-related literature. M&A has been extensively used as a strategic management tool for fast growth in the restaurant industry. However, there is a very limited amount of literature that focuses on M&A in the restaurant industry. In particular, no known study has examined M&A waves and their determinants. A good understanding of the determinants of M&A waves will help practitioners identify important factors that should be considered before making M&A decisions and predict the optimal timing for successful M&A transactions. This study examined the literature on six U.S. M&A waves and their determinants and summarized the main explanatory factors examined, the statistical methods used, and the theoretical frameworks applied. The inclusion of macroeconomic factors unique to the restaurant industry and the use of factor analysis are suggested for future research.