965 results for Matlab (Computer programs)
Abstract:
Objective: The aim of this study was to compare the factors of adherence to physical activity between subjects attending a cardiac rehabilitation program and subjects who had withdrawn from that program, using the Transtheoretical Model of behavior change. Methods: We conducted an observational, cross-sectional study with a sample of 33 individuals (15 currently participating in the Cardiac Rehabilitation Program and 18 who no longer attended it), with the questionnaires delivered in person or sent by mail. For data analysis, we used the computer program SPSS® version 16.0. The significance level was set at 0.05. Results: There were no significant differences in the Stages of Change, Self-efficacy, Decisional Balance and Processes of Change between the two groups. We obtained high Spearman correlations between Stages of Change and Self-efficacy (rs = 0.778) and the Pros (rs = 0.764) and Cons (rs = -0.744) of the Decisional Balance. However, there was no significant evidence of a correlation between Stages of Change and the experiential (p = 0.465) and behavioral (p = 0.300) processes of change. A relationship was found, in terms of proportions, between physical activity performed within or outside a Cardiac Rehabilitation Program and age (p = 0.003), occupation (p = 0.010) and the entity paying the costs of the program (p = 0.027). Conclusion: It was concluded that perceived self-efficacy and the Pros and Cons of the Decisional Balance are related to adherence to physical activity. The results also indicate that age, occupation and the entity paying the costs of the program influence dropout from Cardiac Rehabilitation Programs.
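A minimal illustration of the kind of analysis described above, written in Python rather than the SPSS package used in the study; the scores below are synthetic placeholders, not the study's data.

import numpy as np
from scipy.stats import spearmanr

# Hypothetical ordinal stage-of-change scores and continuous self-efficacy scores
rng = np.random.default_rng(0)
stage_of_change = rng.integers(1, 6, size=33)            # 33 subjects, mirroring the sample size
self_efficacy = stage_of_change + rng.normal(0, 1, 33)   # placeholder questionnaire scores

rho, p_value = spearmanr(stage_of_change, self_efficacy)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.3f}")     # compared against the 0.05 threshold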
Abstract:
We propose a fractional model for computer virus propagation. The model includes the interaction between computers and removable devices. We simulate the model numerically for distinct values of the order of the fractional derivative and for two sets of initial conditions adopted in the literature. We conclude that fractional-order systems reveal richer dynamics than their classical integer-order counterparts. In particular, fractional dynamics leads to time responses with super-fast transients and super-slow evolutions towards the steady state, effects not easily captured by integer-order models.
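As a hedged sketch of how such a fractional-order model can be simulated, the following Python code applies an explicit Grünwald-Letnikov scheme to an illustrative susceptible/infected/removable-device system; the equations, parameter values and initial conditions are stand-ins, not those of the paper.

import numpy as np

alpha = 0.9            # order of the fractional derivative (1.0 recovers the classical model)
h, T = 0.01, 50.0      # step size and time horizon
n_steps = int(T / h)

beta, gamma, delta = 0.5, 0.1, 0.2   # hypothetical infection/recovery/device rates

def f(y):
    """Right-hand side: susceptible computers S, infected computers I, infected devices D."""
    S, I, D = y
    return np.array([
        -beta * S * I - delta * S * D,              # computers infected by computers/devices
        beta * S * I + delta * S * D - gamma * I,   # infected computers
        0.5 * gamma * I - 0.1 * D,                  # devices pick up / lose the infection
    ])

# Grünwald-Letnikov coefficients c_j
c = np.empty(n_steps + 1)
c[0] = 1.0
for j in range(1, n_steps + 1):
    c[j] = (1.0 - (1.0 + alpha) / j) * c[j - 1]

y = np.zeros((n_steps + 1, 3))
y[0] = [0.9, 0.1, 0.0]                              # illustrative initial conditions
for n in range(1, n_steps + 1):
    memory = c[1:n + 1][:, None] * y[n - 1::-1]     # weighted history: the memory effect
    y[n] = f(y[n - 1]) * h**alpha - memory.sum(axis=0)

print("final state (S, I, D):", y[-1])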
Abstract:
Currently, due to the widespread use of computers and the internet, students are trading libraries for the World Wide Web and laboratories for simulation programs. In most courses, simulators are made available to students and can be used to verify theoretical results or to test a hardware product under development. Although this is an attractive solution (low cost and an easy, fast way to carry out some course work), it has major disadvantages. As everything is currently done with or in a computer, students are losing the "feel" for the real values of physical magnitudes. In engineering studies, for instance, and mainly in the first years, students need to learn electronics, algorithms, mathematics and physics. All of these areas can use numerical analysis software, simulation software or spreadsheets, and in the majority of cases the data used are either simulated or random numbers, whereas real data could be used instead. For example, if a course uses numerical analysis software and needs a dataset, the students can learn to manipulate arrays. Also, when using spreadsheets to build graphics, instead of using a random table, students could use a real dataset based, for instance, on the room temperature and its variation across the day. In this work we present a framework with a simple interface that can be used by different courses in which the computer is part of the teaching/learning process, in order to give a more realistic feeling to students by using real data. The framework is based on a set of low-cost sensors for different physical magnitudes, e.g. temperature, light and wind speed, which are either connected to a central server that students access over an Ethernet protocol or connected directly to the student's computer/laptop. These sensors use the communication ports available, such as serial ports, parallel ports, Ethernet or Universal Serial Bus (USB). Since a central server is used, students are encouraged to use the sensor values in their different courses and consequently in different types of software, such as numerical analysis tools, spreadsheets, or simply inside any programming language whenever a dataset is needed. To this end, small pieces of hardware were developed, each containing at least one sensor and using different types of computer communication. As long as the sensors are attached to a server connected to the internet, these tools can also be shared between different schools; sensors that are not available at a given school can be used by getting the values from other places that share them. Moreover, students in the more advanced years, with (theoretically) more know-how, can use courses that have some affinity with electronic development to build new sensor modules and expand the framework further. The final solution is very interesting, low cost and simple to develop, allowing flexibility of resources by using the same materials in several courses and bringing real-world data into the students' computer work.
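As a hedged sketch of the central-server idea (not the authors' implementation), the Python snippet below exposes the latest readings of a couple of stand-in sensors over HTTP so that spreadsheets, numerical tools or student programs could fetch them; read_sensor() is a placeholder for the real serial/USB/Ethernet acquisition, and the names and JSON layout are assumptions.

import json
import random
from http.server import BaseHTTPRequestHandler, HTTPServer

def read_sensor(name):
    """Placeholder for a real serial/USB/Ethernet sensor read."""
    simulated = {"temperature": 21.0 + random.random(), "light": 300 + 50 * random.random()}
    return simulated[name]

class SensorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Return the current value of every known sensor as JSON
        payload = {name: read_sensor(name) for name in ("temperature", "light")}
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Students (or other schools sharing the sensors) fetch http://<server>:8000/
    HTTPServer(("0.0.0.0", 8000), SensorHandler).serve_forever()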
Abstract:
Currently, the teaching-learning process in domains such as computer programming is characterized by an extensive curriculum and a high enrolment of students. This poses a great workload for the faculty and teaching assistants responsible for the creation, delivery and assessment of student exercises. The main goal of this chapter is to foster practice-based learning in complex domains. This objective is attained with an e-learning framework, called Ensemble, as a conceptual tool to organize and facilitate technical interoperability among services. The Ensemble framework is applied to a specific domain: computer programming. Content issues are tackled with a standard format for describing programming exercises as learning objects. Communication is achieved by extending existing specifications for interoperation with several systems typically found in an e-learning environment. In order to evaluate the acceptability of the proposed solution, an Ensemble instance was validated in a classroom experiment, with encouraging results.
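Purely as an illustration of describing a programming exercise as a learning object (Ensemble's actual exercise format is defined in the cited work; the field names below are assumptions), a minimal Python sketch:

from dataclasses import dataclass, field

@dataclass
class ProgrammingExercise:
    identifier: str
    title: str
    statement: str                                   # problem statement shown to the student
    test_cases: list = field(default_factory=list)   # (stdin, expected_stdout) pairs
    metadata: dict = field(default_factory=dict)     # e.g. difficulty, topic, author

exercise = ProgrammingExercise(
    identifier="ex-001",
    title="Sum of two integers",
    statement="Read two integers and print their sum.",
    test_cases=[("2 3", "5"), ("10 -4", "6")],
    metadata={"difficulty": "introductory", "topic": "I/O and arithmetic"},
)
print(exercise.title, "-", len(exercise.test_cases), "test cases")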
Abstract:
Teaching and learning computer programming is as challenging as it is difficult. Assessing the work of students and providing individualised feedback to all of them is time-consuming and error-prone for teachers and frequently involves a time delay. Existing tools and specifications prove insufficient in complex evaluation domains where there is a greater need for practice. At the same time, Massive Open Online Courses (MOOCs) are emerging, revealing a new way of learning that is more dynamic and more accessible. However, this new paradigm raises serious questions regarding the monitoring of student progress and the timeliness of feedback. This paper provides a conceptual design model for a computer programming learning environment. This environment follows the portal interface design model, gathering information from a network of services such as repositories and program evaluators. The design model also includes integration with learning management systems, a central piece in the MOOC realm, endowing the model with characteristics such as scalability, collaboration and interoperability. The model is not limited to the domain of computer programming and can be adapted to any complex area that requires systematic evaluation with immediate feedback.
Abstract:
This document describes the project developed in the Thesis and Dissertation curricular unit during the 2nd year of the Master's in Electrical and Computer Engineering, Automation and Systems branch, at the Department of Electrical Engineering (DEE) of the Instituto Superior de Engenharia do Porto (ISEP). The chosen project was based on the use of neural network technology for implementation in control systems. It was first necessary to study this technology, to understand how it emerged and how it is structured, and finally to review some case studies where neural networks have been applied successfully. Regarding the implementation, different control structures were considered, among which the stabilizing control system and the adaptive reference system were chosen. However, since the objective of this work is to study performance when neural networks are applied, they are not used as the only controller. The analysis presented in this work seeks to understand to what extent the introduction of neural networks improves the control of a process. Therefore, the control systems used must contain at least one neural network and one PID controller. The performance tests are applied to the control of a DC motor and are carried out using the MATLAB software. The simulations have different configurations so that conclusions can be drawn that are as general as possible. Thus, the control systems are simulated for two different types of input, with and without the addition of noise in the sensor. Finally, the responses of each implemented system are analysed and their performance indices are computed.
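A minimal sketch of the baseline configuration described above (a PID controller driving a simplified DC motor model with optional sensor noise), written in Python rather than the MATLAB used in the dissertation; the neural-network component is not reproduced here, and the motor parameters and gains are illustrative assumptions.

import numpy as np

dt, T = 0.001, 2.0
n = int(T / dt)

# First-order approximation of the motor speed dynamics: tau * dw/dt = -w + K * u
K, tau = 2.0, 0.5

# Illustrative PID gains
Kp, Ki, Kd = 5.0, 8.0, 0.05

w = 0.0                       # motor speed
integral, prev_err = 0.0, 0.0
setpoint = 1.0                # constant (step) speed reference

speeds = np.empty(n)
for k in range(n):
    noise = np.random.normal(0.0, 0.01)        # optional sensor noise, as in the tests
    err = setpoint - (w + noise)
    integral += err * dt
    derivative = (err - prev_err) / dt
    u = Kp * err + Ki * integral + Kd * derivative
    prev_err = err
    w += dt * (-w + K * u) / tau               # Euler step of the motor model
    speeds[k] = w

# Simple performance index (integral of absolute error) for comparing configurations
print("IAE:", np.sum(np.abs(setpoint - speeds)) * dt)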
Abstract:
To ensure enduring success, firms need to generate economic value with respect for the environment, as well as social value. They also need to be aware of the needs and expectations of relevant stakeholders and incorporate them in their business strategies and programs. These challenges imply that engineers should take into consideration societal, health and safety, environmental and commercial issues in their professional activity. This investigation assesses the influence of firms' environmental management programs and community involvement programs on their own employees and on the community, with a focus on small and medium companies. Based on quantitative research, the findings suggest that firms that invest both in environmental management programs and in community involvement programs have a higher involvement of their own employees with the community, while at the same time receiving more feedback (positive, but also negative) from the community, stressing the need to pay special attention to their communication policies.
Abstract:
Smart Grids (SGs) have emerged as the new paradigm for power system operation and management, being designed to include large amounts of distributed energy resources. This new paradigm requires new Energy Resource Management (ERM) methodologies considering different operation strategies and the existence of new management players, such as several types of aggregators. This paper proposes a methodology to facilitate coalitions between distributed generation units, giving rise to Virtual Power Players (VPPs), using a game theory approach. The proposed approach consists of analysing the classifications attributed by each VPP to the distributed generation units, as well as the contracts previously established by each player. The proposed classification model is based on fourteen parameters, including technical, economic and behavioural ones. Depending on the VPP's strategy, size and goals, each parameter has a different importance. VPPs can also manage other types of energy resources, such as storage units, electric vehicles, demand response programs, or even parts of the MV and LV distribution networks. A case study with twelve VPPs with different characteristics and one hundred and fifty real distributed generation units is included in the paper.
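A hedged sketch of the classification idea: each VPP weights a set of parameters according to its own strategy and ranks the distributed generation (DG) units accordingly. The three parameters, the weights and the scores below are illustrative assumptions; the proposed model uses fourteen technical, economic and behavioural parameters.

import numpy as np

# Rows: DG units; columns: normalised scores for (availability, price, contract history)
dg_scores = np.array([
    [0.9, 0.4, 0.8],
    [0.6, 0.9, 0.5],
    [0.7, 0.7, 0.9],
])

# Each VPP weights the parameters differently, depending on its size and goals
vpp_weights = {
    "large_vpp":  np.array([0.5, 0.2, 0.3]),    # values reliability
    "trader_vpp": np.array([0.2, 0.6, 0.2]),    # values price
}

for vpp, w in vpp_weights.items():
    ranking = dg_scores @ w                      # weighted classification per DG unit
    preferred = int(np.argmax(ranking))
    print(f"{vpp}: scores {np.round(ranking, 2)}, preferred DG unit #{preferred}")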
Abstract:
Further improvements in the implementation of demand response programs are needed in order to take full advantage of this resource, namely for participation in energy and reserve market products, requiring adequate aggregation and remuneration of small-sized resources. The present paper focuses on SPIDER, a demand response simulator that has been improved to include realistic power system simulation. To illustrate the simulator's capabilities, the paper proposes a methodology focusing on the aggregation of consumers and generators, providing adequate tools for the adoption of demand response programs by the involved players. The proposed methodology centres on a Virtual Power Player (VPP) that manages and aggregates the available demand response and distributed generation resources in order to satisfy the required electrical energy demand and reserve. The aggregation of resources is addressed through clustering algorithms, and the operation costs for the VPP are minimized. The presented case study is based on a set of 32 consumers and 66 distributed generation units, running over 180 distinct operation scenarios.
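As an illustration of the aggregation step (not the paper's implementation), the Python sketch below clusters synthetic consumer load profiles with k-means so that each cluster can be treated as a single aggregated resource by the VPP; the profiles, the number of clusters and the use of scikit-learn are assumptions.

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Hypothetical hourly profiles (kW) for 32 consumers over 24 hours
profiles = np.vstack([
    rng.normal(loc=peak, scale=0.3, size=(8, 24))
    for peak in (1.0, 2.5, 4.0, 6.0)                 # four rough consumption levels
])

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(profiles)
for cluster_id in range(4):
    members = np.where(kmeans.labels_ == cluster_id)[0]
    aggregated = profiles[members].sum(axis=0)       # the VPP's aggregated resource
    print(f"cluster {cluster_id}: {len(members)} consumers, peak {aggregated.max():.1f} kW")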
Abstract:
In this paper, we formulate the electricity retailers' short-term decision-making problem in a liberalized retail market as a multi-objective optimization model. Retailers with light physical assets, such as generation and storage units in the distribution network, are considered. Following advances in smart grid technologies, electricity retailers are becoming able to employ incentive-based demand response (DR) programs, in addition to their physical assets, to effectively manage the risks of market price and load variations. In this model, the DR scheduling is performed simultaneously with the dispatch of the generation and storage units. The ultimate goal is to find the optimal values of the hourly financial incentives offered to end-users. The proposed model considers the capacity obligations imposed on retailers by the grid operator. The profit-seeking retailer also aims to minimize peak demand, to avoid high capacity charges in the form of grid tariffs or penalties. The non-dominated sorting genetic algorithm II (NSGA-II), a fast and elitist multi-objective evolutionary algorithm, is used to solve the multi-objective problem. A case study is solved to illustrate the performance of the proposed methodology. Simulation results show the effectiveness of the model for designing incentive-based DR programs and indicate the efficiency of NSGA-II in solving the retailers' multi-objective problem.
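A minimal sketch of the non-dominated sorting at the heart of NSGA-II, applied to toy candidate incentive schedules evaluated on two conflicting objectives (total incentive cost and resulting peak demand); the objective functions and the assumed linear demand response are placeholders, not the retailer model of the paper.

import numpy as np

rng = np.random.default_rng(2)
population = rng.uniform(0.0, 10.0, size=(40, 24))   # 40 candidates, 24 hourly incentives

def objectives(incentives):
    """Toy objectives to minimise: (cost of incentives, resulting peak demand)."""
    base_load = 100.0 + 20.0 * np.sin(np.linspace(0, 2 * np.pi, 24))
    reduced = base_load - 2.0 * incentives           # assumed linear DR response
    return incentives.sum(), reduced.max()

objs = np.array([objectives(ind) for ind in population])

def dominates(a, b):
    return np.all(a <= b) and np.any(a < b)

# First non-dominated front (rank-1 solutions), as in NSGA-II's sorting step
front = [i for i in range(len(objs))
         if not any(dominates(objs[j], objs[i]) for j in range(len(objs)) if j != i)]
print("Pareto front size:", len(front))
print("trade-offs (incentive cost, peak demand):")
print(np.round(objs[front], 1))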
Abstract:
The first works on Computer-Supported Cooperative Work appeared in the second half of the 1980s, establishing an interdisciplinary research field focused on the role of the computer and of communication technologies in supporting group work (Ishii et al., 1994). When approaching this research area it becomes clear that the diversity of groups and of the tasks they must carry out, among other important factors, has to be taken into account. The implications of this diversity are discussed at the level of groupware interface design, where greater user involvement in the early stages appears to be necessary, and at the level of Group Decision Support Systems.
Abstract:
Based on the report for the "Project III" unit of the PhD programme on Technology Assessment, under the supervision of Prof. António B. Moniz. This report was also discussed at the 2nd Winter School on Technology Assessment held at Universidade Nova de Lisboa, Caparica Campus, Portugal, in December 2011.
Abstract:
Dissertation to obtain the Master's Degree in Biomedical Engineering
Abstract:
Computer graphics is a field that has grown considerably in recent years, in areas ranging from film to video games and animation; the progress has been so great that the resemblance to reality keeps increasing. Nowadays practically every film has effects generated through computer graphics, as do even simple television advertisements, not to mention the realism of today's video games. This study aims to show two alternatives in the world of computer graphics, and for that purpose two programs will be used: Blender and Unreal Engine. The scene in question will be modelled entirely from scratch and will be the same in both programs. Several renders of the scene will be produced in both programs, using different materials and different types of lighting, both in real time and offline, so as to show the various possible alternatives.
Abstract:
Dissertation to obtain the Master's Degree in Informatics Engineering