878 results for Condition Monitoring, Asset Management, Maintenance, Low Speed Machinery, Diagnostics
Abstract:
Increasing the efficiency of photovoltaic systems has been the object of various studies over the past few years. One way to increase the power extracted by a photovoltaic panel is solar tracking: moving the panel so that it follows the sun's path. One way to actuate the tracking system is with an electric induction motor, which should provide sufficient torque at low speed to ensure tracking accuracy. With voltage source inverters and logic devices that generate the appropriate switching, it is possible to obtain the torque and speed required for the system to operate. This paper proposes the implementation of an angular position sensor and a driver to be applied to a solar tracker built at the Power Electronics and Renewable Energies Laboratory at UFRN. The speed of the motor is varied by a voltage source inverter whose PWM switching commands are implemented on an FPGA (Field Programmable Gate Array) device and a TM4C microcontroller. A test platform with a 1.5 CV AC induction machine was assembled for the comparative tests. The angular position sensor of the panel is implemented on an ATmega328 microcontroller coupled to an accelerometer, on an Arduino prototyping board. The solar position is also calculated by the microcontroller from the geographic coordinates of the site where the tracker is installed and from the local time and date obtained from an RTC (Real-Time Clock) device. A prototype polar-axis solar tracker driven by a DC motor was assembled to verify the operation of the sensor and to check the tracking efficiency.
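The solar-position step described above (geographic coordinates plus RTC date and time) can be sketched in Python. The declination and hour-angle formulas below are standard textbook approximations; the function name and the simplifications (no equation of time, no atmospheric refraction) are illustrative assumptions, not the firmware described in the paper.

```python
import math
from datetime import datetime, timezone

def solar_elevation(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation angle (degrees) from latitude,
    longitude and UTC time, using Cooper's declination formula."""
    day = when_utc.timetuple().tm_yday
    # Solar declination (degrees), Cooper's approximation
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day) / 365.0))
    # Fractional UTC hour shifted by longitude -> approximate solar time
    hours = when_utc.hour + when_utc.minute / 60.0 + when_utc.second / 3600.0
    solar_time = hours + lon_deg / 15.0
    hour_angle = 15.0 * (solar_time - 12.0)  # degrees, 0 at solar noon
    lat = math.radians(lat_deg)
    d = math.radians(decl)
    h = math.radians(hour_angle)
    sin_elev = (math.sin(lat) * math.sin(d)
                + math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(math.asin(sin_elev))
```

A tracker controller would evaluate this periodically and drive the motor until the measured panel angle (from the accelerometer) matches the computed solar position.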
Abstract:
Date of Acceptance: 13/03/2015
Abstract:
The drag on a nacelle model was investigated experimentally and computationally to provide guidance and insight into the capabilities of RANS-based CFD. The research goal was to determine whether industry-constrained CFD could participate in the aerodynamic design of nacelle bodies. Key deliverables were the grid refinement level, turbulence model and near-wall treatment settings that predict drag to the highest accuracy. Cold-flow low-speed wind tunnel experiments were conducted at a Reynolds number of 6×10^5, 293 K and a Mach number of 0.1. Total drag force was measured by a six-component force balance. Detailed wake analysis, using a seven-hole pressure probe traverse, allowed for drag decomposition via the far-field method. Drag decomposition was performed through a range of angles of attack between 0° and 45°. Both methods agreed on total drag within their respective uncertainties. Reversed flow at the measurement plane and saturation of the load cell caused discrepancies at high angles of attack. A parallel CFD study was conducted using commercial software, ICEM 15.0 and FLUENT 15.0. Simulating a similar nacelle geometry operating under inlet boundary conditions obtained through wind tunnel characterization allowed for direct comparisons with experiment. It was determined that the Realizable k-ε model was best suited for drag prediction of this geometry. This model predicted the axial momentum loss and secondary flow in the wake, as well as the integrated surface forces, within experimental error up to 20° angle of attack. SST k-ω required additional surface grid resolution on the nacelle suction side, resulting in 15% more elements, due to separation point prediction sensitivity. It was further recommended to apply enhanced wall treatment to more accurately capture the viscous drag and separated flow structures. Overall, total drag was predicted within 5% at 0° angle of attack and 10% at 20°, each within experimental uncertainty.
What is more, the form and induced drag predicted by CFD and measured by the wake traverse were in good agreement, indicating that CFD captured the key flow features accurately despite the simplification of the nacelle interior geometry.
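The far-field (wake traverse) drag estimate mentioned above rests on the momentum-deficit integral D = ρ ∫ u (U∞ − u) dA over the measurement plane. A minimal numerical sketch, with an illustrative function name and a uniform cell area standing in for the real seven-hole-probe survey grid:

```python
import numpy as np

def wake_momentum_drag(u, dA, u_inf, rho=1.204):
    """Estimate profile drag (N) from a wake-plane axial velocity survey
    via the momentum-deficit integral: D = rho * sum(u * (U_inf - u) * dA).
    `u` is the axial velocity at each survey point (m/s), `dA` the cell
    area (m^2), `u_inf` the freestream velocity, `rho` the air density."""
    u = np.asarray(u, dtype=float)
    return rho * np.sum(u * (u_inf - u) * dA)
```

A uniform plane (u = U∞ everywhere) yields zero drag; any velocity deficit in the wake contributes positively, which is what the traverse measures.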
Abstract:
The need to steer economic development has always been great, and as a management model the balanced scorecard has been popular since the mid-1990s, mainly in the private sector but also in the municipal sector. The balanced scorecard was introduced primarily to help organizations see more than the economic dimension. Originally a measurement system, today it works more as a strategic instrument. Our study is a case study evaluating a municipality and how it uses the balanced scorecard as a tool for strategic and value-adding work in municipal activities. In municipal operations it is important that the organization adapts the balanced scorecard to the fact that it is a politically driven organization, with mandates, committees and administrations. We used a qualitative method with a deductive approach, gathering information through a case study in which we interviewed seven people in leading positions. In our analysis and results section, we came to the conclusion that the municipality does not use the balanced scorecard correctly. We also found that the balanced scorecard as a tool for value creation and strategic planning does not work in a favorable way. Our study shows difficulties with the implementation of the balanced scorecard. If the municipality had invested in implementing the balanced scorecard at all levels of the business, it would be able to use it more adequately. Because the municipality is a politically driven organization, it is important that the vision stays alive and changes based on conditions in the outside world and in the municipality in general. Judged by a living vision, goals and business ideas, its balanced scorecard is in line with how a balanced scorecard should look. The municipality has a strategic plan for its staff and employees at large.
In the study, we have seen that the strategic plan is not followed up well; rather than evaluating in a way favorable to the business, the municipality chooses the easy way out. Employee participation in changes and ongoing human resources management feels nonexistent, even though the vision has been to create empowered and motivated employees. In our conclusion, we describe how our study views the use of the balanced scorecard in municipal operations. We can also discern that a balanced scorecard as a tool for value creation and strategic work is good if it is used properly. We conclude that the municipality we chose to study should not use the balanced scorecard, as it has not created the tools and platforms required for employees, civil servants and politicians to evaluate, monitor and keep a living scorecard changing over time. The study reveals major shortcomings in implementation, evaluation and follow-up, and the consequence is that the balanced scorecard is not preferable in municipal operations as a strategic instrument for value creation and long-term planning.
Abstract:
[Excerpt] In a recent public relations document, the New York Stock Exchange defines its mission statement as to: “Support the capital-raising and asset management process by providing the highest quality and most cost-effective, self-regulated marketplace for the trading of financial instruments.” The common thread that runs through this and similar statements made by organized financial markets from Frankfurt to Tokyo is that they hold as their primary goals to help companies raise capital and to provide a liquid and efficient aftermarket for those securities.
Abstract:
Software architecture is a high-level description of a software-intensive system that enables architects to have better intellectual control over the complete system. It is also used as a communication vehicle among the various system stakeholders. Variability in software-intensive systems is the ability of a software artefact (e.g., a system, subsystem, or component) to be extended, customised, or configured for deployment in a specific context. Although variability in software architecture is recognised as a challenge in multiple domains, there has been no formal consensus on how variability should be captured or represented. In this research, we addressed the problem of representing variability in software architecture through a three-phase approach. First, we examined existing literature using the Systematic Literature Review (SLR) methodology, which helped us identify the gaps and challenges within the current body of knowledge. Equipped with the findings from the SLR, we formulated a set of design principles used to introduce variability management capabilities to an existing Architecture Description Language (ADL). The chosen ADL, ALI, was developed within our research group, and we have had complete access to it. Finally, we evaluated the new version of the ADL using two distinct case studies: one from the information systems domain, an Asset Management System (AMS); and another from the embedded systems domain, a Wheel Brake System (WBS). This thesis presents the main findings from the three phases of the research work, including a comprehensive study of the state of the art; the complete specification of an ADL focused on managing variability; and the lessons learnt from the evaluation work on two distinct real-life case studies.
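As an illustration of the kind of variability management such an ADL extension targets, the sketch below resolves variation points into a concrete component configuration for one deployment context. All names and the data model are hypothetical; the actual ALI syntax and semantics are defined in the thesis itself.

```python
from dataclasses import dataclass, field

@dataclass
class VariationPoint:
    """A named point in the architecture with a set of candidate variants."""
    name: str
    variants: list
    default: str

@dataclass
class Architecture:
    components: list = field(default_factory=list)        # fixed components
    variation_points: list = field(default_factory=list)  # points to resolve

    def configure(self, choices):
        """Bind each variation point to a chosen variant (or its default),
        yielding a fully resolved component list for one deployment context."""
        resolved = list(self.components)
        for vp in self.variation_points:
            choice = choices.get(vp.name, vp.default)
            if choice not in vp.variants:
                raise ValueError(f"{choice!r} is not a variant of {vp.name!r}")
            resolved.append(choice)
        return resolved
```

The same architecture description can then be configured once per context (e.g. an AMS deployment versus a WBS deployment) without duplicating the invariant parts.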
Abstract:
Internet users consume online targeted advertising based on information collected about them, and voluntarily share personal information in social networks. Sensor information and data from smartphones are collected and used by applications, sometimes in unclear ways. As happens today with smartphones, in the near future sensors will be shipped in all types of connected devices, enabling ubiquitous information gathering from the physical environment and realizing the vision of Ambient Intelligence. The value of gathered data, if not obvious, can be harnessed through data mining techniques and put to use in personalized and tailored services as well as business intelligence practices, fueling the digital economy. However, the ever-expanding gathering and use of information undermines the privacy conceptions of the past. Natural social practices of managing privacy in daily relations are overridden by socially awkward communication tools, service providers struggle with security issues resulting in harmful data leaks, governments use mass surveillance techniques, the incentives of the digital economy threaten consumer privacy, and the advancement of consumer-grade data-gathering technology enables new inter-personal abuses. A wide range of fields attempts to address technology-related privacy problems, but they vary immensely in terms of assumptions, scope and approach. Privacy of future use cases is typically handled vertically, instead of building upon previous work that can be re-contextualized, while current privacy problems are typically addressed per type in a more focused way. Because significant effort was required to make sense of the relations and structure of privacy-related work, this thesis attempts to transmit a structured view of it. It is multi-disciplinary - from cryptography to economics, including distributed systems and information theory - and addresses privacy issues of different natures.
As existing work is framed and discussed, the contributions to the state of the art made in the scope of this thesis are presented. The contributions add to five distinct areas: 1) identity in distributed systems; 2) future context-aware services; 3) event-based context management; 4) low-latency information flow control; 5) high-dimensional dataset anonymity. Finally, having laid out this landscape of privacy-preserving work, current and future privacy challenges are discussed, considering not only technical but also socio-economic perspectives.
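As one concrete instance of the anonymity problem named in contribution 5, the classic k-anonymity measure can be computed as below. This is an illustrative sketch of the baseline notion only; the thesis addresses high-dimensional datasets, where simple quasi-identifier grouping of this kind degenerates and stronger techniques are required.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k of a dataset: the size of the smallest group of records
    sharing the same quasi-identifier values. A record in a k-anonymous
    dataset is indistinguishable from at least k-1 other records."""
    groups = Counter(
        tuple(record[q] for q in quasi_identifiers) for record in records
    )
    return min(groups.values())
```

A dataset with k = 1 contains at least one record uniquely re-identifiable from its quasi-identifiers alone, which is the re-identification risk anonymization tries to bound.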
Abstract:
"Retention of human talent in times of change" has as its main objective to identify the most important retention factors at nine representative companies from Antioquia with a presence in Colombia and elsewhere in the world, and to approach these companies to learn the current state of their human talent management processes as an engine for strengthening retention and to find trends in the practices of these organizations. The study starts from a literature review that considers intrinsic and extrinsic variables (analyzing impacts on being and doing, respectively). Interviews were then conducted with the human management leaders of these companies, from which it can be concluded that: first, continuous, years-long work to strengthen the leadership of managers results in a working climate that enables the retention of employees; second, it is important to allow employees to develop within the organization, understanding the transition these companies are beginning to face given the need to hire young people who are motivated by high exposure and by taking on challenges, avoiding monotonous and repetitive work; third, aspects such as diversity and inclusion must be considered, along with the benefits the company offers and benefit packages that are increasingly dynamic and attempt to be fully flexible for all employees, with the salary factor playing a smaller role. It is made clear that each of the companies studied offers its employees competitive salaries in line with their positions, roles and sector, as well as an emotional salary that impacts them directly and also benefits their families. The study covered a population of nine companies: Grupo Argos, Grupo Bancolombia, Grupo EPM, Organización Corona, Protección, Servicios Nutresa, Sura Asset Management, Sofasa-Renault and Universidad Eafit. All of them have a high impact on the employment rate of Medellín and of Colombia, and are highly recognized in Colombia, Latin America and other parts of the world.
Abstract:
One of the most disputed matters in the theory of finance has been the theory of capital structure. The seminal contributions of Modigliani and Miller (1958, 1963) gave rise to a multitude of studies and debates. Since the initial spark, the financial literature has offered two competing theories of the financing decision: the trade-off theory and the pecking order theory. The trade-off theory suggests that firms have an optimal capital structure balancing the benefits and costs of debt. The pecking order theory approaches capital structure from an information asymmetry perspective and assumes a hierarchy of financing, with firms using internal funds first, followed by debt and, as a last resort, equity. This thesis analyses the trade-off and pecking order theories and their predictions on panel data consisting of 78 Finnish firms listed on the OMX Helsinki stock exchange. Estimations are performed for the period 2003-2012. The data are collected from the Datastream system and consist of financial statement data. A number of capital structure determinants are identified: firm size, profitability, growth opportunities, risk, asset tangibility, taxes, speed of adjustment and financial deficit. Regression analysis is used to examine the effects of these firm characteristics on capital structure. The regression models were formed based on the relevant theories. The general capital structure model is estimated with a fixed effects estimator. Dynamic models also play an important role in several areas of corporate finance, but the combination of fixed effects and lagged dependent variables complicates estimation. A dynamic partial adjustment model is therefore estimated using the Arellano and Bond (1991) first-differencing generalized method of moments, as well as the ordinary least squares and fixed effects estimators. The results for Finnish listed firms show support for the predicted effects of profitability, firm size and non-debt tax shields.
No conclusive support for the pecking order theory is found; however, its effect cannot be fully ignored, and it is concluded that, instead of being substitutes, the trade-off and pecking order theories appear to complement each other. For the partial adjustment model, the results show that Finnish listed firms adjust towards their target capital structure at a speed of 29% a year, using the book debt ratio.
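The partial adjustment model behind the 29% figure says a firm closes a fraction λ of the gap to its target debt ratio each year: D_t = (1 - λ) D_{t-1} + λ D*_t + e_t, so λ is one minus the coefficient on lagged debt. The pooled-OLS sketch below on a single synthetic series is illustrative only; the thesis estimates the model on real panel data with fixed effects and the Arellano and Bond (1991) first-difference GMM, precisely because OLS with lagged dependent variables and firm effects is biased.

```python
import numpy as np

def adjustment_speed(debt):
    """Estimate the speed of adjustment `lambda` from the partial adjustment
    model D_t = (1 - lambda) * D_{t-1} + lambda * D_target + e_t by
    regressing D_t on D_{t-1} with an intercept (pooled OLS sketch)."""
    debt = np.asarray(debt, dtype=float)
    y = debt[1:]                       # D_t
    x = debt[:-1]                      # D_{t-1}
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1.0 - beta[1]               # lambda = 1 - coefficient on lagged debt
```

On panel data, the intercept term absorbs λ·D*, so a firm-specific target requires firm fixed effects, which is what motivates the GMM estimator in the thesis.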
Abstract:
The aim of this thesis was threefold: first, to compare current player tracking technology in a single game of soccer; second, to investigate the running requirements of elite women's soccer, in particular the use and application of athlete tracking devices; and finally, to examine how game style can be quantified and defined. Study One compared four different match analysis systems commonly used in both research and applied settings: video-based time-motion analysis, a semi-automated multiple-camera-based system, and two commercially available Global Positioning System (GPS) based player tracking systems at 1 Hertz (Hz) and 5 Hz respectively. A comparison was made between the systems when recording the same game. Total distance covered during the match ranged from 10 830 ± 770 m (semi-automated multiple-camera-based system) to 9 510 ± 740 m (video-based time-motion analysis). At running speeds categorised as high-intensity running (>15 km·h-1), the semi-automated multiple-camera-based system reported the highest distance of 2 650 ± 530 m, with video-based time-motion analysis reporting the least distance covered, 1 610 ± 370 m. At speeds considered to be sprinting (>20 km·h-1), video-based time-motion analysis reported the highest value (420 ± 170 m) and the 1 Hz GPS units the lowest value (230 ± 160 m). These results demonstrate that there are differences in the determination of absolute distances, and that comparison of results between match analysis systems should be made with caution. Currently, there is no criterion measure for these match analysis methods, and as such it was not possible to determine whether one system was more accurate than another. Study Two provided an opportunity to apply player-tracking technology (GPS) to measure activity profiles and determine the physical demands of Australian international-level women soccer players.
In four international women's soccer games, data were collected on a total of 15 Australian women soccer players using a 5 Hz GPS-based athlete tracking device. Results indicated that Australian women soccer players covered 9 140 ± 1 030 m during 90 min of play. The total distance covered by Australian women was less than the 10 300 m reportedly covered by female soccer players in the Danish First Division. However, there was no apparent difference in estimated maximal oxygen uptake (V̇O2max), as measured by multi-stage shuttle tests, between these studies. This study suggests that contextual information, including the "game style" of both the team and the opposition, may influence physical performance in games. Study Three examined the effect the level of the opposition had on the physical output of Australian women soccer players. In total, 58 game files from 5 Hz athlete-tracking devices from 13 international matches were collected. These files were analysed to examine relationships between physical demands, represented by total distance covered, high-intensity running (HIR) and distance covered sprinting, and the level of the opposition, as represented by the Fédération Internationale de Football Association (FIFA) ranking at the time of the match. Higher-ranking opponents elicited less high-speed running and greater low-speed activity compared to playing teams of similar or lower ranking. The results are important to coaches and practitioners in the preparation of players for international competition, and showed that the physical demands differed depending on the level of the opponents. The results also highlighted the need for continued research into integrating contextual information in team sports, and demonstrated that soccer can be described as a dynamic and interactive system. The influence of playing strategy, tactics and, subsequently, overall game style was highlighted as playing a significant part in the physical demands of the players.
Study Four explored the concept of game style in field sports such as soccer. The aim of this study was to provide an applied framework with suggested metrics for use by coaches, media, practitioners and sports scientists. Based on the findings of Studies 1-3 and a systematic review of the relevant literature, a theoretical framework was developed to better understand how a team's game style could be quantified. Soccer games can be broken into key moments of play, and for each of these moments metrics were categorised that provide insight into success or otherwise, to help quantify and measure different playing styles. This study highlights that, to date, there has been no clear definition of game style in team sports, and as such a novel definition of game style is proposed that can be used by coaches, sport scientists, performance analysts, the media and the general public. Studies 1-3 outline four common methods of measuring the physical demands of soccer: video-based time-motion analysis, GPS at 1 Hz and at 5 Hz, and semi-automated multiple-camera-based systems. As there are no semi-automated multiple-camera-based systems available in Australia, primarily for cost and logistical reasons, GPS is widely accepted for tracking player movements in team sport training and competition environments. This research identified that, although there are some limitations, GPS player-tracking technology may be a valuable tool in assessing the running demands placed on soccer players and can subsequently contribute to our understanding of game style. The results of the research also reinforce the differences between methods used to analyse player movement patterns in field sports such as soccer, and demonstrate that the results from different systems, such as GPS-based athlete tracking devices and semi-automated multiple-camera-based systems, cannot be used interchangeably.
Indeed, the magnitude of the differences between methods suggests that significant measurement error is evident. This was apparent even when the same technology was used at different sampling rates, as with GPS systems measuring at either 1 Hz or 5 Hz. It was also recognised that other factors influence how team sport athletes behave within an interactive system. These factors included the strength of the opposition and their style of play. In turn, these can impact the physical demands on players, which change from game to game, and even within games, depending on these contextual features. Finally, the concept of game style and how it might be measured was examined. Game style was defined as "the characteristic playing pattern demonstrated by a team during games. It will be regularly repeated in specific situational contexts such that measurement of variables reflecting game style will be relatively stable. Variables of importance are player and ball movements, interaction of players, and will generally involve elements of speed, time and space (location)".
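The speed-threshold accumulation used throughout Studies 1-3 (total distance, high-intensity running above 15 km·h-1, sprinting above 20 km·h-1) can be sketched for a 5 Hz device. The function name and the simple speed-times-interval integration are illustrative assumptions; commercial systems apply their own filtering and distance algorithms, which is one source of the between-system differences reported above.

```python
def movement_demands(speeds_kmh, hz=5, hir_kmh=15.0, sprint_kmh=20.0):
    """Accumulate total, high-intensity-running and sprint distances (m)
    from a stream of instantaneous speeds (km/h) sampled at `hz` Hz."""
    dt = 1.0 / hz
    total = hir = sprint = 0.0
    for v in speeds_kmh:
        step = v / 3.6 * dt          # km/h -> m/s, times the sample interval
        total += step
        if v > hir_kmh:
            hir += step
        if v > sprint_kmh:
            sprint += step
    return total, hir, sprint
```

Changing the sampling rate or the thresholds changes all three outputs, which is why distances from 1 Hz and 5 Hz devices cannot be compared directly.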
Abstract:
Wydział Biologii (Faculty of Biology)
Abstract:
In Australia, along with many other parts of the world, fumigation with phosphine is a vital component in controlling stored grain insect pests. However, resistance is a factor that may limit the continued efficacy of this fumigant. While strong resistance to phosphine has been identified and characterised, very little information is available on the causes of its development and spread. Data obtained from a unique national resistance monitoring and management program were analysed, using Bayesian hurdle modelling, to determine which factors may be responsible. Fumigation in unsealed storages, combined with a high frequency of weak resistance, were found to be the main criteria that led to the development of strong resistance in Sitophilus oryzae. Independent development, rather than gene flow via migration, appears to be primarily responsible for the geographic incidence of strong resistance to phosphine in S. oryzae. This information can now be utilised to direct resources and education into those areas at high risk and to refine phosphine resistance management strategies.
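A hurdle model of the kind used here separates two questions: does a population show any resistance at all, and, given that it does, how frequent is it. The maximum-likelihood sketch below fits a simple Poisson hurdle to count data; it is illustrative of the model family only, since the study fits a Bayesian hurdle regression with covariates such as storage sealing and weak-resistance frequency.

```python
import math

def hurdle_fit(counts):
    """Fit a simple Poisson hurdle model: p = P(count > 0), and, for the
    positive counts, the rate `lam` of a zero-truncated Poisson, obtained
    by fixed-point iteration on mean = lam / (1 - exp(-lam))."""
    n = len(counts)
    positives = [c for c in counts if c > 0]
    p = len(positives) / n
    if not positives:
        return p, 0.0
    mean_pos = sum(positives) / len(positives)
    lam = mean_pos                     # starting guess
    for _ in range(100):
        lam = mean_pos * (1.0 - math.exp(-lam))
    return p, lam
```

The hurdle part (p) models the development of resistance at all, while the truncated count part (lam) models its intensity once present, letting the two processes respond to different covariates.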
Abstract:
International audience