928 results for Residual-Based Cointegration Test
Abstract:
The aims of this study were 1) to determine the relationship between performance on the court-based TIVRE-Basket® test and peak aerobic power determined from a criterion lab-based incremental treadmill test, and 2) to examine the test-retest reliability of the TIVRE-Basket® test in elite male basketball players. To address aim 1, 36 elite male basketball players (age 25.2 ± 4.7 years, weight 94.1 ± 11.4 kg, height 195.83 ± 9.6 cm) completed a graded treadmill exercise test and the TIVRE-Basket® test within 72 hours. Mean distance recorded during the TIVRE-Basket® test was 4001.8 ± 176.4 m, mean VO2 peak was 54.7 ± 2.8 ml·kg⁻¹·min⁻¹, and the correlation between the two parameters was r = 0.824 (P < 0.001). Linear regression analysis identified TIVRE-Basket® distance (m) as the only unique predictor of VO2 peak in a single-variable-plus-constant model: VO2 peak = 2.595 + (0.013 × TIVRE-Basket® distance (m)). Performance on the TIVRE-Basket® test accounted for 67.8% of the variance in VO2 peak (t = 8.466, P < .001, 95% CI 0.01–0.016, SEE 1.61). To address aim 2, 20 male basketball players (age 26.7 ± 4.2 years; height 1.94 ± 0.92 m; weight 94.0 ± 9.1 kg) performed the TIVRE-Basket® test on two occasions. There was no significant difference in total distance covered between Trial 1 (4138.8 ± 677.3 m) and Trial 2 (4188.0 ± 648.8 m; t = 0.5798, P = 0.5688). Mean difference between trials was 49.2 ± 399.5 m, with an ICC of 0.85 suggesting a moderate level of reliability. Standardised TEM was 0.88%, representing a moderate degree of trial-to-trial error, and the CV was 6.3%. The TIVRE-Basket® test therefore represents a valid and moderately reliable court-based, sport-specific test of aerobic power for use with individuals and teams of elite-level male basketball players. Future research is required to ascertain its validity and reliability in other basketball populations, e.g. across age groups, at different levels of competition, in females, and in different forms of the game such as wheelchair basketball.
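As a quick sanity check of the regression equation reported above, here is a minimal sketch using only the figures quoted in the abstract (the helper name is illustrative, not from the study):

```python
# Illustrative check of the reported regression model:
# VO2 peak = 2.595 + 0.013 * TIVRE-Basket distance (m)

def predict_vo2_peak(distance_m: float) -> float:
    """Predicted VO2 peak (ml.kg-1.min-1) from TIVRE-Basket test distance."""
    return 2.595 + 0.013 * distance_m

# At the reported mean distance of 4001.8 m the model predicts ~54.6,
# in line with the reported mean VO2 peak of 54.7 ml.kg-1.min-1.
print(predict_vo2_peak(4001.8))
```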
Abstract:
Master's thesis in Technological Chemistry, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2016
Abstract:
Thesis (Ph.D.)--University of Washington, 2015
Abstract:
Congestion management of transmission power systems has achieved high relevance in competitive environments, which require an adequate approach in both technical and economic terms. This paper proposes a new methodology for congestion management and transmission tariff determination in deregulated electricity markets. The congestion management methodology is based on a reformulated optimal power flow whose main goal is to obtain a feasible re-dispatch solution that minimizes the changes to the transactions resulting from market operation. The proposed transmission tariffs consider the physical impact caused by each market agent on the transmission network. The final tariff accounts for existing system costs as well as costs due to the initial congestion situation and losses. The paper includes a case study based on the IEEE 118-bus test case.
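The paper's reformulated optimal power flow is not reproduced here, but a minimal sketch of the underlying idea (minimize generation adjustments subject to a line-flow limit under an assumed DC/PTDF approximation, with made-up data and sensitivities) could look like:

```python
# Minimal re-dispatch sketch (not the paper's formulation): minimize the total
# generation adjustment needed to bring an overloaded line within its limit,
# using a DC (PTDF) approximation and illustrative data.
import numpy as np
from scipy.optimize import linprog

ptdf = np.array([0.6, -0.2, 0.1])    # assumed line-flow sensitivities to each generator
p0 = np.array([100.0, 80.0, 50.0])   # market-cleared dispatch (MW), illustrative
flow0, limit = 70.0, 60.0            # current flow exceeds the 60 MW line limit
n = len(p0)

# Variables: up/down adjustments du_i, dd_i >= 0; objective: minimize total change.
c = np.ones(2 * n)
A_ub = np.hstack([ptdf, -ptdf]).reshape(1, -1)   # flow0 + ptdf @ (du - dd) <= limit
b_ub = np.array([limit - flow0])
A_eq = np.hstack([np.ones(n), -np.ones(n)]).reshape(1, -1)  # balance: sum(du) == sum(dd)
b_eq = np.array([0.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (2 * n))
delta = res.x[:n] - res.x[n:]
print("Adjustments (MW):", delta, "-> new flow:", flow0 + ptdf @ delta)
```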
Abstract:
In recent decades, all over the world, competition in the electric power sector has deeply changed the way this sector's agents play their roles. In most countries, deregulation of the electricity sector was conducted in stages, beginning with the clients of higher voltage levels and with larger electricity consumption, and later extended to all electrical consumers. The sector liberalization and the operation of competitive electricity markets were expected to lower prices and improve quality of service, leading to greater consumer satisfaction. Transmission and distribution remain noncompetitive business areas, due to the large infrastructure investments required. However, the industry has yet to clearly establish the best business model for transmission in a competitive environment. After generation, the electricity needs to be delivered to the electrical system nodes where demand requires it, taking into consideration transmission constraints and electrical losses. If the amount of power flowing through a certain line is close to or surpasses the safety limits, then cheap but distant generation might have to be replaced by more expensive closer generation to relieve the excess power flows. In a congested area, the optimal price of electricity rises to the marginal cost of the local generation or to the level needed to ration demand to the amount of available electricity. Even without congestion, some power will be lost in the transmission system through heat dissipation, so prices reflect that it is more expensive to supply electricity at the far end of a heavily loaded line than close to a generation source. Locational marginal pricing (LMP), resulting from bidding competition, represents electrical and economic values at nodes or in areas that may provide economic indicator signals to the market agents. This article proposes a data-mining-based methodology that helps characterize zonal prices in real power transmission networks. To test our methodology, we used an LMP database from the California Independent System Operator for 2009 to identify economic zones. (CAISO is a nonprofit public benefit corporation charged with operating the majority of California's high-voltage wholesale power grid.) To group the buses into typical classes, each representing a set of buses with approximately the same LMP value, we used two-step and k-means clustering algorithms. By analyzing the various LMP components, our goal was to extract knowledge to support the ISO in investment and network-expansion planning.
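A minimal sketch of the clustering step described above, using k-means on synthetic per-bus LMP values rather than the CAISO 2009 database:

```python
# Illustrative sketch: group buses into zones by LMP with k-means
# (synthetic data, not the CAISO 2009 LMP database used in the article).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# One average LMP per bus ($/MWh); three synthetic price zones.
lmp = np.concatenate([rng.normal(30, 2, 40),
                      rng.normal(45, 3, 35),
                      rng.normal(60, 2, 25)])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(lmp.reshape(-1, 1))
for k in range(3):
    zone = lmp[kmeans.labels_ == k]
    print(f"zone {k}: {zone.size} buses, mean LMP {zone.mean():.1f} $/MWh")
```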
Abstract:
Adequate decision support tools are required by electricity market players operating in a liberalized environment, allowing them to consider all the business opportunities and take strategic decisions. Ancillary services (AS) represent a good negotiation opportunity that must be considered by market players. Based on ancillary services forecasting, market participants can bid strategically in day-ahead ancillary services markets. For this reason, ancillary services market simulation is being included in MASCEM, a multi-agent electricity market simulator that market players can use to test and enhance their bidding strategies. The paper presents the methodology used to undertake ancillary services forecasting, based on an Artificial Neural Network (ANN) approach. ANNs are used for day-ahead prediction of non-spinning reserve (NS), regulation-up (RU), and regulation-down (RD) requirements. Spinning reserve (SR), addressed in past work, is used for comparative analysis. A case study based on California ISO (CAISO) data is included; the forecasted results are presented and compared with CAISO's published forecast.
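As a hedged sketch of the forecasting approach (a generic feed-forward network on synthetic data; the actual MASCEM feature set and network architecture are not reproduced here):

```python
# Sketch: day-ahead forecast of one ancillary-services requirement with an ANN
# (synthetic data; assumed features: hour, day of week, previous-day requirement).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
hours = rng.integers(0, 24, 500)
dows = rng.integers(0, 7, 500)
prev = 300 + 50 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 10, 500)
X = np.column_stack([hours, dows, prev])
y = 0.9 * prev + 5 * (dows >= 5) + rng.normal(0, 5, 500)  # synthetic RU requirement

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                     random_state=0).fit(X, y)
print("Day-ahead RU forecast, hour 10:", model.predict([[10, 2, 320.0]])[0])
```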
Abstract:
Involving groups in important management processes such as decision making has several advantages. By discussing and combining ideas, counter-ideas, critical opinions, identified constraints, and alternatives, a group of individuals can test potentially better solutions, sometimes in the form of new products, services, and plans. In the past few decades, operations research, AI, and computer science have had tremendous success creating software systems that can achieve optimal solutions, even for complex problems. The only drawback is that people don't always agree with these solutions. Sometimes this dissatisfaction is due to an incorrect parameterization of the problem. Nevertheless, the reasons people don't like a solution might not be quantifiable, because those reasons are often based on aspects such as emotion, mood, and personality. At the same time, monolithic individual decision-support systems centered on optimizing solutions are being replaced by collaborative systems and group decision-support systems (GDSSs) that focus more on establishing connections between people in organizations. These systems follow a kind of social paradigm. Combining both optimization- and social-centered approaches is a topic of current research. However, even if such a hybrid approach can be developed, it will still miss an essential point: the emotional nature of group participants in decision-making tasks. We've developed a context-aware, emotion-based model to design intelligent agents for group decision-making processes. To evaluate this model, we've incorporated it in an agent-based simulator called ABS4GD (Agent-Based Simulation for Group Decision), which we developed. This multiagent simulator considers emotion- and argument-based factors while supporting group decision-making processes. Experiments show that agents endowed with emotional awareness achieve agreements more quickly than those without such awareness. Hence, participant agents that integrate emotional factors in their judgments can be more successful because, in exchanging arguments with other agents, they consider the emotional nature of group decision making.
Abstract:
Residual soils diverge from the transported soils modeled by the theories of Soil Mechanics. These divergences are largely due to a cementation structure inherited from the parent rock. This study was based on results obtained from mechanical boreholes and from dynamic penetration, static penetration, and laboratory tests, and consisted of evaluating and correlating the parameters that determine the geomechanical behavior of the ground, such as strength and deformability.
Abstract:
Ancillary services represent a good business opportunity that must be considered by market players. This paper presents a new methodology for ancillary services market dispatch. The method considers the bids submitted to the market and includes a market clearing mechanism based on deterministic optimization. An Artificial Neural Network is used for day-ahead prediction of Regulation Down, Regulation Up, Spin Reserve, and Non-Spin Reserve requirements. Two test cases based on California Independent System Operator data concerning the dispatch of Regulation Down, Regulation Up, Spin Reserve, and Non-Spin Reserve services are included in this paper to illustrate the application of the proposed method: (1) dispatch considering simple bids; (2) dispatch considering complex bids.
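A minimal sketch of clearing a single service under simple bids (sort by price and accept until the forecast requirement is met); the paper's complex-bid dispatch and deterministic optimization are not reproduced here:

```python
# Merit-order clearing sketch for one ancillary service under simple bids.
def clear_service(bids, requirement_mw):
    """bids: list of (price $/MW, quantity MW); returns accepted (price, MW) pairs."""
    accepted, remaining = [], requirement_mw
    for price, qty in sorted(bids):      # cheapest bids first
        if remaining <= 0:
            break
        take = min(qty, remaining)
        accepted.append((price, take))
        remaining -= take
    return accepted

# Example: clear a 100 MW Regulation Up requirement.
bids = [(12.0, 40), (8.0, 30), (15.0, 60), (10.0, 50)]
print(clear_service(bids, 100))  # [(8.0, 30), (10.0, 50), (12.0, 20)]
```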
Abstract:
This paper proposes a dynamic scheduler that supports the coexistence of guaranteed and non-guaranteed bandwidth servers in order to handle soft tasks' overloads efficiently by making additional capacity available from two sources: (i) residual capacity allocated but left unused when jobs complete in less than their budgeted execution time; (ii) capacity stolen from inactive non-isolated servers used to schedule best-effort jobs. The effectiveness of the proposed approach in reducing the mean tardiness of periodic jobs is demonstrated through extensive simulations. The achieved results become even more significant when tasks' computation times have a large variance.
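A toy illustration of source (i), residual-capacity reclaiming (bookkeeping only; not the paper's scheduler or its admission rules):

```python
# Toy sketch of source (i): when a job completes in less than its budgeted
# execution time, the allocated-but-unused capacity is donated to a shared pool
# that can later serve overloaded soft tasks.
class Server:
    def __init__(self, name: str, budget: float):
        self.name, self.budget = name, budget

residual_pool = 0.0

def complete_job(server: Server, actual_time: float) -> None:
    """Reclaim residual capacity when a job finishes early."""
    global residual_pool
    residual = max(server.budget - actual_time, 0.0)
    residual_pool += residual
    print(f"{server.name}: budget {server.budget}, used {actual_time}, "
          f"donated {residual:.1f} (pool = {residual_pool:.1f})")

complete_job(Server("S1", budget=5.0), actual_time=3.2)  # donates 1.8
complete_job(Server("S2", budget=4.0), actual_time=4.0)  # donates 0.0
```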
Abstract:
The characteristics of carbon fibre reinforced laminates have widened their use, from aerospace to domestic appliances. A common characteristic is the need for drilling for assembly purposes. It is known that a drilling process that reduces the drill thrust force can decrease the risk of delamination. In this work, delamination assessment methods based on radiographic data are compared and correlated with mechanical test results (bearing test).
Abstract:
Renewable energy sources (RES) have unique characteristics that grant them preference in energy and environmental policies. However, considering that renewable resources are barely controllable and sometimes unpredictable, some challenges are faced when integrating high shares of renewable sources in power systems. To mitigate this problem, this paper presents a decision-making methodology for renewable investments. The model computes the optimal renewable generation mix from the different available technologies (hydro, wind, and photovoltaic) that integrates a given share of renewable sources while minimizing residual demand variability, thereby stabilizing thermal power generation. The model also includes a spatial optimization of wind farms in order to identify the best distribution of wind capacity. The methodology is applied to the Portuguese power system.
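A minimal sketch of the mix-selection idea: choose hydro/wind/PV capacities that minimize the variability of residual demand subject to a renewable-share target (synthetic hourly profiles; not the Portuguese system data or the paper's full model):

```python
# Sketch: pick a hydro/wind/PV capacity mix minimizing residual-demand
# variability subject to a renewable energy-share target (synthetic profiles).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
t = np.arange(24 * 30)  # one month of hourly steps
demand = 100 + 20 * np.sin(t / 24 * 2 * np.pi) + rng.normal(0, 3, t.size)
profiles = np.vstack([                                    # per-MW hourly output
    0.5 + 0.1 * rng.random(t.size),                       # hydro: steady
    np.clip(rng.normal(0.3, 0.15, t.size), 0, 1),         # wind: volatile
    np.clip(np.sin((t % 24 - 6) / 12 * np.pi), 0, None),  # PV: daytime only
])
share = 0.4  # renewables must cover 40% of total energy

def residual_std(cap):
    return np.std(demand - cap @ profiles)

cons = {"type": "eq",
        "fun": lambda cap: (cap @ profiles).sum() - share * demand.sum()}
res = minimize(residual_std, x0=[30.0, 30.0, 30.0],
               bounds=[(0, None)] * 3, constraints=cons)
print("hydro/wind/PV capacity (MW):", np.round(res.x, 1))
```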
Abstract:
The increasing complexity of VLSI circuits and the reduced accessibility of modern packaging and mounting technologies restrict the usefulness of conventional in-circuit debugging tools, such as in-circuit emulators for microprocessors and microcontrollers. However, this same trend enables the development of more complex products, which in turn require more powerful debugging tools. These conflicting demands could be met if the standard scan test infrastructures now common in most complex components were able to match the debugging requirements of design verification and prototype validation. This paper analyses the main debug requirements in the design of microprocessor-based applications and the feasibility of their implementation using the mandatory, optional and additional operating modes of the standard IEEE 1149.1 test infrastructure.
Abstract:
Weblabs are spreading their influence in Science and Engineering (S&E) courses, providing a way to conduct real experiments remotely. Typically, they are implemented through different architectures and infrastructures supported by Instruments and Modules (I&Ms) that can be remotely controlled and observed. Beyond the lack of a standard solution for implementing weblabs, their reconfiguration is limited to a setup procedure that interconnects a set of preselected I&Ms into an Experiment Under Test (EUT). Moreover, those I&Ms cannot be replicated or shared by different weblab infrastructures, since they are usually based on hardware platforms. To overcome these limitations, this paper proposes a standard solution that uses I&Ms embedded into Field-Programmable Gate Array (FPGA) devices. An architecture based on the IEEE 1451.0 Std. is presented, supported by an FPGA-based weblab infrastructure that can be remotely reconfigured with I&Ms, described through standard Hardware Description Language (HDL) files, using a Reconfiguration Tool (RecTool).
Abstract:
Dissertation submitted for the degree of Master in Electrical Engineering, Energy Branch